Feed aggregator

A few useful Oracle 12cR2 MOS Docs

Syed Jaffar - Thu, 2017-07-06 07:33
A few useful MOS Docs are listed below, in case a 12cR2 upgrade is around the corner.



  • How to Upgrade to/Downgrade from Grid Infrastructure 12.2 and Known Issues (Doc ID 2240959.1)
  • Complete Checklist for Upgrading to Oracle Database 12c Release 2 (12.2) using DBUA (Doc ID 2189854.1)
  • 12.2 Grid Infrastructure Installation: What's New (Doc ID 2024946.1)
  • Patches to apply before upgrading Oracle GI and DB to 12.2.0.1 (Doc ID 2180188.1)
  • Differences Between Enterprise, Standard Edition 2 on Oracle 12.2 (Doc ID 2243031.1)
  • 12.2 gridSetup.sh Does Not List Disks Unless the Discovery String is Provided (Doc ID 2244960.1)


Create a 12c physical standby database on ODA X5-2

Amis Blog - Thu, 2017-07-06 07:06

ODA X5-2 simplifies and speeds up the creation of a 12c database quite considerably with oakcli. You can take advantage of this command by also using it in the creation of physical standby databases, as I discovered when I had to set up Data Guard on as many as 5 production and 5 acceptance databases within a very short time.

I used the “oakcli create database …” command to create both primary and standby databases really fast and went on from there to set up a Data Guard Broker configuration in max availability mode. Where you would normally duplicate a primary database onto a skeleton standby database that is itself without any data or redo files and starts up with a pfile, working with 2 fully configured databases is a bit different. You do not have to change the db_unique_name after the RMAN duplicate, which proved to be quite an advantage, and the duplicate itself doesn't have to address any spfile adaptations because the spfile is already there. But you may get stuck with some obsolete data and redo files of the original standby database that can fill up the filesystem. However, as long as you remove these files in time, just before the RMAN duplicate, this isn't much of an issue.
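To know up front which files of the original standby will become obsolete, you can list them from the database itself before it is stopped for the duplicate; a minimal sketch:

oda20 > sqlplus / as sysdba
oda20 > -- list the data, temp and redo files of the original standby, to be removed just before the duplicate
oda20 > select name from v$datafile union all select name from v$tempfile union all select member from v$logfile;
oda20 > exit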

What I did to create 12c primary database ABCPRD1 on one ODA and physical standby database ABCPRD2 on a second ODA follows from here. Nodes on oda1 are oda10 and oda11, nodes on oda2 are oda20 and oda21. The nodes I will use are oda10 and oda20.

-1- Create parameter file on oda10 and oda20
oakcli create db_config_params -conf abcconf
-- parameters:
-- Database Block Size  : 8192
-- Database Language    : AMERICAN
-- Database Characterset: WE8MSWIN1252
-- Database Territory   : AMERICA
-- Component Language   : English
-- NLS Characterset     : AL16UTF16
file is saved as: /opt/oracle/oak/install/dbconf/abcconf.dbconf

-2- Create database ABCPRD1 on oda10 and ABCPRD2 on oda20
oda10 > oakcli create database -db ABCPRD1 -oh OraDb12102_home1 -params abcconf
oda20 > oakcli create database -db ABCPRD2 -oh OraDb12102_home1 -params abcconf
-- Root  password: ***
-- Oracle  password: ***
-- SYSASM  password - During deployment the SYSASM password is set to 'welcome1': ***
-- Database type: OLTP
-- Database Deployment: EE - Enterprise Edition
-- Please select one of the following for Node Number >> 1
-- Keep the data files on FLASH storage: N
-- Database Class: odb-02  (2 cores,16 GB memory)

-3- Set db_name to ABCPRD for both databases... this is a prerequisite for Data Guard
oda10 > sqlplus / as sysdba
oda10 > shutdown immediate;
oda10 > startup mount
oda10 > ! nid TARGET=sys/*** DBNAME=ABCPRD SETNAME=YES
oda10 > Change database name of database ABCPRD1 to ABCPRD? (Y/[N]) => Y
oda10 > exit

oda20 > sqlplus / as sysdba
oda20 > shutdown immediate;
oda20 > startup mount
oda20 > ! nid TARGET=sys/*** DBNAME=ABCPRD SETNAME=YES
oda20 > Change database name of database ABCPRD2 to ABCPRD? (Y/[N]) => Y
oda20 > exit

-4- Set db_name of both databases in their respective spfile as well as ODA cluster,
    and reset the db_unique_name after startup back from ABCPRD to ABCPRD1|ABCPRD2
oda10 > sqlplus / as sysdba    
oda10 > startup mount
oda10 > alter system set db_name=ABCPRD scope=spfile;
oda10 > alter system set service_names=ABCPRD1 scope=spfile;
oda10 > ! srvctl modify database -d ABCPRD1 -n ABCPRD
oda10 > shutdown immediate
oda10 > startup
oda10 > alter system set db_unique_name=ABCPRD1 scope=spfile;
oda10 > shutdown immediate;
oda10 > exit

oda20 > sqlplus / as sysdba    
oda20 > startup mount
oda20 > alter system set db_name=ABCPRD scope=spfile;
oda20 > alter system set service_names=ABCPRD2 scope=spfile;
oda20 > ! srvctl modify database -d ABCPRD2 -n ABCPRD
oda20 > shutdown immediate
oda20 > startup
oda20 > alter system set db_unique_name=ABCPRD2 scope=spfile;
oda20 > shutdown immediate;
oda20 > exit

-5- Start up both databases from the cluster.
oda10 > srvctl start database -d ABCPRD1
oda20 > srvctl start database -d ABCPRD2

Currently, 2 identically configured databases are active with the same db_name, which is the first condition for the following configuration of Data Guard Broker. By just matching the db_name between the databases and keeping the db_unique_name as it was, ASM database and diagnostic directory names remain as they are.

Also, the spfile entry in the cluster continues to point to the correct directory and file, as does the init.ora in $ORACLE_HOME/dbs. Because the standby starts with an existing and correctly configured spfile, you no longer need to retrieve it from the primary. This simplifies and reduces the RMAN duplicate code to just a one-line command, apart from login and channel allocation.
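Before moving on, it is worth verifying on both nodes that the databases now share the db_name while keeping their own unique name; a minimal check:

oda10 > sqlplus / as sysdba
oda10 > -- name should be ABCPRD on both sides; db_unique_name stays ABCPRD1 here and ABCPRD2 on oda20
oda10 > select name, db_unique_name, database_role from v$database;
oda10 > exit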

-6- Add Net Service Names for ABCPRD1 and ABCPRD2 to your tnsnames.ora on oda10 and oda20
ABCPRD1_DGB =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = oda10)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = ABCPRD1_DGB)
    )
  )

ABCPRD2_DGB =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = oda20)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = ABCPRD2_DGB)
    )
  )

-7- Add as a static service to listener.ora on oda10 and oda20
oda10 > SID_LIST_LISTENER =
oda10 >   (SID_LIST =
oda10 >     (SID_DESC =
oda10 >       (GLOBAL_DBNAME = ABCPRD1_DGB)
oda10 >       (ORACLE_HOME = /u01/app/oracle/product/12.1.0.2/dbhome_1)
oda10 >       (SID_NAME = ABCPRD1)
oda10 >     ) 
oda10 >   )        

oda20 > SID_LIST_LISTENER =
oda20 >   (SID_LIST =
oda20 >     (SID_DESC =
oda20 >       (GLOBAL_DBNAME = ABCPRD2_DGB)
oda20 >       (ORACLE_HOME = /u01/app/oracle/product/12.1.0.2/dbhome_1)
oda20 >       (SID_NAME = ABCPRD2)
oda20 >     ) 
oda20 >   )

-8- Restart listener from cluster on oda10 and oda20
oda10 > srvctl stop listener
oda10 > srvctl start listener

oda20 > srvctl stop listener
oda20 > srvctl start listener

-9- Create 4 standby logfiles on oda10 only (one more than the number of redo log groups, each with just one member)
    The RMAN duplicate takes care of the standby logfiles on oda20, so don't create them there now
oda10 > alter database add standby logfile thread 1 group 4 size 4096M;
oda10 > alter database add standby logfile thread 1 group 5 size 4096M;
oda10 > alter database add standby logfile thread 1 group 6 size 4096M;
oda10 > alter database add standby logfile thread 1 group 7 size 4096M;
oda10 > exit
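A quick sanity check on the result is a query against v$standby_log; a minimal sketch:

oda10 > sqlplus / as sysdba
oda10 > -- expect 4 standby logfile groups of 4096M each
oda10 > select group#, thread#, bytes/1024/1024 as mb, status from v$standby_log;
oda10 > exit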

-10- Start RMAN duplicate from oda20
oda20 > srvctl stop database -d ABCPRD2
oda20 > srvctl start database -d ABCPRD2 -o nomount
oda20 > *****************************************************************************
oda20 > ********* !!! REMOVE EXISTING DATA AND REDO FILES OF ABCPRD2 NOW !!! *********
oda20 > *****************************************************************************
oda20 > rman target sys/***@ABCPRD1 auxiliary sys/***@ABCPRD2
oda20 > .... RMAN> 
oda20 > run {
oda20 > allocate channel d1 type disk;
oda20 > allocate channel d2 type disk;
oda20 > allocate channel d3 type disk;
oda20 > allocate auxiliary channel stby1 type disk;
oda20 > allocate auxiliary channel stby2 type disk;
oda20 > duplicate target database for standby nofilenamecheck from active database;
oda20 > }
oda20 > exit

And there you are… primary database ABCPRD1 in open read-write mode and standby database ABCPRD2 in mount mode. The only things left to do now are the Data Guard Broker setup, and activating flashback and force logging on both databases.

-11- Setup broker files in shared storage (ASM) and start brokers on oda10 and oda20
oda10 > sqlplus / as sysdba
oda10 > alter system set dg_broker_config_file1='/u02/app/oracle/oradata/datastore/.ACFS/snaps/ABCPRD1/ABCPRD1/dr1ABCPRD1.dat' scope=both; 
oda10 > alter system set dg_broker_config_file2='/u02/app/oracle/oradata/datastore/.ACFS/snaps/ABCPRD1/ABCPRD1/dr2ABCPRD1.dat' scope=both;
oda10 > alter system set dg_broker_start=true scope=both;
oda10 > exit

oda20 > sqlplus / as sysdba
oda20 > alter system set dg_broker_config_file1='/u02/app/oracle/oradata/datastore/.ACFS/snaps/ABCPRD2/ABCPRD1/dr1ABCPRD2.dat' scope=both; 
oda20 > alter system set dg_broker_config_file2='/u02/app/oracle/oradata/datastore/.ACFS/snaps/ABCPRD2/ABCPRD1/dr2ABCPRD2.dat' scope=both;
oda20 > alter system set dg_broker_start=true scope=both;
oda20 > exit

-12- Create broker configuration from oda10
oda10 > dgmgrl sys/***
oda10 > create configuration abcprd as primary database is abcprd1 connect identifier is abcprd1_dgb;
oda10 > edit database abcprd1 set property StaticConnectIdentifier='(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=oda10)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=ABCPRD1_DGB)(INSTANCE_NAME=ABCPRD1)(SERVER=DEDICATED)))';
oda10 > add database abcprd2 as connect identifier is abcprd2_dgb maintained as physical;
oda10 > edit database abcprd2 set property StaticConnectIdentifier='(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=oda20)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=ABCPRD2_DGB)(INSTANCE_NAME=ABCPRD2)(SERVER=DEDICATED)))';
oda10 > enable configuration;
oda10 > edit database abcprd2 set state=APPLY-OFF;
oda10 > exit

-13- Enable flashback and force logging on both primary and standby database
oda10 > sqlplus / as sysdba
oda10 > alter database force logging;
oda10 > alter database flashback on;
oda10 > exit

oda20 > sqlplus / as sysdba
oda20 > alter database force logging;
oda20 > alter database flashback on;
oda20 > exit
oda20 > srvctl stop database -d abcprd2
oda20 > srvctl start database -d abcprd2 -o mount

oda10 > srvctl stop database -d abcprd1
oda10 > srvctl start database -d abcprd1

-14- Configure max availability mode from oda10
oda10 > dgmgrl sys/*** 
oda10 > edit database abcprd2 set state=APPLY-ON;
oda10 > edit database abcprd1 set property redoroutes='(LOCAL : abcprd2 SYNC)';
oda10 > edit database abcprd2 set property redoroutes='(LOCAL : abcprd1 SYNC)';
oda10 > edit configuration set protection mode as maxavailability;
oda10 > show database abcprd1 InconsistentProperties;
oda10 > show database abcprd2 InconsistentProperties;
oda10 > show configuration
oda10 > validate database abcprd2;
oda10 > exit

You should now have a valid 12c Max Availability Data Guard configuration, but you had better test it thoroughly with some switchovers and a failover before taking it into production; a rough sketch of such a test follows below. Have fun!
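A switchover and switchback from DGMGRL, using the connect identifiers defined earlier, would look roughly like this (a sketch only; output checks omitted):

oda10 > dgmgrl sys/***
oda10 > show configuration;
oda10 > switchover to abcprd2;
oda10 > -- ABCPRD2 now runs as primary; verify, then switch the roles back
oda10 > show configuration;
oda10 > switchover to abcprd1;
oda10 > exit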

The post Create a 12c physical standby database on ODA X5-2 appeared first on AMIS Oracle and Java Blog.

Finance and HR Leaders Shape Digital Disruption, New Research Finds

Oracle Press Releases - Thu, 2017-07-06 07:00
Press Release
Finance and HR Leaders Shape Digital Disruption, New Research Finds
New Oracle and MIT Technology Review study reveals the human drivers of cloud automation as the roles of finance, HR, and IT evolve to meet the needs of a more connected organization

Redwood Shores, Calif.—Jul 6, 2017

To enable organizations to thrive in a competitive digital marketplace, Oracle and the MIT Technology Review – an independent media company founded at the Massachusetts Institute of Technology (MIT) in 1899 – today released a new study that highlights the importance of collaboration between finance and human resources (HR) teams with a unified cloud. The study, Finance and HR: The Cloud’s New Power Partnership, outlines how a holistic view into finance and HR information, delivered via cloud technology, empowers organizations to better manage continuous change.

Based on a global survey of 700 C-level executives and finance, HR, and IT managers, the study found that a shared finance and HR cloud system is a critical component of successful cloud transformation initiatives. Among the benefits of integrating enterprise resource planning (ERP) and human capital management (HCM) systems is easier tracking and forecasting of employee costs for budgeting purposes. Additionally, integrated HCM and ERP cloud systems improve collaboration between departments, with 37 percent of respondents noting that they use the cloud to improve the way data is shared.

The report also reveals the human factors behind a successful cloud implementation, with employees’ ability to adapt to change standing out as critical. Among organizations that have fully deployed the cloud, almost half (46 percent) say they have seen their ability to reshape or resize the organization improve significantly—as do 47 percent of C-level respondents.

The productivity benefits have also been significant. Nearly one-third of respondents (31 percent) say they spend less time doing manual work within their department as a result of moving to the cloud and that the automation of processes has freed up time to work toward larger strategic priorities.

“As finance and HR increasingly lead strategic organizational transformation, ROI comes not only with financial savings for the organization, but also from the new insights and visibility into the business HR and finance gain with the cloud. People are at the heart of any company’s success and this is why we are seeing finance and HR executives lead cloud transformation initiatives,” said Steve Cox, group vice president, ERP EPM product marketing at Oracle. “In addition, improved collaboration between departments enables organizations to manage the changes ahead and sets the blueprint for the rest of the organization’s cloud shift.”

The survey also reveals there is a blurring of lines between functions and individual roles as the cloud increasingly ties back office systems together:

  • Increased Collaboration: 46 percent of finance and HR professionals say a full cloud deployment has led to significantly improved collaboration between departments, and nearly half expect a significant improvement in the next two years.
    • This extends to IT as well. 52 percent of C-level respondents said the relationship between IT, HR and finance is even better than expected following their cloud implementation.
  • Cross-Corporate Intermingling: With the new roles of HR and finance professionals requiring them to work more closely with data and the cloud, 43 percent of businesses plan to bring IT people into these departments to help employees take advantage of new technologies.
  • New Skillsets: Desired skills respondents want to improve upon include:
    • Time management, with 40 percent saying this is currently an issue
    • Active learning
    • Problem solving, mathematical reasoning and analytical skills
    • The IT function also changes. Post-deployment, 56 percent of C-level respondents report that IT has significantly improved when it comes to producing innovations.

Cox added: “As organizations navigate technological changes, it’s critical for the C-suite to empower its employees to evolve their individual business acumen. Many businesses understand this and it’s encouraging to see 42 percent planning to provide their teams with management skills training to help them break out of their traditional back-office roles. The learnings from the move of finance and HR to the cloud will ultimately spread across the organization as, together, they conceptualize the shape of the next disruption.”

Contact Info
Joann Wardrip
Oracle
+1.650.607.1343
joann.wardrip@oracle.com
About the Research

Oracle partnered with the MIT Technology Review to survey HR, Finance and IT professionals about the state of their cloud transformation and to gain insight into how moving to the cloud has improved collaboration among teams. In total, 700 HR, Finance, and IT professionals were polled across North America, EMEA and Asia.

About Oracle

The Oracle Cloud offers complete SaaS application suites for ERP, HCM and CX, plus best-in-class database Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) from data centers throughout the Americas, Europe and Asia. For more information about Oracle (NYSE:ORCL), please visit us at www.oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Talk to a Press Contact

Joann Wardrip

  • +1.650.607.1343

Oracle Clusterware 12cR2 - deprecated and desupported features

Syed Jaffar - Thu, 2017-07-06 04:27


Having a clear understanding of the deprecated and desupported features in a new release is just as important as knowing the release's new features. In this short blog post, I would like to highlight the following features that are either deprecated or desupported in 12cR2.

Deprecated
  • config.sh will no longer be used for the Grid configuration wizard; instead, gridSetup.sh is used in 12cR2.
  • Placement of OCR and voting files directly on a shared filesystem is now deprecated.
  • The diagcollection.pl utility is deprecated in favor of Oracle Trace File Analyzer.

Desupported
  • You are no longer able to use Oracle Clusterware commands that are prefixed with crs_.


In my next blog post, I will go over some of the important features of Oracle Clusterware in 12cR2. Stay tuned.


Webcast: Getting Optimal Performance from Oracle E-Business Suite

Steven Chan - Thu, 2017-07-06 02:00

Oracle University has many free recorded webcasts that are useful for E-Business Suite system administrators.  Here's a good one on EBS performance tuning (this is always one of our most-popular sessions at OpenWorld):

Samer Barakat, Director of Application Performance, summarizes practical tips and lessons learned from performance-tuning and benchmarking the world’s largest Oracle E-Business Suite environments. Application system administrators will get concrete tips and techniques for identifying and resolving performance bottlenecks on all layers of the technology stack. They will also learn how Oracle’s engineered systems such as Oracle Exadata and Oracle Exalogic can dramatically improve the performance of their system. This material was presented at Oracle OpenWorld 2015.


Categories: APPS Blogs

Working with Location and Permissions in JET Hybrid

Andrejus Baranovski - Thu, 2017-07-06 00:35
What if you want to access mobile device location data from a JET Hybrid application? This can be achieved with the Cordova Geolocation plugin. But you want it to be nicely done and want to make sure the application is granted permission to access location information. Use the Cordova Permissions plugin for that.

You could add a Cordova plugin to a JET app by executing this command (filling in the plugin name):

cordova plugin add <plugin-name>

If this command doesn't work for any reason, you could add the plugin information directly into the config.xml file (check Geertjan's post about the same topic - Plugging into Devices with Oracle JET on Cordova (Part 1)):



In the JS function, before calling the location API, we call the permissions API to check if the app has already been granted permission to read location data. In the hasPermission method, in case of success, location data is accessed. In case of no permission, a request for permission is sent. If the request is satisfied, location is accessed (and permission is granted at the same time):


Location data is retrieved through a callback:


This is how it works. On the very first location access, when permission is not granted yet, we request permission through the permissions API:


When permission is granted, location is displayed:


Download sample application from GitHub repository - rslocationapp.

Virtualization on the Oracle Database Appliance S, M, L

Amis Blog - Wed, 2017-07-05 15:44

One of the great advantages of the Oracle Database Appliance HA is the possibility of virtualization through OracleVM. This virtualization wasn't possible for the other members of the Oracle Database Appliance family. Until now.

In patch 12.1.2.11.0, which has been released recently for the ODA S, M and L, virtualization is possible… through KVM. Is this a shocking change? No, KVM has been part of Linux for more than 10 years now. Desirable? Yes, I think so, and worthwhile to give it a bit of attention in this blog post.

You can read a very, very short announcement in the documentation of the Oracle Database Appliance.

Oracle has promised more information (including step-by-step guide) will be released very soon.

When installing the patch, Oracle Linux KVM will be installed, and there's no need to re-image your system as with the Oracle Database Appliance HA. When using KVM it's possible to run applications on the ODA S, M and L, and in that way isolate the databases from the applications in terms of life cycle management.

In my opinion this could be a great solution for some customers for consolidating their software, and for ISVs for creating a solution in a box.

 

But… (there's always a but) as I understand it – I haven't tested it yet – there are a few limitations:

– You may only use the Linux O.S. on the guest VM

– There’s no support for installing an Oracle database on the guest VM

– Related to that, there’s no capacity-on-demand for databases or applications in the guestVM

 

So the usability of this new feature may seem limited for now, but testing and using the feature has just begun!

The next big release will be in Feb/March 2018:

  • Databases in the VM’s
  • Each database will be running in its own VM
  • VM hard-partitioning support for licensing
  • Windows support

I’m very curious how Oracle will handle the standardization in the Oracle Database Appliance family in the future:

– ODACLI versus OAKCLI

– OracleVM versus KVM

– Web console user interface vs command-line

Will they merge, and if they do, in what direction? Or will a new rising technology take the lead?

 

Regardz.

 

Resources:

Oracle Database Appliance Documentation: https://docs.oracle.com/cd/E86648_01/doc.121/e85591/managing-database-appliance-virtualized-platform.htm#GUID-971B6555-B1A6-4500-8187-C085989F25A9

The post Virtualization on the Oracle Database Appliance S, M, L appeared first on AMIS Oracle and Java Blog.

Chatbot Challenges: Talking Tech About Talking Tech

OTN TechBlog - Wed, 2017-07-05 13:53
Despite the dizzying evolution of computer technology over the last three decades, the input/output process remains deeply dependent on fingers. True, I can now use my iPhone or Amazon Echo to request weather reports and ask bizarre questions of Turing Test candidates. However, most of my time spent interacting with a computer still requires my not-always-cooperative fingers. But those days are numbered, thanks to chatbots.

As with other innovations, there’s reality amid the chatbot hype, as I discovered when I posted several chatbot-related questions in a community forum. Among the responses were two accounts that prove that when it comes to chatbots, it’s not just talk.

Oracle ACE Leon Smiers, Center of Excellence lead for Oracle PaaS at Capgemini, based in the Netherlands, reports that his organization has already implemented chatbot projects using a variety of technologies, and has recently started a new chatbot project for the Dutch police that will make use of Oracle Technology.

According to Smiers, designing a chatbot project differs considerably from designing a standard user interface for a Web application or a mobile application. “The focus is on conversation rather than on transaction,” Smiers says.  “Chatbots are another channel in the interaction with customer, with the primary focus on delivering a fast track for providing answers or enabling transactions.” 

Smiers cites two unique challenges in developing chatbots. The first, and this should come as no surprise, is the complexity of language. As he explains, there are a variety of ways one might report that one’s bicycle was stolen. “My tandem is gone,” Smiers suggests, or “somebody nicked my iron horse,” amid nearly infinite variations. “With bots you need to able to understand the intent of a sentence, in singularity and in context of a conversation,” Smiers explains.

The other challenge? The sky-high expectations of end users and companies. “Chatbots are expected to deliver neuro-linguistic programming [NLP] and artificial intelligence in every possible way,” Smiers explains. “So we need to tone down the user expectations.” To that end Smiers’ team has created a Chatbot Maturity model, “to provide a roadmap and make it clear what can be expected in the first and consequent releases.”

Smiers and his team rely on a variety of tools and technologies to support chatbot development, including Apache Open NLP and Stanford NLP. “These can be implicitly available in tooling, such as the Intelligent Bot Services, but also explicitly used when calls via APIs are made to enrich the conversation,” Smiers explains. “These API calls need to be fed with the proper information from the question sentence in order to provide the right answer.”

Oracle ACE John Sim, a consultant with Fishbowl Solutions in the UK, reports that the Fishbowl team has built Atlas, a chatbot that was initially developed as an entry in Fishbowl’s annual hackathon event. “We are now using it internally to help to build up its intelligence through machine learning and natural language understanding capabilities,” Sim says. “Once we feel Atlas has built up enough wisdom, we plan to release him as an offering with integrations for SaaS and PaaS applications.”

The Fishbowl team did its homework before kicking off the Atlas project. “Our goal from the beginning was to make an intelligent bot and not simply an if-then-else bot,” Sim says. “We also wanted to write a user flow and look at common tasks where we could abstract the user interface and do the interactions through user intents. Our intent here was to enhance the user experience to quickly access content and get tasks completed effectively, without going through a click-and-form-filling process stream.”

The Atlas project was fun, according to Sim, but it wasn’t without challenges. “The biggest ones were multilingual support and connecting user credentials so that Atlas only retrieves relevant info for the user,” Sim says. 

And the work continues. “We are now looking at areas where we can enhance Atlas, such as an administrative interface to allow users to review mapped credentials and his offerings and capabilities from outside of the chat window on Facebook and Slack,” Sim says. “We are excited about the future for Atlas and the use cases for bots overall. Who knows, bots could replace mobile apps. We shall see.”

How do chatbots figure in your future? Are you doing your homework to prepare to meet the challenges of this revolution in human-computer interaction? Post your comments below.


Storage Server: FreeNAS: use your SSD efficiently

Frank van Bortel - Wed, 2017-07-05 12:13
ZIL and Cache. Not open for discussion; I think it is a complete waste of resources to use a 120 or 250GB SSD for logs, let alone cache, as FreeNAS will (and should!) use RAM for that. So, I searched and found a way to create two partitions on a single SSD, and expose these as ZIL (ZFS Intent Log) and cache to the pool. Mind you - there are performance tests…

BI tools for Data Driven Culture

Nilesh Jethwa - Wed, 2017-07-05 12:05

Modern day technology has brought about new business practices that have not only made industries more efficient but more competitive as well. People these days live in what is known as a data-driven environment, where the business culture is run not by instinct or luck but by honest-to-goodness facts.

There was a time when business intelligence tools served the purpose of providing industry insights, information on consumer behavior, and even consumption patterns. These days, they do more than just provide tools for effective pre-operational business analysis.

BI tools are currently at the heart of multiple organizations where they use technology to make better operating decisions. And what once was information reserved for the higher ranks is now information accessible to all members of the staff, especially those working at the front lines. It is with this small change that businesses in a data-driven culture can truly differentiate the customer experience.

Some of the most important things that business owners need to do are identify which market forces will play a role in their success and focus on which economic components will help them achieve their goals. Given that there are components of modern commerce that are unpredictable, the playing field is evened out thanks to BI tools.

BI tools consist of a series of technologies that make it simpler and more efficient for companies that use them to gather data and analyze them. It can include everything from sales metrics to consumer spending habits or even provide accurate measurements about the effectiveness of an operating strategy.

Read more at http://www.infocaptor.com/dashboard/what-is-data-driven-culture-and-how-can-bi-tools-help

In Memoriam

Jonathan Lewis - Wed, 2017-07-05 11:19

My mother died a few weeks ago after a couple of months in terminal care. One of my tasks while she was in care was to go through all her paperwork and while doing so I discovered a couple of stories from her past that she had typed out (remember type-writers?) about 30 years ago. I typed them up on my laptop and printed several copies to hand out for people to read at the tea-party we held for her old friends – of all ages, ranging from 15 to 99 – after the funeral; this seemed to give them a lot of pleasure and they found them so interesting that I decided to share them with a larger audience. So here’s the story, written by my mother in 1983, of her evacuation experience at the start of the 2nd world war when she was a little over 14 years old.

The Summer of 1939. Reminiscences of Dorothy Kathleen Lewis (1925 – 2017)

There had been a lot of talk about a war coming. Adolf Hitler and his armies had marched into Austria and were threatening Poland. We had all been issued with gas masks – just in case we would need them – and emergency plans had been made to evacuate all children from the big cities.

During the school holidays I was taken by my parents, with my sister, to my mother’s home village of Llangeitho in Cardiganshire. My mother had a cousin who was a retired school teacher and it was arranged with Auntie Jane that if war broke out Peggy and I would be sent to stay with her. I don’t think we were very pleased with the arrangement because to us she was very old-fashioned, not a bit like our mother. We ended our holiday and went back to London to wait for the school term to begin.

On the 1st September we heard that all children from our school whose parents wanted them to be evacuated should assemble at the school gates with a small suitcase and their gas masks. As we had already been told we were going to Llangeitho if the war broke out we stood and watched all our friends walking in crocodile fashion down the street and mothers and fathers crying watching them go. It was a very sad day, but I wished I was going with them. I didn’t like the idea of staying with Auntie Jane. None of these children knew where they were going, just somewhere in the countryside for safety, and they didn’t know who would be looking after them.

Well, on the morning of 3rd September Neville Chamberlain, our prime minister, spoke on the wireless (we now call it a radio) to say that we were at war with Germany. Immediately the sirens went and everyone ran to the shelters. My parents, Peggy, and I went to Baker Street Station, which has very deep platforms. There were hundreds of people with the same thing on their minds. We all took our gas masks with us. After a short time the all-clear went. My father sent a telegram to Auntie Jane to say Peggy and I would be leaving London on the train at 9:25pm that night. Trains did not travel as fast as they do today and we were due to arrive at Pont Llanio Station at 7:30am on Monday morning. Peggy’s friend and her mother (an Italian lady who did not speak very good English) were coming too, as was one of the young people from the village who was working in London.

Paddington Station had very dim lights and when we got on the train there were no lights at all. After a little while we children began to feel a bit less afraid and started to tell ghost stories and play memory games. It was fun going to the toilet on the train because there were people sitting in the corridor and so was their luggage. We could not see them and I don’t think we really tried – it was all a game. We were supposed to be sleeping, but we were too excited for that. When it came time to eat our sandwiches we had to taste them before we knew what we were eating. Can you imagine being in a train without any lights, and there were no lights in the streets or houses or on the station platforms that we passed. Names of stations had already been removed in case the country was invaded by the enemy. The belief was that the enemy would not know where he was if there were no road signs etc. No-one thought about them using maps and compasses as they would now. [ed: 1983]

We eventually arrived in a town called Carmarthen where we had to change trains and take a slow train to Pont Llanio where a car would meet us. Our train from Paddington was very late arriving and the slow train had gone. Someone telephoned Pont Llanio station to say we would be late and to send the car back. The train from Carmarthen was a very slow one and my father used to say “you could get out of the train and pick flowers on the bank and get back into the train again”. It really was very slow and chugged its way along the line. We arrived at last in Pont Llanio and then in Llangeitho after a journey of 16 hours. [ed: 4:30 to 5:00 hours driving time, now; 6 hours by public transport] I am sure we must have looked very dirty and untidy. The trains in those days were steam and there would be plenty of coal smuts flying around.

I did not think Auntie Jane would be very pleased to see us and I was soon to find out that I had thought rightly. The first thing she did was to take up the stair carpet in case we wore it out. I don’t know how she thought we would do that because once we came down in the morning we were not allowed to go back upstairs again until we went to bed. [ed: if you’ve read “Carrie’s War” you may recognise the behaviour]  She also did not know that children eat quite a lot too. For breakfast Auntie Jane would boil an egg and cut it in half, so Peggy and I had half each. And the same for our dinner, we would have two small potatoes – and this was before rationing and shortage of food. We had a lot of friends in the village and if it was not for them asking us out to tea and/or supper we would have been very hungry. Peggy went to school in the village, but I was too old [ed: at 14 yrs 4 months] and had nothing to do all day, but soon found a baby I could take out in the pram and that meant I would be asked if I would like a piece of cake and a drink. After a few weeks and a number of letters home things got a little better because my mother was sending parcels of food to Auntie Jane. I don’t know what arrangements were made money wise; because we were not Government evacuees Auntie Jane would not have been paid by the authorities to keep us.

One of the things we used to do together with two of our friends was to help the local butcher clean out his slaughter-house after he had killed a beast. This meant he then asked us to supper in his old farm-house with a huge Inglenook fireplace. Another of my mother’s friends used to have us in early for a meal and say “don’t tell Auntie Jane or she will not give you anything else to eat”. I often think back on those days and wonder why she was so mean. She had never married and had children, but being a teacher I would have expected her to be more tolerant.

In December of 1939 Peggy wrote a letter home which was full of complaint and left it somewhere where Auntie Jane found it and this letter was sent to my parents with a letter from Auntie Jane asking that we be sent back to London. A lot of the people in the village were very surprised to think that she should think to send us back to London when there were air-raids (these had not started at that time). People were saying we would be going home to be killed, but as for me I would rather take that chance than be left in Llangeitho.

Going back to London wasn’t much fun – the school was closed so once again we were at a loose end. We stayed in London over Christmas and again the government started evacuating school children and in February we joined a group who were leaving London – this time as London School Evacuees. We were sent to Buckingham to a family with a little girl of 2 years. This seemed to be alright and we went to school in the afternoons whilst the local children went to school in the mornings. It got rather uncomfortable there after a while because the man of the house, aged 24, lost his job (I don’t know why) and there were a lot of arguments in the house. His wife did not make herself look smart and he started to pay too much attention to me. Again a letter home that it was time we left there and one morning my father arrived and said: “pack your bags, you’re coming home”. What joy!

I don’t have much memory about this part of being an evacuee except to say I was relieved to be out of that house and back in the safety of my family. Whilst we were in Buckingham there had been bombing in London and Peggy and I were taken to see some of the damage that had been done. I think this was to frighten us so that we would be willing to go away again. I certainly did not like the sirens going and having to stop what we were doing and go to the shelter[1]. Once again we were on the move and this time I have a very detailed memory of the events.

We were assembled at the school playground all with our cases and gas masks – worried children and even more worried parents and teachers. No one knew where we were going except that we all piled into a double-decker bus. Lots of tears this time because we knew that people were being killed and injured. Would we see our parents again? What was going to happen to us if they were killed? Where would we go, who would look after us? Questions, questions!

We were taken to Marylebone station and put on an underground train. Where was it going? What were mum and dad doing now; were they still blowing their noses? We were not so bothered because we still knew where we were. Next stop Paddington Station and hundreds of children milling about. I remember I was in the Junior Red Cross at that time and a Red Cross nurse saw my badge and came to speak to me. Such a little thing but it meant such a lot and I have never forgotten her kind words, saying I was going somewhere safe and would be alright. Maybe I was crying at the time, I don’t know.

As the train pulled out of Paddington Station we were all trying to get to a window to wave, although we didn’t know anybody and we didn’t know where we were going. Well of all places – we arrived in Banbury. Now my Auntie Kit, my father’s sister, only lived about 2 miles out of Banbury in a village called Bodicote. I knew Banbury well because we had often visited Bodicote. If only she knew I was here. I know we could not stay with her because she had a very small house and she had 4 children already.

Again on another bus and somehow Peggy and I and two other girls from my class got separated from the rest of our school and we were taken to a village called Great Rollright. Peggy and I went to stay with a lady called Mrs. Robinson who kept the village shop, and my two class friends went to a farm.

Mrs. Robinson was a kind lady – she had asked for two boys but somewhere along the line she had two girls instead. It was very strange arriving in Great Rollright. We were all taken into the Church Hall and there were village people there who, to us, had a funny way of speaking. And one after the other they were saying how many children they would take. Mrs. Robinson wasn’t there, so maybe that is why she didn’t get two boys. I thought it was very embarrassing to be standing there with these quaint country people whilst they were deciding whether they wanted one, two, or three children.

Our time with Mrs. Robinson was very happy. Peggy went to the village school and I went to the county school in Chipping Norton, again on a part-time basis. Mrs. Robinson had a pet cockerel which was allowed to roam round the house – I hated that bird and I think it knew it. Its name was Cocky. Every time I wanted to go down the garden to the toilet this bird would follow me and stay outside until I came out again and peck my legs as I raced back up the garden.

There was certainly plenty to eat in this house and we really had an enjoyable time there. We were always sent to bed with a small glass of cider. I never knew then that cider was an alcoholic drink and I thought it was lovely. We didn’t seem any the worse for it anyway.

We got involved with the village church and would have been happy to have stayed there. But doom. One day I came home from school to find my head mistress from London sitting at the table. Unbeknown to us she had been frantically looking for these four girls she had lost at Banbury Station. I don’t know how she had found us, whether she had contacted our parents or hunted through the schools in the area. With a surname like Walklett I don’t think we would have been difficult to find. I don’t think she had been in touch with our parents – what an awful thing to say to them: “I’m very sorry but I may have lost your children”. No, she must have hunted through the school registers.

The upshot of this visit was that she had found somewhere more suitable for us and would we pack our things because she had found somewhere else for us to stay. More tears because we liked Mrs. Robinson, and the village people were all lovely to us and we were invited to their homes. Off we went with Miss Attride in the car to another village called Duns Tew. The strange thing was that none of our school were there, so why were we moved yet again?

This time we stayed with Mr. and Mrs. Beck, his name was Harry and hers was Daisy, but they were Mr. and Mrs. Beck to us. Mr. Beck was a farm hand and he worked with horses. He used to plough the fields of the farm across the road. He must have walked miles in the days he was ploughing. Although I had had many holidays in Wales and Shropshire at haymaking time I knew nothing about ploughing.

Mr. and Mrs. Beck had a young man living with them. He was like their son; although his family lived in the village he had lived with the Becks since he was a baby and they called him their son. His name was Walter. The village was a beautiful place and we lived in No. 73. There were no street names, every house had a name and a number so we were at No. 73 Duns Tew, the last house in the village, a lovely old thatched cottage. There was always a big wood fire in the grate and plenty on the table. Mr. and Mrs. Beck were the nicest people in the village.

Peggy now had to go to Steeple Aston School (since moving to Banbury in 1975 I have met the widow of her headmaster there), and I went to a Continuation College which had been evacuated from the East End of London. This was very odd to me – we were taught shorthand, typing, arithmetic, English grammar, French. This was obviously training us for the commercial world. I was much younger than the other girls there but my education was more advanced than theirs so I soon became top of the class. My English was always being complimented. What they didn’t know was that I had a Welsh mother and the Welsh language used every letter in the word. My French was well in advance and my Maths took a flying leap.

I made friends in the class. The class was held in The Hall, North Aston – a country seat. The Hall was so large that there were 9 girls living there and they had servants. The school room was in the Grand Hall and it was so beautiful it seemed a pity to me that there were desks etc. on the polished floor.

In Duns Tew we had one of the masters of the school staying in The Nurseries (which is still there) and every Friday evening the family he stayed with invited those of us in the village to spend the evening in their house and they had a piano so all the war songs were being sung: “Roll out the Barrel”, “We’re going to hang out the washing on the Siegfried line” and many more.

Because the school at North Aston was a long walk I bought a bike, something I had always wanted, and I joined the cycling group. This meant on one day a week we would go for an outing to places like Blenheim Palace [ed: 10 miles away] etc. I became a good cyclist and had plenty of energy when others flagged behind. I certainly made use of my bike.

One particularly happy time was when it snowed in the winter. Yes, we did get snow in London, but not like this. It was white[2] and where the wind blew it was as high as the hedgerows; I couldn’t believe what I saw. Walter the Beck’s son had a sledge and he showed us where it was good to use it. It was a fantastic time.

 

[Banbury, 1983]

 

[1] One of the stories about my mother that I first heard at her funeral was about the time she persuaded her parents to let her stay at home overnight. At the time the family used to head for the air-raid shelter (i.e. the local underground station) at the end of the day and stay there all night long. My mother hated this and persuaded her parents to let her stay at home in her own bed provided she promised to join them at the air-raid shelter as soon as the air-raid sirens sounded. She was only allowed to do this once – because she managed to sleep through two bombing runs and without being woken by the sirens or the explosions.

[2]If you’re wondering why white snow is worth mentioning you probably don’t know about the density of London smog at that time.


DA 7.2 UCF Transfer failing with SSL

Yann Neuhaus - Wed, 2017-07-05 08:14

This is a blog following the one I already wrote earlier:  https://blog.dbi-services.com/documentum-administrator-ucf-troubleshooting/
The first one was about “how to find the error”, not how to resolve it. In this blog I will talk about a UCF error I ran into at a customer site.

I got an error message using DA 7.2 where I couldn't download documents; in fact, every transfer was failing due to UCF. By following my previous blog I found the specific error in the logs, saying:
SSL Handshake failed.

You have probably had this issue as well if you used SSL with DA. By default, when you configure SSL with DA, it tries to validate the certificate against the Java CA certificate store. You can add the certificate to your keystore to prevent this issue.
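If you take the keystore route, importing the certificate into the Java trust store usually looks something like the following sketch, where the alias, certificate file, and cacerts path are placeholders for your environment:

# import the server/CA certificate into the JVM's trust store (default store password: changeit)
keytool -importcert -alias customer_ca -file /tmp/customer_ca.pem -keystore $JAVA_HOME/jre/lib/security/cacerts -storepass changeit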
But in my case I had a keystore generated from certs certified by the customer authority. So I had to find another way.

I found the solution in the documentation: https://support.emc.com/docu56531_Documentum-Web-Development-Kit-6.8-Development-Guide.pdf?language=en_US on page 58.

You can deactivate the Java validation as follows:
cd $CATALINA_HOME_DA
vi ./webapps/da/wdk/contentXfer/ucf.installer.config.xml

Add the following option:

<option name="https.host.validation" persistent="false">
  <value>false</value>
</option>

Now restart Tomcat or your application server and you will be able to transfer content again.

 

The article DA 7.2 UCF Transfer failing with SSL appeared first on Blog dbi services.

Oracle Data Visualization Desktop v3

Rittman Mead Consulting - Wed, 2017-07-05 07:57

The ODTUG Kscope17 conference last week in San Antonio was a great event with plenty of very interesting sessions and networking opportunities. Rittman Mead participated in the Thursday BI deep-dive session and delivered three sessions, including a special "fishing" one.



In the meantime Oracle released Data Visualization Desktop 12.2.3.0.0, which was presented in detail during Philippe Lions' session and includes a set of new features and enhancements to existing functionality. Starting with the new data sources and moving on to the new visualization options, in this post I'll go into detail on each of them.

Data Sources

New data sources have been introduced, among them JDBC and ODBC connections (both still in beta).

The latter two are very relevant since they enable querying any product directly exposing JDBC or ODBC connectors (like Presto) without needing to wait for official support in the DVD list of sources.

Still, in DVD v3 there is no support for JSON or XML files. In my older blog post I wrote about how JSON (and XML) can be queried in DVD using Apache Drill; however, this solution has Drill installation and knowledge as a prerequisite, which is not always achievable in end-user environments where self-service BI is happening. I believe future versions of DVD will address this problem by providing full support for both data sources.

Connection to OBIEE

One of the most requested new features is the new interface to connect to OBIEE: until DVD v2 only pre-built OBIEE analyses could be used as sources; with DVD v3, OBIEE Subject Areas are exposed, making them directly accessible. The set of columns and filters can't be retrieved on the fly during project creation but must be defined upfront during data source definition. This feature avoids moving back and forth between OBIEE and DVD to create an analysis as a data source and then use it in DVD.


Another enhancement in the data source definition is the possibility to change the column delimiter in txt sources, useful if the data source has an unusual delimiter.


Data Preparation

On the data preparation side we have two main enhancements: convert-to-date and the time grain level.
The convert-to-date feature enhances column-to-date conversion, including the usage of custom parsing strings. Still, this feature has some limits, like not being able to parse dates such as 04-January-2017, where the month name is written out in full. For this date format a two-step approach, shortening the month name and then converting, is still required.


The second enhancement on the data preparation side is the time grain level and format; these options simplify the extraction of attributes (e.g. Month, Week, Year) from date fields, which can now be done visually instead of by writing logical SQL.

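As an illustration, extracting such attributes previously meant hand-writing logical SQL calendar functions in a calculated column, along these lines (the "Orders"."Order_Date" column is hypothetical):

MONTH("Orders"."Order_Date")
YEAR("Orders"."Order_Date")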

The Dataflow component in DVD v3 has an improved UI with new column merge and aggregation functionalities, which make flow creation easier. Its output can now be saved as an Oracle database or Hive table, eliminating the need to store all the data locally.


It's worth mentioning that Dataflow is oriented to self-service data management: any parsing or transformation happens on the machine where DVD is installed and its configuration options are limited. If more robust transformations are needed then proper ETL softwares should be used.

New Visualization Options

There are several enhancements on the visualization side, the first one being trendline confidence levels, which can be shown with fixed intervals (90%, 95% or 99%).

Top N and bottom N filtering has been added for each measure column, expanding the traditional "range" filter.

Two new visualizations have also been included: waterfall and boxplot are now default visualizations. Boxplots were available as a plugin in previous versions; however, the five-number summary had to be pre-calculated. In DVD v3 the summary is automatically calculated based on the definition of category (x-axis) and item (value within the category).


Other new options in the data visualization area include: the usage of a logarithmic scale for graphs, the type of interpolation line to use (straight, curved, stepped ...), and the possibility to duplicate and reorder canvases (useful when creating a BI story).


Console

The latest set of enhancements regards the console: this is a new menu allowing end users to perform tasks, like uploading a plugin, that before had to be done manually on the file system.

The new Oracle Analytics Store lists add-ins divided into categories:

  • PlugIn: New visualizations or enhancement to existing ones (e.g. auto-refresh, providing a similar behaviour to OBIEE's slider)
  • Samples: Sample projects showing detailed DVD capabilities
  • Advanced Analytics: custom R scripts providing non-default functionalities
  • Map Layers: JSON shape files that can be used to render custom maps data.

The process to include a new plugin in DVD v3 is really simple: after downloading it from the store, I just need to open DVD's console and upload it. After a restart of the tool, the new plugin is available.


The same applies to Map Layers, while custom R scripts still need to be stored under the advanced_analytics\script_repository subfolder under the main DVD installation folder.

As we saw in this blog post, the new Data Visualization Desktop release includes several enhancements bringing more agility to data discovery, with improvements both in connections to new sources (JDBC and ODBC) and in standard reporting, with OBIEE subject areas now accessible. The new visualizations, the Analytics Store, and the plugin management console make the end-user workflow extremely easy, even when non-default features need to be incorporated. If you are interested in Data Visualization Desktop and want to understand how it can be used proficiently against any data source, don't hesitate to contact us!

Categories: BI & Warehousing

Storage Server: Software - FreeNAS

Frank van Bortel - Wed, 2017-07-05 02:21
All hardware has been implemented: all 13 hard disks and one SSD are connected; serial numbers, as well as physical and logical locations, noted (cramming 4 2.5" disks in the CD bay). Time to add some software. I will install the latest and the greatest(?) FreeNAS software, V11. The installation, due to IPMI being capable of mounting remote images, is a…

Check Workflow's Health with Workflow Analyzer

Steven Chan - Wed, 2017-07-05 02:00

In addition to helping customers resolve issues via Service Requests, Oracle Support also builds diagnostic tools for Oracle E-Business Suite. Here's an index to all of these tools:

What do Support Analyzers do?

These Support Analyzers are non-invasive scripts that run health checks on your EBS environments. They look for common issues and generate standardized reports that summarize findings, provide solutions for known issues, and offer recommendations on best practices.

What areas do they cover?

There are over 60 different analyzers for EBS 11i, 12.0, 12.1, and 12.2. There are analyzers for the following product families:

  • E-Business Suite Core (Technology)
  • Financials
  • Manufacturing
  • Human Capital Management  
  • Customer Relationship Management  

Spotlight on Workflow Analyzer

If you haven't tried any of these tools, I'd strongly recommend getting started with the Workflow Analyzer:

The Workflow Analyzer reviews Workflow configurations and compares them against Oracle's best practices. The script reviews Workflow queues, runtime data, stuck activities, looping activities, and purging processes.

This tool can be run manually or configured to run as a concurrent request, so it can be scheduled to be run periodically and included in regular Workflow Maintenance cycles.

Can this script be run against Production?

Yes. There is no DML in the Workflow Analyzer Script (workflow_analyzer.sql), so it is safe to run against Production instances to get a current detailed overview of the Workflow environment for a specific instance. As always it is recommended to test all suggestions against a TEST instance before applying to Production.
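A manual run is typically just a matter of connecting as the APPS user and executing the script; a minimal sketch, where the password and connect string are placeholders:

# run the analyzer script (downloaded from MOS) as the APPS user
sqlplus apps/<apps_password>@<EBS_instance> @workflow_analyzer.sql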


Categories: APPS Blogs

report on form

Tom Kyte - Tue, 2017-07-04 22:06
Hi team, I created one table: create table books (id number, book_name varchar2(256), published_date date); Insert into books values (1,'HARRY POTTER','09-01-2010'); Insert into books values (2,'HARRY POTTER 2','09-01-2011'); now I created a report on...
Categories: DBA Blogs

Changing Schema and Schema Object names

Tom Kyte - Tue, 2017-07-04 22:06
We have a project where we are planning to change the schema name and the objects in the schema. Suppose a table name is Schemaname_<TABLE_NAME>; it should get the new schema's suffix, and the same applies to packages. Example: Current Schema Name - SCOTT. Objects in Sche...
Categories: DBA Blogs

Virtual Date Column as partition doesn't work

Tom Kyte - Tue, 2017-07-04 22:06
Dear All, Database Version: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit. I have a table XX_FT with 2 columns as below: REF_NO VARCHAR2(100). This column can have two variations of data, "XX17010XXXXXX" ...
Categories: DBA Blogs

Kill Session Revert / Miss some data on commit

Tom Kyte - Tue, 2017-07-04 22:06
Hi, I am new to Oracle. I have one issue. Sometimes users say that the application is hung up and they can't do anything; sometimes they say they are unable to generate reports. For that I have studied your answers and I am using those to resolve th...
Categories: DBA Blogs

architecture

Tom Kyte - Tue, 2017-07-04 22:06
1) How does an insert statement work internally in the Oracle architecture - how does data get stored in a data block? 2) How does an update statement work internally in the Oracle architecture - how does a row get updated?
Categories: DBA Blogs
