Feed aggregator

Archiving partition tables of size 3TB+

Tom Kyte - 2 hours 31 min ago
Hi, I have two fat tables, both with composite RANGE-HASH partitions. Each table has one primary RANGE partitioning, monthly on the Created_Date column, and 4 HASH subpartitions on the primary key column. We have ~3.5 yrs+ of data (42 Range Partition/168 Has...
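The excerpt is truncated, but a common pattern for this kind of archiving question is to age out whole range partitions once their rows have been copied elsewhere. A minimal, hedged sketch with hypothetical names (big_tab, big_tab_arch_2014_01 and p_2014_01 are illustrative, not from the question):

-- Copy the oldest monthly range partition to a standalone archive table...
CREATE TABLE big_tab_arch_2014_01 AS
  SELECT * FROM big_tab PARTITION (p_2014_01);

-- ...then drop the partition, keeping global indexes usable.
ALTER TABLE big_tab DROP PARTITION p_2014_01 UPDATE GLOBAL INDEXES;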
Categories: DBA Blogs

Pro*C for beginners

Tom Kyte - 2 hours 31 min ago
Pro*C Hi Tom - Linux version 2.6.18-274.12.1.el5, gcc version 4.1.2. I would like to learn Pro*C programming from the beginning. Is there any way to write and run programs? I got stuck on a few issues when I started learning it on my own. Please sugg...
Categories: DBA Blogs

Not able to run 2 merges on different partitions of the same table in parallel

Tom Kyte - 2 hours 31 min ago
When I run 2 MERGE statements on different partitions of the same table in parallel, one goes into a wait state. Below is the scenario. Table A columns - Acct_skey, date_skey, prod_code, element_id, SYS_SRC_SOR_ID, DDA_SKEY, partition_name, ... m...
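The rest of the scenario is truncated, but when one of two concurrent MERGE statements sits in a wait state, the usual first step is to see what the waiting session is blocked on. A hedged diagnostic sketch using standard V$SESSION columns (run as a privileged user while the second MERGE is waiting):

SELECT sid, serial#, event, blocking_session, row_wait_obj#
FROM   v$session
WHERE  blocking_session IS NOT NULL;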
Categories: DBA Blogs

How to use bind variables in date interval functions

Tom Kyte - 2 hours 31 min ago
I am trying to write a SQL statement that takes two inputs at run time; if I pass NULL, it looks at a database table and gets the values from it. To make the NULL checks and then assign values from the database, I implemented the logic as it's given here h...
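A hedged sketch of the usual pattern: bind the number of units and build the interval with NUMTODSINTERVAL (or NUMTOYMINTERVAL for months/years), falling back to a value looked up from a table when the bind is NULL. The table and column names below (orders, parm_tab, default_days) are illustrative only:

-- :p_days is the bind variable supplied at run time.
SELECT o.*
FROM   orders o
WHERE  o.order_date >= SYSDATE - NUMTODSINTERVAL(
         NVL(:p_days, (SELECT p.default_days FROM parm_tab p)), 'DAY');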
Categories: DBA Blogs

Bitmap index creation taking time

Tom Kyte - 2 hours 31 min ago
Hi Tom, the bitmap index creation on a partitioned table is taking a lot of time. The table is small (5.5 GB), however the index creation takes 3 hours even with the parallel option. Could you please help to improve this? The database is 11.2.0.4. CREATE ...
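The CREATE statement in the question is cut off, so as a general illustration only: a LOCAL bitmap index built NOLOGGING and in parallel is the usual starting point on a partitioned table. Index, table and column names here are hypothetical:

CREATE BITMAP INDEX sales_bix_status
  ON sales (status)
  LOCAL
  NOLOGGING
  PARALLEL 8;

-- Reset the degree of parallelism once the build is done.
ALTER INDEX sales_bix_status NOPARALLEL;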
Categories: DBA Blogs

Cross join having 10 billion intermediate records but 200K Final records

Tom Kyte - 2 hours 31 min ago
Hello, I am working on an application-to-customer match, where an application can have multiple customers (corporate) and a customer can have multiple applications, so I am doing a Cartesian join and using UTL_MATCH.JARO_WINK...
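The query itself is truncated; a common way to stop a UTL_MATCH comparison from exploding into billions of intermediate rows is to add a cheap blocking predicate (same first letter, same postcode, and so on) before applying JARO_WINKLER_SIMILARITY. A hedged sketch with illustrative table and column names:

SELECT a.app_id,
       c.cust_id,
       UTL_MATCH.JARO_WINKLER_SIMILARITY(a.app_name, c.cust_name) AS score
FROM   applications a
JOIN   customers    c
  ON   SUBSTR(a.app_name, 1, 1) = SUBSTR(c.cust_name, 1, 1)   -- blocking predicate
WHERE  UTL_MATCH.JARO_WINKLER_SIMILARITY(a.app_name, c.cust_name) >= 90;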
Categories: DBA Blogs

Can CLOB field be validated as XML?

Tom Kyte - 2 hours 31 min ago
I'm trying to extract specific fields from XML in a CLOB field. Unfortunately, it isn't stored as XMLTYPE, so I'm doing: EXTRACTVALUE(xmltype(ctr.rule_text), 'configuration/criteria/text()') Most of the values in the DB are valid XML, but some...
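One common approach is to wrap the conversion in a small PL/SQL function that returns NULL for rows whose CLOB is not well-formed XML, so that EXTRACTVALUE only ever sees valid XMLTYPE values. A minimal sketch:

CREATE OR REPLACE FUNCTION to_xmltype_or_null (p_clob IN CLOB)
  RETURN XMLTYPE
IS
BEGIN
  RETURN XMLTYPE(p_clob);
EXCEPTION
  WHEN OTHERS THEN   -- a parse error means the CLOB is not valid XML
    RETURN NULL;
END;
/

-- Used, for example, as:
-- EXTRACTVALUE(to_xmltype_or_null(ctr.rule_text), 'configuration/criteria/text()')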
Categories: DBA Blogs

CPU sudden spike

Tom Kyte - 2 hours 31 min ago
Hi all, we observed a sudden increase in CPU usage on one of the Exadata nodes on 10th April, and it has stayed the same since. For example, CPU utilization increased from 2% to 66% and remains there today. Can someone help me find what exactly caused th...
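One way to see what started burning CPU on a given date is to pull the top SQL from the AWR active session history around that time. A hedged sketch (this requires the Diagnostics Pack license; the date literal is illustrative):

SELECT *
FROM  (SELECT sql_id, COUNT(*) AS cpu_samples
       FROM   dba_hist_active_sess_history
       WHERE  sample_time >= DATE '2017-04-10'
       AND    session_state = 'ON CPU'
       GROUP  BY sql_id
       ORDER  BY cpu_samples DESC)
WHERE rownum <= 10;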
Categories: DBA Blogs

COMPOUND KEY

Tom Kyte - 2 hours 31 min ago
Hi, in the document below I found this text: 'Oracle considers two nulls to be equal when evaluating a DECODE function. Please refer to DECODE for syntax and additional information. Oracle also considers two nulls to be equal if they appear in compo...
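The quoted behaviour is easy to demonstrate with a one-line example:

-- DECODE treats NULL = NULL as a match, unlike an ordinary SQL comparison.
SELECT DECODE(NULL, NULL, 'nulls match', 'nulls do not match') AS decode_result
FROM   dual;
-- returns 'nulls match'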
Categories: DBA Blogs

ORA-00923: FROM keyword not found where expected

Tom Kyte - 2 hours 31 min ago
(SQR 5528) ORACLE OCIStmtExecute error 923 in cursor 16: ORA-00923: FROM keyword not found where expected SQL: SELECT DISTINCT B10.EMPLID from PS_BEN_PROG_PARTIC a10, PS_EMPLOYEES b10, PS_BEN_DEFN_PGM d10 where a10.EMPLID = b10.EMPLID A...
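The statement in the excerpt is cut off, so the exact cause is not visible, but ORA-00923 typically means the parser hit something it could not accept as a column alias before reaching FROM - a missing comma in the select list is the classic case. A small reproduction:

SELECT 1 2 FROM dual;    -- ORA-00923: FROM keyword not found where expected
SELECT 1, 2 FROM dual;   -- works once the missing comma is added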
Categories: DBA Blogs

Time difference between two date/times

Tom Kyte - Fri, 2017-04-28 18:06
Hey guys, hoping you will be able to help with an issue I am having. I currently have information stored in a table as StartDate, stored as a DATE (e.g. 18-APR-17), and StartTime, stored as a NUMBER(4,2), so 10.30am would be stored as 10.3 and 3.45pm...
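The excerpt is truncated, but with a DATE column plus a NUMBER(4,2) 'hours.minutes' column, a common trick is to fold each pair into a single DATE value and then subtract. A hedged sketch with illustrative column and table names (start_date/start_time, end_date/end_time, my_times):

SELECT ( (end_date   + (TRUNC(end_time)   + MOD(end_time,   1) * 100 / 60) / 24)
       - (start_date + (TRUNC(start_time) + MOD(start_time, 1) * 100 / 60) / 24)
       ) * 24 * 60 AS elapsed_minutes
FROM   my_times;

Here 10.3 becomes 10 hours plus 30/60 of an hour, the subtraction of the two DATE values returns days, and the final factor converts days to minutes.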
Categories: DBA Blogs

lob tablespace -- moving a lob

Tom Kyte - Fri, 2017-04-28 18:06
How do I move LOB indexes and LOB segments from one tablespace to another?
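A hedged sketch of the usual approach: ALTER TABLE ... MOVE with a LOB storage clause relocates the LOB segment, and the associated LOB index moves with it automatically (table, column and tablespace names here are illustrative):

ALTER TABLE docs MOVE
  LOB (doc_body) STORE AS (TABLESPACE lob_data_new);

-- The move rebuilds the table segment, so ordinary indexes on the table
-- become UNUSABLE and need rebuilding afterwards:
ALTER INDEX docs_pk REBUILD;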
Categories: DBA Blogs

How to find out, what stored procedure does a commit

Tom Kyte - Fri, 2017-04-28 18:06
This may sound silly, but I'm unable to figure out what damn stored procedure (or something else?) performs a commit after I manually execute another stored procedure. There is complicated business logic implemented using stored procedures and, ...
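One low-tech way to narrow this down is to search the stored code itself for COMMIT, and for AUTONOMOUS_TRANSACTION, since an autonomous block can commit independently of the caller. A hedged sketch, assuming the application code lives in a schema called APP_OWNER:

SELECT owner, name, type, line, text
FROM   dba_source
WHERE  owner = 'APP_OWNER'
AND    (UPPER(text) LIKE '%COMMIT%'
        OR UPPER(text) LIKE '%AUTONOMOUS_TRANSACTION%')
ORDER  BY owner, name, type, line;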
Categories: DBA Blogs

Deliver Reports to Document Cloud Services!

Tim Dexter - Fri, 2017-04-28 16:32

Greetings !

In release 12.2.1.1, BI Publisher added a new feature - delivery to Oracle Document Cloud Services (ODCS). Around the same time, BI Publisher was also certified against JCS 12.2.1.x; therefore, if you have hosted your BI Publisher instance on JCS, we now recommend Oracle Document Cloud Services as the delivery channel. There are several reasons for this:

  1. Easy to configure and manage ODCS in BI Publisher on Oracle Public Cloud. No port or firewall issues.
  2. ODCS offers a scalable, robust and secure document storage solution in the cloud.
  3. ODCS offers document versioning and document metadata support similar to any content management server.
  4. ODCS supports all business document file formats relevant for BI Publisher.

When to use ODCS?

ODCS can be used in any scenario where a document needs to be stored securely on a server and retained for any duration. Such scenarios include:

  • Bursting documents to multiple customers at the same time.
    • Invoices to customers
    • HR payroll reports to employees
    • Financial Statements
  • Storing large or extremely large reports for offline printing
    • End of the Month/Year Statements for Financial Institutions
    • Consolidated department reports
    • Batch reports for Operational data
  • Regulatory Data Archival
    • Generating PDF/A-1b or PDF/A-2 format documents

How to Configure ODCS in BI Publisher?

Configuring ODCS in BI Publisher requires the URI, username and password. The username is expected to have access to the folder where the files are to be delivered.



How to Schedule and Deliver to ODCS?

Delivery to ODCS can be managed through both a Normal Scheduled Job and a Bursting Job.

A Normal Scheduled Job allows the end user to select a folder from a list of values, as shown below:



In the case of a Bursting Job, the ODCS delivery information is provided in the bursting query, as shown below:
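Since the screenshot is not reproduced here, the sketch below shows the general shape of such a bursting query. The standard bursting columns (KEY, TEMPLATE, LOCALE, OUTPUT_FORMAT, DEL_CHANNEL, OUTPUT_NAME, PARAMETER1, ...) are ordinary BI Publisher bursting fields, but the DEL_CHANNEL value and the parameter-to-ODCS mapping shown are assumptions - please check the BI Publisher documentation for the exact values for the ODCS channel:

SELECT customer_id                 AS "KEY",
       'InvoiceTemplate'           AS template,
       'en-US'                     AS locale,
       'PDF'                       AS output_format,
       'DOCS'                      AS del_channel,   -- assumed value for the ODCS channel
       'Invoice_' || customer_id   AS output_name,
       '/Invoices/' || customer_id AS parameter1     -- assumed: target ODCS folder
FROM   invoices;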


Accessing Document in ODCS

Once the documents are delivered to ODCS, they can be accessed by users based on their access to the folder, much like FTP or WebDAV access.

That's all for now. Stay tuned for more updates !

Categories: BI & Warehousing

Oracle OpenWorld 2017: Get Your Proposals In!

WebCenter Team - Fri, 2017-04-28 16:05

We've extended the deadline to submit proposals for Oracle OpenWorld 2017! We're currently accepting proposals for session ideas for the 2017 San Francisco event. But you need to hurry—the deadline for submitting is May 1, 2017! 

Attendees at the conference are eager to hear from experts on Oracle business and technology. They’re looking for insights and improvements they can put to use in their own jobs: exciting innovations, strategies to modernize their business, different or easier ways to implement, unique use cases, lessons learned, the best of best practices.

If you’ve got something special to share with other Oracle users and technologists, they want to hear from you, and so do we. Submit your proposal now for this opportunity to present at Oracle OpenWorld, the most important Oracle technology and business conference of the year.

Installing Hortonworks Data Platform 2.5 on Microsoft Azure

Jeff Moss - Fri, 2017-04-28 16:04

I presented this topic to the Big Data Meetup in Nottingham on Thursday but sometimes people prefer a blog to a presentation, so I’ve fashioned this article from the slides…

This article assumes the following:

Start by navigating to the Azure login page and enter your details. If you have never visited before your screen will look like this:

If you’ve logged in before the page will show your login and you can just click it:

After you login, you’ll arrive at the Dashboard:

Choose the “Marketplace” link at the bottom right, which leads to the following screen where you can type “HDP” and it will show you the options for Hortonworks Data Platform. There are currently two options 2.4 and 2.5 – I chose 2.5:

When you choose 2.5 it will bring up this screen which shows the details of the option you have chosen and offers you the “Create” button to go ahead and start the creation process – click on Create:

After clicking on Create, the process moves on to a five step wizard, the first step of which allows you to choose “Basic options” for the VM. I set the following options:

Name: oramosshdp25sandbox

VM Disk Type: SSD

User name: jeff

SSH Public key: my public SSH key

Subscription: Leave set to Free Trial (if that's what you are using, as per the screenshot, or your Corporate/Pay As You Go subscription if you have one)

Resource Group: Create New called hdp25sandbox_rg

Location: UK West

A screenshot of these options looks like this:

Click on OK and move on to the 2nd step in the wizard for choosing the size of the VM. I chose the DS3_V2 size which seemed to work OK – you might be able to get away with something smaller, perhaps.

Click on Select and move on to step 3 of the wizard which is about configuring optional features. For this step I set the following:

Use managed disks: Yes

Leaving all other options as defaults this looks like:

Click on OK and move on to step 4 which is just a summary of the configuration:

If you’re happy, click on OK and move on to step 5 where you accept the terms of use and “buy” the VM:

If you’re happy, click on Purchase and that’s the end of the wizard. Azure then goes off to deploy the VM, which can take a few minutes. You’ll be returned to the dashboard screen where you’ll see the VM at the top right with the word Deploying on it:

As I say, it takes a few minutes to complete, but when it does, you’ll see a popup notification in the top right of the screen and the VM tile will change to look as below:

So, you now have the Sandbox VM up and running.

The VM by default only has inbound SSH access enabled and can only be accessed by IP address so we’ll make some changes to these next. First we’ll give the VM a DNS name which allows you to access it on the internet via a name rather than an IP address. From the dashboard screen (above) click on the VM and it takes you to this screen:

You’ll notice the Public IP address which is a hyperlink…click on that link and it takes you to the following screen where you can specify the DNS Name which means the machine will have a Fully Qualified Domain Name that you can access via the internet. I set my DNS Name to oramosshdp25sandbox and given I’d previously chosen to use UK West as the location, the Fully Qualified Domain Name is thus oramosshdp25sandbox.ukwest.cloudapp.azure.com as per the screenshot below:

Now, navigate to the Inbound Security Rules page which is under the Network Security Group page (access from the Resource List on the dashboard). Notice that the only rule existing is one to allow inbound SSH communication:

In order to facilitate additional capabilities you should open up a few more ports, as follows:

  • 8888 – HDP
  • 8080 – Ambari
  • 4200 – Web SSH access
  • 50070 – HDFS NameNode UI
  • 21000 – Atlas
  • 9995 – Zeppelin
  • 15000 – Falcon
  • 6080 – Ranger

Click on Inbound Security Rule, which takes you to the page for maintaining these rules, and enter the details for port 8888. I specified the name as default-allow-8888 and the port as 8888, as shown below:

Click on OK to create the rule. Carry out the same process for the other ports.

Now that we’ve undertaken these additional activities we can access the VM using an SSH terminal logging onto oramosshdp25sandbox.ukwest.cloudapp.azure.com as the user you have created (jeff in my case) and the private SSH key:

Whilst you are in the SSH terminal you can reset the Ambari password. This is not strictly necessary unless you want to login to Ambari as admin, but I’ll describe it anyway.

First become root with:

sudo su - root

Now SSH into the Docker Image as root:

ssh root@172.17.0.2

You will be prompted to change the password for root on this first login – the current password is hadoop.

After changing the password run the Ambari password reset process:

ambari-admin-password-reset

Follow the instructions to reset the password and after that it will start the Ambari server process.

Once all that is done, exit out of the sessions and the original SSH terminal.

Now go into HDP via the web interface by logging on to the following URL:

http://oramosshdp25sandbox.ukwest.cloudapp.azure.com:8888

The first time you access this URL you’ll be given a welcome (marketing) page which asks for your details:

Fill out the details and hit Submit which will take you to the main entry page for HDP:

Choose the Launch Dashboard option on the left, which brings up a pair of browser windows that use the entire desktop and show the Ambari login page on the left hand browser and the Tutorials website on the right hand browser like this:

You can use either the admin user that you just reset the password for or the predefined user raj_ops (password raj_ops) to access Ambari. Click on Sign In on the left hand browser once you have entered the credentials, and it takes you into the main Ambari homepage:

This is the main systems management environment for Hortonworks – more documentation here.

If we close this pair of browsers now and go back to the main HDP entry page and choose the Quick Links option on the right we get this page:

From here you can choose to use any of these specific components.

NOTE – I couldn’t get Atlas and Falcon to work – they need more configuration/setup to get them functional. Ranger, Zeppelin and the Web SSH client work fine though.

Just a basic introduction but I hope you find it useful.

Critical Patch Update for April 2017 Now Available

Steven Chan - Fri, 2017-04-28 10:59

The Critical Patch Update (CPU) for April 2017 was released on April 18, 2017. Oracle strongly recommends applying the patches as soon as possible.

The Critical Patch Update Advisory is the starting point for relevant information. It includes a list of products affected, pointers to obtain the patches, a summary of the security vulnerabilities, and links to other important documents. 

Supported products that are not listed in the "Supported Products and Components Affected" Section of the advisory do not require new patches to be applied.

The Critical Patch Update Advisory is available at the following location:

It is essential to review the Critical Patch Update supporting documentation referenced in the Advisory before applying patches.

The next four Critical Patch Update release dates are:

  • July 18, 2017
  • October 17, 2017
  • January 16, 2018
  • April 17, 2018
Categories: APPS Blogs

How to create dashboard for MySQL database

Nilesh Jethwa - Fri, 2017-04-28 10:12

MySQL is the most popular open source SQL database management system. Due to its reliability and easy-to-use structure, MySQL is widely used in almost every industry, by both small and big businesses. Among MySQL users you can find huge companies such as Facebook, YouTube and Twitter.

Using MySQL with InfoCaptor does not require any JDBC driver as it uses native PHP drivers. In case you need to use JDBC drivers, please check this resource. MySQL Connector/J is the official JDBC driver for MySQL.

Are you using MySQL for your data marts or data-warehouse? If so, build your Free MySQL dashboard software.

Read more at http://www.infocaptor.com/ice-database-connect-dashboard-to-mysql-sql

Oracle E-Business Suite 12.2 Mobile Application Security

This is the tenth posting in a blog series summarizing the new Oracle E-Business Suite 12.2 Mobile and web services functionality and recommendations for securing them.

Oracle Corporation has been building out Mobile and Smartphone applications for the Oracle E-Business Suite for a number of releases. Before release 12.2.5, this functionality was designed only for deployment through a corporate VPN, not through an Oracle E-Business Suite external node over the Internet (e.g., a server in the DMZ).

With release 12.2.5, external node deployment for Mobile applications is now an option. 12.2.5 bundles Oracle Mobile v4 and uses the E-Business Suite's WebLogic server. Specifically, 12.2.5 deploys the Oracle Mobile v4 REST services through the OAFM WebLogic application. In other words, with 12.2.5, Smartphone applications can now be deployed over the Internet without the need for a separate SOA Server or WebLogic server.

Oracle Mobile Using Native EBS REST

To secure version 12.2.5 Oracle E-Business Suite Mobile applications, Oracle Mobile Security Services (OMSS) is used. Check with your Oracle sales representative whether OMSS is separately licensed. OMSS provides critical URL shortening as well as whitelisting/blacklisting and other functionality specific to deploying Oracle Mobile applications. OMSS must be properly configured and is placed in front of OAFM.

OMSS in-line before OAFM

If you have any questions, please contact us at info@integrigy.com

-Michael Miller, CISSP-ISSMP, CCSP, CCSK

Web Services, DMZ/External, Oracle E-Business Suite
Categories: APPS Blogs, Security Blogs

Execute an Oracle stored procedure via a database link

Tom Kyte - Thu, 2017-04-27 23:46
Hi, I call a procedure via a database link and get this error: ORA-02064: distributed operation not supported. This is because it needs PRAGMA AUTONOMOUS_TRANSACTION to be applied. But is there any other way? Some of the procedures we need to reuse, a...
Categories: DBA Blogs
