Another annual DOAG conference has passed, and I can only say the very best about it: perfectly organized, a large and modern location, an impressive list of well-known speakers, and over 2100 attendees – wow!
My presentation Best of RMAN was scheduled in the first slot on the first day, so I was a bit concerned whether many people would attend that early. It turned out that the room got so full that the organizers asked me to deliver the same talk again the next day – which I happily did, again to a packed room :-)
Apart from speaking myself, I very much enjoyed seeing friends and colleagues again, as well as people I had known from social media but never met in person before. Thank you all for your good company – I may appear a bit quiet and distanced sometimes, but I really appreciate you and our little conversations. Personal highlight: the lady who approached me on the stairs, shook my hand and told me she knew me from an Oracle University class that she had liked so much :-)
There were many interesting presentations, just to mention some of those I attended myself:
Frits Hoogland with Oracle Exadata and database memory
In his very detailed talk, he emphasized the importance of using HugePages on Exadata, and that MEMORY_TARGET (aka Automatic Memory Management) is not an option there.
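A quick way to sanity-check this on your own system might look like the following sketch (my example, not from Frits's talk):

```sql
-- Hedged sketch: verify HugePages-friendly settings on an instance.
-- MEMORY_TARGET (AMM) allocates the SGA in /dev/shm, which cannot be
-- backed by HugePages, so it should be 0 when HugePages are configured;
-- size the SGA explicitly (or via ASMM) instead.
show parameter use_large_pages   -- ideally TRUE or ONLY
show parameter memory_target     -- should be 0 when HugePages are in use
show parameter sga_target

-- On the Linux side, compare allocated vs. free huge pages:
-- $ grep -i huge /proc/meminfo
```

With USE_LARGE_PAGES set to ONLY, the instance refuses to start rather than silently falling back to regular pages, which makes misconfiguration obvious.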
Jonathan Lewis presented Five Hints for Optimising SQL
I’m always amazed how much this man knows and how well he is able to explain it to us mere mortals :-)
Lothar Flatz was presenting Anatomie eines SQL Befehls (how to do Oracle Performance Tuning with a scientific approach)
During his very entertaining talk, he quoted no less than seven Oakies (including himself), so that was quite good advertising for the OakTable Network :-)
Frank Schneede delivered Eine Datenbank kommt selten allein (DB Cloning on Exadata using sparse diskgroups)
while Ulrike Schwinn presented Oracle DB Memory Techniken für mehr Performance (an overview of the many different ways there now are to deal with memory in Oracle)
Couldn’t really catch her because she was running out of the picture all the time :-)
Martin Berger also did an interesting talk: Wie misst Oracle die Verwendung von Database Features? (How to make sense of DBA_FEATURE_USAGE STATISTICS)
I very much liked his presentation style, with its many live demonstrations!
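A starting point for exploring that view might be a query like this (my own sketch, not taken from Martin's demos):

```sql
-- List the features Oracle has detected as used, most recently sampled first.
select name, version, detected_usages, currently_used, last_usage_date
  from dba_feature_usage_statistics
 where detected_usages > 0
 order by last_usage_date desc;
```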
My dear colleague Joel Goodman talked about Automatic Parallel Execution
Joel is what I call a ‘Standup Instructor’ – mention any database related topic and he will be able to deliver an excellent 30-minute talk about it instantly :-)
A big THANK YOU to the many people from DOAG who helped to make that event take place again in such an impressive way! Hope to be there again next year.
First we see the facelift in the Open dialog. It's now the typical Windows dialog, which gives you much more flexibility.
New features in the Convert-Dialog:
- New Feature "Converting to and from XML"
- Checkbox "Overwrite"
- Checkbox "Keep window open", useful if you have to convert many files at once.
Preferences-Dialog, Tab "General":
- The checkbox "Hide PL/SQL Compile Dialog" is new
- Web Browser location (in Forms 11g this was on Tab "Runtime")
- Compiler Output is new
Tab "Subclass": No changes
Tab "Wizards": No changes
Tab "Runtime":
- The checkbox "Show URL Parameters" is new
- The Application Server URL field is much bigger!
- The Web Browser location has moved to Tab "General"
Are you attending UKOUG Apps15 (#ukoug_apps15) or Tech15 (#ukoug_tech15)? If so, you are in luck! Once again we will run our ever popular scavenger hunt with a twist. From December 7-9 we will be at the ICC in Birmingham, UK working with the UKOUG team to give you a fun way to explore the conference and win prizes along the way.
If you’re not familiar with this game, it is as if we stamp a card (or your arm!) every time you complete a task. But instead of a stamp, we use IoT technologies: Raspberry Pi, NFC stickers, and social media to give you points. This time we will hold a daily drawing. You only need to enter once, but the more tasks you complete, the more chances you have to win. Each day has different tasks.
This is a great way to discover new things, make friends, and enjoy the conference from a different perspective.
You can pre-register here http://bit.ly/UKOUG15Explorer or come over during registration so we can set you up. See you there!
At this year’s Web Summit in Dublin, Ireland, I had the opportunity to observe thousands of attendees. They came from 135 different countries and represented different generations.
Despite these enormous differences, they came together and communicated.
But how? With all of the hype about how different communication styles are among the Baby Boomers, Gen Xers, Millennials, and Generation Zers, I expected to see lots of small groupings of attendees based on generation. And I thought that session audiences would mimic this, too. But I could not have been more wrong.
How exactly, then, did speakers, panelists, and interviewers keep the attention of attendees in the 50+ crowd, the 40+ crowd, and the 20+ crowd while they sat in the same room?
The answer is far simpler than I could have imagined: Authenticity. They kept their messages simple, specific, honest, and in context of the audience and the medium in which they were delivering them.
Web Summit: Estée Lalonde (@EsteeLalonde) in conversation at the Fashion Summit session "Height, shoe size and Instagram followers please?"
Simplicity in messaging was key across Web Summit sessions: Each session was limited to 20 minutes, no matter whether the stage was occupied by one speaker or a panel of interviewees. For this to be successful, those onstage needed to understand their brands as well as the audience and what they were there to hear.
Attention spans are shortening, so it’s increasingly critical to deliver an honest, authentic, personally engaging story. Runar Reistrup of Depop put it well at the Web Summit:
Web Summit: Runar Reistrup (@runarreistrup) in conversation during the Fashion Summit session "A branding lesson from the fashion industry"
While lots of research, thought, and hard work goes into designing and building products, today’s brand awareness is built with social media. Users need to understand the story you’re telling but not be overwhelmed by contrived messaging.
People want to connect with stories and learn key messages through those stories. Storytelling is the important challenge of our age. And how we use each social medium to tell a story is equally important. Storytelling across mediums is not a one-size-fits-all experience; each medium deserves a unique messaging style. As Mark Little (@marklittlenews), founder of Storyful, makes a point of saying, "This is the golden age of storytelling."
The Oracle Applications User Experience team recognizes the significance of storytelling and the importance of communicating the personality of our brand. We take time to nurture connections and relationships with those who use our applications, which enables us to empathize with our users in authentic ways.
Web Summit: Áine Kerr (@AineKerr) talking about the art of storytelling
The Oracle simplified user interface is designed with consideration of our brand and the real people—like you—who use our applications. We want you to be as comfortable using our applications as you are having a conversation in your living room. We build intuitive applications that are based on real-world stories—yours—and that solve real-world challenges, helping to make your work easier.
We experiment quite a bit, and we purposefully “think as if there is no box.” (Maria Hatzistefanis, Rodial)
Web Summit: Maria Hatzistefanis (@MrsRodial) in conversation during the Fashion Summit session "Communication with your customer in the digital age"
We strive to find that authentic connection between the simplified user interface design and the user. We use context and content (words) to help shape and inform the message we promote on each user interface page. We choose our words, and our tone, carefully because we recognize the significance of messaging, whether the message is a two-word field label or a tweet. And we test, modify, and retest our designs with real users before we build applications, to ensure that the designs respond to you and your needs.
If you want to take advantage of our design approach and practices, download our simplified user experience design patterns eBook for free and design a user experience that mimics the one we deliver in the simplified user interface. And if you do, please let us know what you think at @usableapps.
This marks the 1000th post to the Infogram. I am awarding myself a low-carb lollipop.
Little things to know about ... Oracle Partitioning (part one of hopefully many), from The Data Warehouse Insider.
Oracle VM Performance and Tuning - Part 4, from Virtually All The Time.
Changing Appearances: Give The Apps Your Corporate Look, from Fusion Applications Developer Relations.
Offloading row level security to Smart Scan, from SmartScan Deep Dive.
Patch Set Update: Oracle Data Relationship Management 220.127.116.11.321, from Business Analytics - Proactive Support.
Oracle and Adaptive Case Management: Part 1 , from SOA & BPM Partner Community Blog.
Remote debugging of nashorn scripts with NetBeans IDE using "debugger" statements, from A. Sundararajan's Weblog.
From the Oracle E-Business Suite Support blog:
EBS HCM December Updates To Employer Shared Responsibility Reporting Under The Affordable Care Act
From the Oracle E-Business Suite Technology blog:
WebCenter Portal 18.104.22.168 Certified with E-Business Suite 12.2
We ran a five-day Oracle Security training event in York, England from September 21st to September 25th at the Holiday Inn hotel. This proved to be very successful and good fun. The event included back-to-back teaching by....[Read More]
Posted by Pete On 22/10/15 At 08:49 PM
I wrote a presentation on designing and building practical audit trails back in 2012 and presented it once and then never again. By chance I did not post the PDFs of these slides at that time. I did though some....[Read More]
Posted by Pete On 01/10/15 At 05:16 PM
Oracle Application Express is a great rapid application development tool where you can write your applications functionality in PL/SQL and create the interface easily in the APEX UI using all of the tools available to create forms and reports and....[Read More]
Posted by Pete On 21/07/15 At 04:27 PM
How does Oracle Security and Electronic mix together? - Well I started my working life in 1979 as an apprentice electrician in a factory here in York, England where I live. The factory designed and built trains for the national....[Read More]
Posted by Pete On 09/07/15 At 11:24 AM
In the last few years I have not done as many conference speaking dates as I used to. This is simply because when offered they usually clashed with pre-booked work. I spoke for the UKOUG in Dublin last year and....[Read More]
Posted by Pete On 06/07/15 At 09:40 AM
Make a Sad Face..:-( I seem to have missed my blog's tenth birthday, which happened on the 20th September 2014. My last post last year, and until very recently, was on July 23rd 2014; so actually it's been a big gap....[Read More]
Posted by Pete On 03/07/15 At 11:28 AM
I wrote a paper about Oracle Database Vault in 12c for SANS last year and this was published in January 2015 by SANS on their website. I also prepared and did a webinar about this paper with SANS. The Paper....[Read More]
Posted by Pete On 30/06/15 At 05:38 PM
I have just updated all of our Oracle Security training offerings on our company website. I have revamped all class pages and added two-page pdf flyers for each of our four training classes. I have also updated the list....[Read More]
Posted by Pete On 25/06/15 At 04:36 PM
Using legal threats as an extension of your marketing is a bad idea. At least, it’s a bad idea in the United States, where such tactics are unlikely to succeed, and are apt to backfire instead. Splunk seems to actually have had some limited success intimidating Sumo Logic. But it tried something similar against Rocana, and I was set up to potentially be collateral damage. I don’t think that’s working out very well for Splunk.
Specifically, Splunk sent a lawyer letter to Rocana, complaining about a couple of pieces of Rocana marketing collateral. Rocana responded publicly, and posted both the Splunk letter and Rocana’s lawyer response. The Rocana letter eviscerated Splunk’s lawyers on matters of law, clobbered them on the facts as well, exposed Splunk’s similar behavior in the past, and threw in a bit of snark at the end.
Now I’ll pile on too. In particular, I’ll note that, while Splunk wants to impose a duty of strict accuracy upon those it disagrees with, it has fewer compunctions about knowingly communicating falsehoods itself.
1. Splunk’s letter insinuates that Rocana might have paid me to say what I blogged about them. Those insinuations are of course false.
Splunk was my client for a lot longer, and at a higher level of annual retainer, than Rocana so far has been. Splunk never made similar claims about my posts about them. Indeed, Splunk complained that I did not write about them often or favorably enough, and on at least one occasion seemed to delay renewing my services for that reason.
2. Similarly, Splunk’s letter makes insinuations about quotes I gave Rocana. But I also gave at least one quote to Splunk when they were my client. As part of the process — and as is often needed — I had a frank and open discussion with them about my quote policies. So Splunk should know that their insinuations are incorrect.
3. Splunk’s letter actually included the sentences
Splunk can store data in, and analyze data across, Splunk, SQL, NoSQL, and Hadoop data repositories. Accordingly, the implication that Splunk cannot scale like Hadoop is misleading and inaccurate.
I won’t waste the time of this blog’s readers by explaining how stupid that is, except to point out that I don’t think Splunk executes queries entirely in Hadoop. If you want to consider the matter further, you might consult my posts regarding Splunk HPAS and Splunk Hunk.
4. I and many other people have heard concerns about the cost of running Splunk for high volumes of data ingest. Splunk’s letter suggests we’re all making this up. This post suggests that Splunk’s lawyers can’t have been serious.
Monday 7th Dec: 11:20 – 12:05
I’ve arranged a panel session on the Cost Based Optimizer for UKOUG Tech 2015, with Maria Colgan, Nigel Bayliss, and Chris Antognini joining me to answer (and maybe argue about) questions from the audience.
To keep things moving along, we aim to have a few questions available before the day and collect a few questions at the door as well as accepting questions from the floor. Martin Widlake and Neil Chandler will be acting as MCs, wielding the microphones, and collecting questions at the door.
So far we have a handful of questions – if you have a question you’d like to raise, add it to the comments below. Even if you can’t attend we may get to your question, and I may be able to write up a summary of questions and answers after the event.
Questions so far:
- Why do queries that have been working fine for months suddenly change to a very slow plan?
- Are smaller, simpler queries less likely to have a bad plan, or to change from a good plan to a bad one?
- What is the downside of using SQL Profiles to lock in plans?
- What’s the best way to collect stats on a partitioned table where you add a daily / weekly / monthly partition?
- Is there a good way to make sure you’ve collected stats for interval partitions when you don’t know they’ve been added?
- Why do the manuals talk about “prefixed” local indexes (on partitioned tables) as if they have some special effect?
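For the partitioned-table stats questions, one commonly suggested approach (sketched here with hypothetical owner and table names, not as a panel answer) is incremental statistics, available from 11g onwards:

```sql
-- Maintain partition-level synopses so that a global stats refresh
-- only rescans partitions whose data has actually changed.
exec dbms_stats.set_table_prefs('APP_OWNER', 'SALES', 'INCREMENTAL', 'TRUE')

-- With GRANULARITY left at its default of AUTO, this gathers stats on
-- changed partitions and derives fresh global stats from the synopses.
exec dbms_stats.gather_table_stats('APP_OWNER', 'SALES')
```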
Bryn very kindly stepped in for Tom Freyer who unfortunately couldn’t make it. Bryn’s presentation was called Why use PL/SQL?, which was based on his whitepaper of the same name. I’ve written about a number of the topics he raised, including a piece on My Utopian Development Environment, but his talk and his whitepaper put forward a much more consistent and compelling argument. Definitely worth checking out if you are a PL/SQL expert or newbie!
Along the way, he used a little piece of syntax I had never noticed before, which has been available for the whole 20+ years I’ve been using PL/SQL. I’ll write about that in a post tomorrow if I get the chance.
By the way, Bryn loves being called Brian, and he loves it when you say S.Q.L. and PL/S.Q.L. rather than Sequel and PL/Sequel.
Next up was Neil with a session called “Why has my plan changed?”. I’ve written about many of the issues discussed in this session, but Neil crystallised a whole array of points really well, and used live demos which I always like to see. Regardless of how well you think you know the subject, I would be surprised if you don’t walk away with a few new things, which you can later pretend you always knew.
Both guys will be presenting these sessions at UKOUG Tech15 if you want to catch up with them!
Thanks to the guys for coming to speak to us. Thanks to Bryn’s mom, for living in the UK, giving him an excuse to come to UKOUG Tech15 early and therefore be available to step in at short notice. Thanks as always to Red Stack Tech for sponsoring the event, allowing it to remain free, and to Mike for organising it!
Tim…
What did you miss at Oracle OpenWorld regarding WebCenter, The Cloud, and Front-End Website and Portal Development? Find out in this webinar.
Fishbowl Solutions has been attending and sponsoring Oracle OpenWorld for many years, and every year we get questions from customers asking why they should attend, how Oracle OpenWorld differs from Collaborate, and, if they do attend, which sessions they should go to. Although there is no single answer to these questions, our general response is that OpenWorld provides so many continuous learning, networking, and fun events that going isn’t hard to justify. However, the old saying “you get out of it what you put into it” definitely applies to OpenWorld. The event is so big (60,000+ attendees), so busy (1,000+ sessions), and so much fun (Elton John concert) that first-time attendees and even alumni need to build out their schedules before the event. This helps ensure they go to the sessions that will teach them something new that they can apply in their work, or that they learn the road map for the product areas they manage so they can understand and plan for upgrades, new technology, or integrations.
Even with this proactive planning, attending OpenWorld just isn’t an option for all WebCenter customers, so my colleagues and I at Fishbowl will be hosting a webinar on December 10th that will give you a recap of what you missed as it applies to Oracle WebCenter 12c; The Cloud – Sites Cloud Service, Documents Cloud Service, Mobility Cloud Service, and Process Cloud Service; and new front-end development technologies that can be used for WebCenter-based websites and portals. The webinar will include a Q & A session, so even if we don’t cover a topic that you are interested in or have a question about, our WebCenter and Cloud experts will be available to fill you in. The webinar details are included below, and we hope you will be able to join us.
Date: December 10, 2015
Time: 1:00 – 2:00 PM EST, 12:00 – 1:00 PM CST
The Old Ways of Doing Things
Traditionally, data management has been a cumbersome process, to say the least. Usually, it means data replication, data management or using intermediary connectors and servers to pull off point-to-point integration. Of course, in some situations, it’s a combination of the three.
Like we just said, though, these methods were never really ideal. Instead, they were just the only options given the complete lack of alternatives available. That’s the main reason you’re seeing these methods less and less. The moment something better came along, companies jumped all over it. However, their diminishing utility can also be traced to three main factors. These would be:
- High costs related to data movement
- The astronomical growth in data (also referred to as Big Data)
- Customers that expect real-time information
Introducing Data Virtualization
It was only a matter of time before this problem was eventually addressed. Obviously, when so many companies are struggling with this kind of challenge, there’s quite the incentive for another one to solve it.
That’s where data virtualization comes into play. Companies that struggle with critical information spread across the entire enterprise, in all kinds of formats and locations, no longer have to worry about the hardship of getting their hands on it. Instead, they can use virtualization platforms to search out what they need.
Flexible Searches for Better Results
It wouldn’t make much sense for this type of software to not have a certain amount of agility built in. After all, that’s sort of its main selling point. The whole reason companies invest in it is because it doesn’t get held back by issues with layout or formatting. Whatever you need, it can find.
Still, for best results, many now offer a single interface that can be used to separate and extract aggregates of data in all kinds of ways. The end result is a flexible search that can be leveraged toward all kinds of ends. It’s no longer just about being able to find any type of information you need, but about finding it in the most efficient and productive way possible.
Keep Your Mainframe
One misconception that some companies have about data virtualization is that it will need certain adjustments to be made to your mainframe before it can truly be effective. This makes sense because, for many platforms, this is definitely the case. These are earlier versions, though, and some that just aren’t of the highest quality.
With really good versions, though, you can basically transform your company’s mainframe into a virtualization platform. Such an approach isn’t just cost-effective. It also makes sure you aren’t wasting resources, including time, addressing the shortcomings of your current mainframe, something no company wants to do.
Don’t get turned off from taking a virtualization approach to your cache of data because you’re imagining a long list of chores that will be necessary for transforming your mainframe. Instead, just be sure you invest in a high-end version that will actually transform your current version into something much better.
A Better Approach to Your Current Mainframe
Let’s look at some further benefits that come from taking this approach. First, if the program you choose comes with the use of a high-performance server, you’ll immediately eliminate the redundancy of integrating from point-to-point. This will definitely give you better performance in terms of manageability. Plus, if you ever want to scale up, this will make it much easier to do so.
Proper data migration is key to a good virtualization process. If it is done right, the end user won’t have to worry about corrupted data, and communication between machines will be crystal clear. If you divert the processing-intensive data mapping and transformation processes away from your mainframe’s General Purpose Processor to the zIIP specialty engine, you can dramatically reduce your MIPS capacity usage and, therefore, also reduce your company’s TCO (Total Cost of Ownership).
Lastly, maybe you’d like to exploit every last piece of value you can derive from your mainframe data. If so, good virtualization software will not only make this possible, but do so in a way that lets you turn all of your non-relational mainframe data into relational formats that any business analytics or intelligence application can use.
Key Features to Look for in Your Virtualization Platform
If you’re now sold on the idea of investing in a virtualization platform, the next step is getting smart about what to look for. As you can imagine, you won’t have trouble finding a program to buy, but you want to make sure it’s actually going to be worth every penny.
The first would be, simply, the amount of data providers available. You want to be able to address everything from big data to machine data to syslogs, distributed and mainframe. Obviously, this will depend a lot on your current needs, but think about the future too.
Then, there’s the same to think about in terms of data consumers. We’re talking about the cloud, analytics, business intelligence and, of course, the web. Making sure you will be able to stay current for some time is very important: technology changes quickly, and the better your virtualization platform is, the longer you’ll have before having to upgrade. Look closely at the migration process, and at whether or not the provider can work with your IT team to improve workflow. This will help your company get back on track more quickly and with better results.
Finally, don’t forget to look at costs, especially where scalability is concerned. If you plan on getting bigger in the future, you don’t want that to take a burdensome investment. As you can see, virtualization platforms definitely live up to the hype. You just have to be sure you spend your money on the right kind.
Mike Miranda writes about enterprise software and covers products offered by software companies like Rocket software about topics such as Terminal emulation, Enterprise Mobility and more.
Oracle 12c introduced a new capability to move a partition online, without any interruptions to DML happening at the same time. But, there’s a catch. So far we’ve been able to use basic table compression without having to worry about any extra licensing – it was just a plain EE feature.
If you are planning to use the online partition move functionality, carefully check whether you are using basic compression anywhere. For example:
create tablespace data datafile '+DATA' size 1g;

create user foo identified by bar
  default tablespace data
  quota unlimited on data;

grant create session, create table to foo;

create table test (x int, y varchar2(20))
partition by range (x)
(
  partition p1 values less than (100) tablespace data compress,
  partition p2 values less than (200) tablespace data,
  partition p3 values less than (300) tablespace data
);
So we now have this, and our licensing is still as we know it:
select partition_name, compression, compress_for from user_tab_partitions;
PARTITION_NAME COMPRESS COMPRESS_FOR
------------------------------ -------- ------------------------------
P1 ENABLED BASIC
We can use the new feature on partition p3:
alter table test move partition p3 online;
Or, we can use the traditional means to compress the partition p2:
alter table test move partition p2 compress;
But as soon as we do this move “online”, we are required to purchase the Advanced Compression Option:
alter table test move partition p2 compress online;
And, even sneakier:
alter table test move partition p1 online;
Notice how partition p1 – which was previously compressed – also was online moved to a compressed format:
select partition_name, compression, compress_for from user_tab_partitions;

PARTITION_NAME COMPRESS COMPRESS_FOR
------------------------------ -------- ------------------------------
P1 ENABLED BASIC
P2 ENABLED BASIC
And that, therefore, required the Advanced Compression Option.
Also note that the usage of this is not caught by dba_feature_usage_statistics (tested on 22.214.171.124):
select name, currently_used from dba_feature_usage_statistics where lower(name) like '%compress%';
Oracle Advanced Network Compression Service FALSE
Backup ZLIB Compression FALSE
Backup BZIP2 Compression FALSE
Backup BASIC Compression FALSE
Backup LOW Compression FALSE
Backup MEDIUM Compression FALSE
Backup HIGH Compression FALSE
Segment Maintenance Online Compress FALSE
Compression Advisor FALSE
SecureFile Compression (user) FALSE
SecureFile Compression (system) FALSE
Advanced Index Compression FALSE
Hybrid Columnar Compression FALSE
Hybrid Columnar Compression Row Level Locking FALSE
15 rows selected.
I also tried to bounce the database and the data wasn’t updated in my tests. I would’ve expected this to show up under “Segment Maintenance Online Compress”, but in my tests, it did not.
This feature restriction isn’t documented anywhere in the official product documentation – at least not that I could find. The only place where I could find this information was in this Oracle document.
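Before relying on online partition moves, it may be worth scanning the data dictionary for existing basic-compressed segments; a sketch (my own, assuming access to the DBA views):

```sql
-- Find all table partitions still using basic table compression,
-- i.e. the ones where an online move would pull in the
-- Advanced Compression Option.
select table_owner, table_name, partition_name
  from dba_tab_partitions
 where compression = 'ENABLED'
   and compress_for = 'BASIC';
```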