Feed aggregator

Update on InteliVideo

Bradley Brown - Sat, 2012-05-05 01:54
I'm making it easy for people to upload their content, create classes, and ultimately monetize their knowledge.

We all know that DVDs are going away. Everyone is watching movies on their iPads and iPhones now; you see kids watching movies on iPhones at restaurants. My company, InteliVideo, has built a platform that helps companies with libraries of DVDs move into the world of streaming and downloadable videos for any device. It's a marketplace, and companies can do everything themselves - it's entirely self-service. We collect the money, keep track of who bought what and how long they can watch it, and ultimately deliver the content using our cloud-based platform.

This is an example of an IFRAME that an owner can pull off my site.

Here are the main categories:

Here is a specific category:

And another - try rolling your mouse over these:

And the final category:

What are the JCA Adapters that are officially supported to work with Oracle Service Bus 11.1.1.6?

Ramkumar Menon - Thu, 2012-05-03 16:58

Refer to Section 25.2.1 of the Oracle Service Bus Developer's Guide. An excerpt from the section is inlined below.

Technology Adapters

1. Custom JCA adapters
2. Oracle JCA Adapter for AQ
3. Oracle JCA Adapter for Database
4. Oracle JCA Adapter for Files/FTP
5. Oracle JCA Adapter for Sockets
6. Oracle BAM Adapter (Business Activity Monitoring)

Application Adapters

1. Oracle Adapter for Oracle Applications
2. PeopleSoft (Oracle Application Adapters 10g)
3. SAP R/3 (Oracle Application Adapters 10g)
4. Siebel (Oracle Application Adapters 10g)
5. J.D. Edwards (Oracle Application Adapters 10g)

If You Are Interested In OUD - You Need To Be Reading Sylvain Duloutre's Blog

Mark Wilcox - Wed, 2012-05-02 14:38
My colleague Sylvain Duloutre is writing a series of posts about Oracle Unified Directory (OUD), including how to co-habitate with and migrate from DSEE to OUD, which is how we believe most existing DSEE customers who adopt OUD will make the move. You can read his blog here.

An only too easy trap...

Nuno Souto - Wed, 2012-05-02 07:20
The few who follow this blog know I don't at all like the way Oracle is slowly forcing customers to use the OCM and direct Oracle support links for patching and upgrades. Why? It's very simple: our db servers are in an intranet, in a designated set of subnets that will N-E-V-E-R be open to anything past the DMZ. And even then only as the originators, and to a known IP address. This, I stress...

Collaborate 2012 – one of the best ever!

Andrews Consulting - Fri, 2012-04-27 10:14
Attendance at Collaborate this year was up roughly 50% from 2011 with over 1,100 people from JDE customer organizations and a total of over 6,000 attendees. The move back to Vegas certainly helped but the turnout is also a sign of an improving business atmosphere and a renewed interest in investing in IT. Monday at […]
Categories: APPS Blogs

Interested in learning about Database Life Cycle Management via EM 12c ?

Pankaj Chandiramani - Fri, 2012-04-27 05:43

Database Life Cycle Management via EM 12c eliminates manual and time-consuming tasks related to discovery, database provisioning, patching, configuration management, ongoing change management, and disaster protection automation.


Click here to register for a live webcast.

Categories: DBA Blogs

Webcast :Unleash the Power of Oracle Enterprise Manager to Build Private Clouds

Pankaj Chandiramani - Wed, 2012-04-25 02:05

There is an interesting webcast happening on 25th April on building a private cloud using EM, covering everything from setup to metering and chargeback. Below are the details:



Unleash the Power of Oracle Enterprise Manager to Build Private Clouds


Cloud is undergoing rapid adoption. As IT organizations get flooded with requests for new resources, they are looking for large-scale automation for provisioning, monitoring and administering resources. In this highly technical session, you will learn from architects how the Oracle Enterprise Manager 12c Cloud Management Pack can enable IaaS (Infrastructure as a Service) and DBaaS (Database as a Service), involving self-service provisioning, monitoring and chargeback in your enterprise, and help you develop an enterprise private cloud quickly.

In this live session, we will demonstrate and review the following cloud-related topics:



  • Plan and Setup the Cloud Environment
  • Enabling Consolidation Cost Savings
  • Self Service Access
  • Monitor and Manage Cloud Operations
  • Meter, Chargeback, and Optimize Cloud Services


Click here to register for a live webcast.


Categories: DBA Blogs

Why Oracle President Mark Hurd won’t be at Collaborate

Andrews Consulting - Mon, 2012-04-23 11:41
At the first few Collaborate conferences after Oracle acquired PeopleSoft and JD Edwards then President Charles Phillips gave the opening keynote speech. A key message was that Oracle cared deeply about all of its application customers and especially those new to Oracle. I was fortunate enough to interview him on two of those occasions. His […]
Categories: APPS Blogs

Teradata Universe - Dublin

Donal Daly - Sun, 2012-04-22 13:29
Back in Dublin, for the Teradata Universe Conference. Looking forward to talking with customers and colleagues from around Europe.

We have a booth for Aster as part of the main Teradata booth on the Expo floor. I went to check it out today and get it ready - looks good. You'll find me, Vic Winch, or Mike Whelan there; drop by to say hello. This is how it looks post-setup:





And from the outside, all quiet at the moment as it is Sunday afternoon.

On Wednesday, Duncan Ross and I will host the Analytics for Big Data SIG.

It is going to be a very busy week!

Oracle, APEX and Longevity!

Duncan Mein - Fri, 2012-04-20 13:46
I have just read the latest Statement of Direction from Oracle (download here), and there are some comforting statements about Oracle's long-term commitment to Application Express.

For those like me who have been using APEX since the early days, this is good news, as it represents a very clear signal that Oracle are serious about APEX as a key development framework.

Nice read!
Cheers Oracle.

Navigating the hype around Big Data to yield business value from it

Donal Daly - Fri, 2012-04-20 07:16
As the Big Data phenomenon continues to gather momentum, more and more organizations are starting to recognize the unexploited value in the vast amounts of data they hold. According to IDC, the Big Data technology and services market will grow to about $17 billion by 2015, seven times the growth rate of the overall IT market.

Despite the strong potential commercial advantage for business, developing an effective strategy to cope with existing and previously unexplored information could prove tough for many enterprises.

In many ways, this is because the term ‘Big Data’ itself is somewhat misleading. One definition is in terms of terabytes and petabytes of information that common database software tools cannot capture, manage and process within an acceptable amount of time. In reality, data volume is just one aspect of the discussion and arguably the most straightforward issue that needs to be addressed.

As Gartner points out: 'the complexity, variety and velocity with which it is delivered combine to amplify the problem substantially beyond the simple issues of volume implied by the popular term Big Data.' For this reason, 'big' really depends on the starting point and the size of the organization.

With so much being written about Big Data these days, it can prove difficult for enterprises to implement strategies that deliver on the promise of Big Data Analytics. For example I have read many online articles equating "MapReduce" with "Hadoop" and "Hadoop" with "Big Data". 

MapReduce is, of course, a programming model that enables complex processing logic, expressed in Java and other programming languages, to be parallelised efficiently, permitting execution on "shared nothing", scale-out hardware architectures; Hadoop is one implementation of the MapReduce programming model. There are other implementations of the MapReduce model - and there are other approaches to parallel processing that are a better fit for many classes of analytic problems. However, we rarely see these alternatives discussed.

Another interesting assertion I read - and am sometimes confronted with by customers new to Hadoop - is the positioning of Hadoop as an alternative to existing, SQL-based technologies, one likely to displace or even entirely replace them. This can often lead to an interesting discussion, but you could summarize it as follows: Hadoop lacks important capabilities found in a mature and sophisticated data warehouse RDBMS, for example query re-write and cost-based query optimization; mixed-workload management; security, availability and recoverability features; and support for transactions.

There is, of course, a whole ecosystem springing up around Hadoop - including HBase, Hive, Mahout and ZooKeeper, to name just four - and some commentators argue that in time these technologies may extend Hadoop to the point where this ecosystem could provide an alternative to existing Data Warehouse DBMS technology. Possibly, but I would suggest that they have a long and arduous path to reach such a goal.

None of which is to say that Hadoop is not an extremely interesting and promising new technology - because clearly it is, and it has a role as enterprises embrace Big Data Analytics. There is evidence today, from leading e-business companies, that Hadoop scales well - and has a unit cost of storage that will increasingly make it possible for organizations to "remember everything", by enabling them to retain data whose value for analytics is as yet unproven.
  
 
Hadoop may become the processing infrastructure that enables us to process raw, multi-structured data and move it into a "Big Analytic" environment - like Teradata-Aster - that can more efficiently support high-performance, high-concurrency manipulation of the data, whilst also providing improved usability and manageability, so that we can bring this data to a wider audience. The final stage in this "Big Data value chain" will then see us move the insights derived from processing the raw multi-structured data in these "upstream" environments into the Data Warehouse, where they can most easily and most efficiently be combined with other data - and shared with the entire organization - in order to maximize business value.

Teradata continues to invest in partnerships with leading Hadoop distributors Cloudera and Hortonworks - and to develop and enhance integration technology between these environments and the Teradata and Teradata-Aster platforms.

The discovery-oriented nature of Big Data, and its relative immaturity compared with traditional analytics, arguably means that it doesn't sit well within the IT department, because requirements can never be fully defined in advance. Nor should it logically fall to business analysts used to traditional BI tools.

As a result, a new role has emerged for data scientists, who are not technologists but are also not afraid of leveraging technology. Rather than seeking an answer to a business question, this new professional is more concerned with what the question should be. The data scientist will look for new insights from data, using it as a visualization tool rather than a reporting tool.

In future, many believe that having this type of individual on staff will also be key to generating maximum value from Big Data. In the meantime, the onus will invariably fall to the CIO to prepare and act for a changing Big Data landscape.

Customers can be assured that Teradata will continue to be their #1 strategic advisor for their data management and analytics. We continue to provide compelling and innovative solutions with Teradata Aster and Teradata IDW appliances. We will also work with best-in-class partners to provide choices in integrated solutions and reference architectures to help customers maintain competitive advantage with their data.

What is certain is that interesting times lie ahead, and that those enterprises that can successfully execute a Big Data strategy will gain competitive advantage from the valuable insights gained from Big Data Analytics.

Signing back on...

Donal Daly - Fri, 2012-04-20 06:27
Wow... Has it been nearly 5 years since my last post...

Well, it has been an incredibly interesting and challenging time. I spent time at Informatica as VP of R&D for their Data Quality business, where I met some great people and learned a lot. I then moved to Pocket Kings (the company behind Full Tilt Poker) as Director of Database Systems - the first time in my career on the customer/IT side of the fence. It was challenging, but working with very bright and committed people you learn a lot. It is also where I got the bug for Big Data, through running a 30+ TB, 40+ node Aster Data nCluster. After Teradata acquired Aster Data about a year ago, I was lucky to be offered a position spearheading the adoption of Aster Data in EMEA, both internally within Teradata and with the Teradata customer base.

I am now living and working in the UK. I am still an Apple nut, still a Porsche nut, maybe still a nut period :-)

So what you can expect from this blog is my musings on Big Data and insights gleaned from working with customers to deliver business value from it. I will probably still post about my passion for Porsche - I'm lucky to have a tweaked (540 bhp) 996 Turbo as my daily driver.

Enjoy and I welcome your feedback...

Announcing Oracle Optimized Solution for Oracle Unified Directory

Mark Wilcox - Fri, 2012-04-20 02:03
I'm happy today to be able to share that we have released an Optimized Solution for Oracle Unified Directory. It's one of the first public announcements we can make of several cool and useful things we've been working on, and we have more coming from the identity & access team. Which reminds me - for my loyal readers here - since December 2011, besides covering directory, I am also on the Oracle Access Manager Suite team.

My colleague Sylvain's post summed up nicely what it is:

Oracle Optimized Solution for Oracle Unified Directory is a complete solution - software and hardware engineered to work together. It implements Oracle Unified Directory software on Oracle's SPARC T4 servers to provide highly available and extremely high performance directory services for the entire enterprise infrastructure and applications. The solution is architected, optimized, and tested to deliver simplicity, performance, security, and savings in enterprise environments. More details are available at http://www.oracle.com/us/solutions/1571310

While that post is short, it is dense with information. To explain it more simply: within Oracle we have a team (Optimized Solutions) who work with our product teams to show how customers can get the best performance out of our hardware when running a specific software package. Instead of just giving you a generic tuning guide for our product, we've gone through the tuning steps and tested the configuration(s) for you. Besides giving you great performance, this means faster and simpler deployment, because you can reduce the time it takes to run a tuning exercise from scratch - we've already done most (if not all) of the work for you. Click here to learn more about our Optimized Solution for Oracle Unified Directory.

SQL Tuning Advisor - 101

Barry McGillin - Thu, 2012-04-19 02:53
The DBMS_SQLTUNE package is the interface for tuning SQL on demand. Its doc pages are here; have a look. There is a lot to do to set up a tuning task, run it, report on it, and then get it to do something useful. We've wrapped all of that into our SQL Tuning Advisor function, which means you don't need to start writing PL/SQL API calls to make this work. Stick in your dodgy query, click the advisor button, and visualize the results.

Here's a look at how to do this.  Firstly, we need to grant a few permissions to our user. I'm doing this as sys.
 Then, for this demo, I want to clean out all the statistics on the tables I want to look at.
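The screenshots for these two steps didn't survive the feed, but the underlying SQL is roughly this (the user and table names follow the HRDEMO example used below; adjust them to your own schema):

```sql
-- As SYS: allow the demo user to run the tuning advisor
GRANT ADVISOR TO hrdemo;

-- As the demo user: throw away the optimizer statistics on the demo tables
BEGIN
  DBMS_STATS.DELETE_TABLE_STATS(ownname => 'HRDEMO', tabname => 'EMPLOYEES');
  DBMS_STATS.DELETE_TABLE_STATS(ownname => 'HRDEMO', tabname => 'DEPARTMENTS');
END;
/
```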
 Now, here's my initial query, getting total and mean salary grouped by departments.
When we then run the tuning advisor, a new tab appears on the worksheet with the main results from the tuning session. This tab has four main sections: the statistics the advisor found on the objects in the query, changes to the profile in use, any indexes which need to be added, and finally, if needed, a restructuring tab which may have some SQL to help you restructure your query.


Finally, on the right-hand side, we can see the details of the tuning job. The DBMS_SQLTUNE package generates text, which we organise into the tabs.




We can see from the output above that the statistics are not available and the tool is recommending refreshing statistics on the objects in the original query.

We can then go and analyze the tables to see if that helps.
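Again the screenshot is missing; gathering the statistics and then checking their freshness can be done with something like this (schema and table names as in the demo):

```sql
-- Gather fresh optimizer statistics on the two tables
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(ownname => 'HRDEMO', tabname => 'EMPLOYEES');
  DBMS_STATS.GATHER_TABLE_STATS(ownname => 'HRDEMO', tabname => 'DEPARTMENTS');
END;
/

-- Confirm the stats are current
SELECT table_name, last_analyzed
  FROM user_tables
 WHERE table_name IN ('EMPLOYEES', 'DEPARTMENTS');
```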


We can then check that the stats are fresh and at the time of posting, this is current.



Now, going back to the tuning advisor and running it again shows some different stats.

Here's the final look at what the tuning advisor tells us at the end of the second run. This is the standard text output that comes from the tuning package:

GENERAL INFORMATION SECTION
-------------------------------------------------------------------------------
Tuning Task Name : staName14054
Tuning Task Owner : HRDEMO
Tuning Task ID : 9295
Workload Type : Single SQL Statement
Execution Count : 1
Current Execution : EXEC_9255
Execution Type : TUNE SQL
Scope : COMPREHENSIVE
Time Limit(seconds): 1800
Completion Status : COMPLETED
Started at : 04/19/2012 07:33:50
Completed at : 04/19/2012 07:33:50

-------------------------------------------------------------------------------
Schema Name: HRDEMO
SQL ID : 028hrurkuc6ah
SQL Text : SELECT SUM(E.Salary),
AVG(E.Salary),
COUNT(1),
E.Department_Id
FROM Departments D,
Employees E
GROUP BY E.Department_Id
ORDER BY E.Department_Id

-------------------------------------------------------------------------------
FINDINGS SECTION (1 finding)
-------------------------------------------------------------------------------

1- Restructure SQL finding (see plan 1 in explain plans section)
----------------------------------------------------------------
An expensive cartesian product operation was found at line ID 2 of the
execution plan.

Recommendation
--------------
- Consider removing the disconnected table or view from this statement or
add a join condition which refers to it.

-------------------------------------------------------------------------------
EXPLAIN PLANS SECTION
-------------------------------------------------------------------------------

1- Original
-----------
Plan hash value: 2187233893


---------------------------------------------------------------------------------------
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
---------------------------------------------------------------------------------------
| 0 | SELECT STATEMENT | | 11 | 77 | 35 (3)| 00:00:01 |
| 1 | SORT GROUP BY | | 11 | 77 | 35 (3)| 00:00:01 |
| 2 | MERGE JOIN CARTESIAN | | 2889 | 20223 | 34 (0)| 00:00:01 |
| 3 | TABLE ACCESS FULL | EMPLOYEES | 107 | 749 | 3 (0)| 00:00:01 |
| 4 | BUFFER SORT | | 27 | | 32 (4)| 00:00:01 |
| 5 | INDEX FAST FULL SCAN| DEPT_ID_PKX | 27 | | 0 (0)| 00:00:01 |
---------------------------------------------------------------------------------------

Query Block Name / Object Alias (identified by operation id):
-------------------------------------------------------------

1 - SEL$1
3 - SEL$1 / E@SEL$1
5 - SEL$1 / D@SEL$1

Column Projection Information (identified by operation id):
-----------------------------------------------------------

1 - (#keys=1) "E"."DEPARTMENT_ID"[NUMBER,22], COUNT(*)[22],
COUNT("E"."SALARY")[22], SUM("E"."SALARY")[22]
2 - (#keys=0) "E"."SALARY"[NUMBER,22], "E"."DEPARTMENT_ID"[NUMBER,22]
3 - "E"."SALARY"[NUMBER,22], "E"."DEPARTMENT_ID"[NUMBER,22]
4 - (#keys=0)

-------------------------------------------------------------------------------

For doing this without SQL Developer, there are several things you need to do. I have a little graphic which looks at each of the steps needed to create a tuning job from the normal SQL*Plus interface. The main steps are creating and executing the tuning task, and then interpreting the output.
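As a rough sketch (the task name here is illustrative), the manual equivalent of what SQL Developer wraps boils down to three calls:

```sql
-- 1. Create and run a tuning task for the problem statement
DECLARE
  l_task VARCHAR2(64);
BEGIN
  l_task := DBMS_SQLTUNE.CREATE_TUNING_TASK(
              sql_text   => 'SELECT SUM(e.salary), AVG(e.salary), COUNT(1), e.department_id
                               FROM departments d, employees e
                              GROUP BY e.department_id
                              ORDER BY e.department_id',
              scope      => DBMS_SQLTUNE.SCOPE_COMPREHENSIVE,
              time_limit => 1800,
              task_name  => 'demo_tuning_task');
  DBMS_SQLTUNE.EXECUTE_TUNING_TASK(task_name => 'demo_tuning_task');
END;
/

-- 2. Read the report - the same text SQL Developer splits into tabs
SELECT DBMS_SQLTUNE.REPORT_TUNING_TASK('demo_tuning_task') FROM dual;

-- 3. Clean up when done
BEGIN
  DBMS_SQLTUNE.DROP_TUNING_TASK(task_name => 'demo_tuning_task');
END;
/
```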
Finally, this functionality is part of the SQL Worksheet in SQL Developer, which - together with trace file editing, explain plan and autotrace, the hierarchical profiler, and session monitoring - adds more tools to the toolbox for finding issues with your code.

Using Hierarchical Profiler in SQL Developer

Barry McGillin - Wed, 2012-04-18 08:30
One of the features exposed since SQL Developer 1.5 is the hierarchical profiler. There have been several blogs about this, but none, I think, that really get into the detail of what you are seeing and how to do it.

The hierarchical profiler allows you to see what happens when your piece of PL/SQL is running.  More specifically, it allows you to see where it is spending most of its time, which means you can concentrate on hammering that down, rather than wondering where to start.

For today, I'm using a really basic reference table with a few rows in it to allow us to do something.  I have also created two procedures, one of which calls the other so we have some nested dependencies.

drop table hier_demo;
create table hier_demo (id number, name varchar2(200));
insert into hier_demo values (1, 'Barry');
insert into hier_demo values (2, 'Lisa');
insert into hier_demo values (3, 'Rebecca');
insert into hier_demo values (4, 'Katie-Ellen');

CREATE OR REPLACE PROCEDURE PRINTER(
  NAME IN VARCHAR2 )
AS
BEGIN
  dbms_output.put_line(NAME);
END PRINTER;
/
CREATE OR REPLACE PROCEDURE SHOW_PEEPS
AS
  CURSOR hiercur IS
    SELECT * FROM hier_demo;
  -- hierrec hiercur%type;
  -- type hiertab is table of hierrec%TYPE;
BEGIN
  FOR myrec IN hiercur
  LOOP
    dbms_output.put_line(myrec.name);
  END LOOP;
END;
/

Running the main procedure normally gives us some nice and simple output.


When we click on the profile button in the plsql editor, SQL Developer will check that you have the proper permissions and the associated table to use the profiler.

When you hit the profiler button, it first comes up with the run dialog to set the parameters for the stored procedure. Hitting OK on this dialog will run the stored procedure, and any issues it has will also pop up while you are profiling.


As this happens, the profiler first checks to see if there is a profiler log directory, and if there is not one, it will prompt you to create one and grant the appropriate permissions.






Hitting OK on this makes the tool set up the directory for the profiler. To do this, it needs to run some SQL as SYS.


If the user agrees with all this, they are prompted for the SYS password to actually create the tables for the profiler statistics in the local user, in this case HRDEMO. Finally, when they agree, the tool asks if it can set up a local set of tables for the profiler. We'll agree to this too, and make sure the profile is captured.
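Under the covers this is the DBMS_HPROF package; a manual run looks roughly like this (the directory object and file names are illustrative, and the DBMSHP_* tables are the ones SQL Developer just offered to create):

```sql
-- Profile one run of the procedure, writing the raw trace to a directory object
BEGIN
  DBMS_HPROF.START_PROFILING(location => 'PROFILER_DIR', filename => 'show_peeps.trc');
  SHOW_PEEPS;
  DBMS_HPROF.STOP_PROFILING;
END;
/

-- Load the trace into the DBMSHP_* tables; the returned run id keys the results
DECLARE
  l_runid NUMBER;
BEGIN
  l_runid := DBMS_HPROF.ANALYZE(location => 'PROFILER_DIR', filename => 'show_peeps.trc');
  DBMS_OUTPUT.PUT_LINE('runid = ' || l_runid);
END;
/
```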

Now, when we look at the profile tab of the PL/SQL editor, we should have a new editor with the results of the profile.


This shows us a breakdown of how the procedure actually executed, all the way down to the actual fetch which returned the rows. A very slight change to the procedure - in this case adding a call to the PRINTER procedure described above as a dependency - shows the nesting in the profile going further down.
create or replace PROCEDURE SHOW_PEEPS
AS
  CURSOR hiercur IS
    SELECT * FROM hier_demo;
  -- hierrec hiercur%type;
  -- type hiertab is table of hierrec%TYPE;
BEGIN
  FOR myrec IN hiercur
  LOOP
    PRINTER(myrec.name);
  END LOOP;
END;
/

This now shows us that we have another profile in the set, and clicking on it gives us the hierarchy of calls in the stored procedure's execution. The main point here is that we can now see the further level of indirection through the PRINTER procedure.


So that's all on the profiler for now. If you want to see how to do this with your own tables, the easiest thing to do is to download the Oracle Developer Days VM from OTN. This particular blog will make an appearance as part of a bigger set later, in which we will discuss tuning in general.

InteliVideo

Bradley Brown - Mon, 2012-04-16 23:50
Over the last few weeks, as I was working on my "next big thing," I started reflecting on the good, bad and ugly of prior start-ups and companies I've been involved with.  I started writing about building a start-up from the ground up - topics from raising money, pivoting, turds on the table, giving up control, keeping the day job and several more posts.

After being in the services business for over 20 years, I've thought for years about making money while I'm sleeping.  When I created my first Android app, I was so excited when I went from making $1 a day to $5 to $10 to $100 and beyond.  I loved how Google (and Apple) had created a marketplace that allowed you to write an application, put it into a market, decide on your pricing, and collect 70% of whatever price you set.  I had fun experimenting with pricing, features, etc.  I loved hearing the feedback and ideas from my customers.  I was making money while I slept!

At TUSC we provided Oracle consulting services, but we also provided training and education. Early on I did a lot of training for our customers. In the mid-90s I taught intro (2 days) and advanced (3 days) courses on SQR (a report writer for Oracle) nearly 2 months in a row. I've done 100s of presentations at Oracle (and other) user groups. I've written 100s of articles, for everything from user group newsletters to Oracle Magazine. I've consulted for 100s of companies over the years. But none of these helped me make money while I slept - I had to be there, usually in person. Then came the web, and I started doing webinars. I recorded many of those webinars over the years and we published them on our website. But I didn't monetize those webinars.

After many, many hours of dreaming and working on the concepts, I figured out my next big thing. I decided to combine the concepts of the Android marketplace and webinars - to help people monetize their training and how-to videos online. In other words, if you create a webinar and you want to sell it, you can put it on the InteliVideo site, indicate how much you want to charge, and we do the rest. We sell the video, manage distribution to any device (mobile or browser), collect the money, deliver the content, and you get 70% of the income. It's a utility model. No up-front costs to you. No storage costs. No costs per byte. No credit card fees. Simply upload a series of videos (i.e. a grouping you want to sell), tell us how much to charge (i.e. for a 1-day rental it's $20, for 3 days it's $50, etc.) and sit back and wait for your monthly check to arrive.

When I first came up with the concept, I thought maybe I'd go out and develop a bunch of video content.  So I brainstormed about what I would develop first.  Oracle content?  I know people who know a lot about fishing, duck hunting, snowmobiling, wake surfing, home building, home fix up, and so many more topics...so I thought maybe I'd start there.  Then I realized that we didn't need more content, the world simply needs a video marketplace!

I'm also combining in something that was my first paid Internet project ever: developing a testing engine. When I first learned Oracle Application Express (which I later co-authored a book on), I asked Larry (my co-author) to develop the testing application in APEX, which he did. I'm going to combine this functionality into the video side of the business as well. There will be fully event-driven actions that you'll be able to take based on the answers. For example, you can pause the video 3 minutes and 12 seconds in and ask a question. If the viewer answers right, you can continue. If they answer wrong, you can take whatever action you want - i.e. you might force them to start over, or you might show them the answer in a small video. Then at the end of the course, you can decide what to do based on their answers. For example, as part of continuing education, you might give them a certificate. If it was a Myers-Briggs test you were guiding them through, you might score them as an ESTJ and then up-sell them on videos about relationships for ESTJs.

What do I need from you? Well, before long everything will be self-service and I'll need your content. So if you don't have content, please start working on it now. If you already have content, I'd like you to be a beta customer! I'll upload your videos, configure the site, develop the questions with you, etc. You can either be part of the generic InteliVideo site or have a white-labeled site with your videos. If you have a white-labeled site, you can choose to include links for other partners (for which you'll eventually get a sales commission if your site sells them) or to exclude certain companies (i.e. your competitors).

So tell me: what would you like to see on the site? There is a B2B side of this (i.e. my customers) and a B2C side (i.e. selling videos to consumers). What do you think I should tell each side?

You can view the SUPER-alpha version of my "marketing" site, which I still need to get all of the links working on... and you can view the generic and white-labeled versions of the site.

InteliVideo Marketing Site                                     InteliVideo Videos

By the way, please try out the social side of the site - i.e. enter a Facebook comment! And you can vote on the logo selection if you click on the link soon.

