Development

The closer you look....

FeuerThoughts - Mon, 2014-06-16 09:45
In the last couple of years, I have shifted my attention away from the human condition (wars here and there, cool new gadgets, etc.) to the non-human condition: the natural world of trees, water, creatures large and small, the process of evolution.

Along the way, I have been reminded that what you pay the most attention to is what your brain spends the most time thinking about (at least the parts of my brain that "I" am "conscious" of). So I need to be careful about what I pay attention to (one reason that I have stopped watching television almost completely). 

And spending ten plus hours a week outdoors, in the woods, cutting back invasives and rescuing trees, has reinforced this to me:

With living things, the more I watch and the more closely I watch (and smell and taste), the more amazed I am by the wonders of life. And the more alive I feel.

With manufactured things, it is just the opposite.

The more closely I look at something made by humans, the more sterile, dead and energy-sucking it appears. And the more I watch (or smell or taste), the more deadened I feel.

Perhaps this is not such a big surprise, since everything that humans make is dead, and built upon the deaths of many creatures. Sorry if that sounds like such a downer, but I believe it is simply a statement of fact.

Anyway, no need to feel down. Just go outside, into the trees, into a field, away from things we make, take a deep breath, feel the sun on your face....and you will feel much better.
Categories: Development

Telling Tales

Greg Pavlik - Sun, 2014-06-08 18:50
After struggling to find time for many months, I was finally able to sit down and watch without interruption Benjamin Bagby's Beowulf performance - an adaptation that relies on Bagby's voice and a reconstruction of a 6th-century, six-tone Anglo-Saxon harp. The performance is engrossing and provokes a strong imaginative response, one that would have been communally experienced. Of course, the only way to revive a sense of communal experience in the case of Bagby is to see him perform live - however, given the performance is entirely in Old English and as such mostly unintelligible without subtitles, I think a digital adaptation may be a necessary tradeoff. In many ways, Bagby's Beowulf is a reminder of how impoverished our notion of entertainment is - ephemeral, base, isolating and essentially throwaway as a rule.

By the way, it's not entirely the case that the English are unable to create something of the same texture today - several times during the performance I thought of Judith Weir's one-person, unaccompanied opera King Harald's Saga. Weir's work is much shorter, principally a musical composition and less poetically rich, so it is difficult to compare the two directly: Beowulf remains the province of a balladeer first and foremost, and this is a genre that more and more feels lost to our world - poetry today rarely seems to be meant to be read aloud and even more rarely follows epic formats. This is a lost social phenomenon, for which we are impoverished: in fact, the last long work of a balladeer I read was the Ethiopian Enzira Sebhat, itself a medieval work dedicated to the Virgin Mary. The closest - though only indirectly comparable - works to the Enzira Sebhat that I am aware of currently being composed are the akathistos hymns of the Russian Orthodox tradition. And while many of those recent compositions are less-than-accomplished literary works, they unquestionably represent a rich and living and at times very beautiful means of transmission of communal memory and values. While I am not aware of any recent akathistos compositions that have the expressive beauty and originality of the Byzantine hymnographer Romanos the Melodist, the modern akathist has sometimes proven a source of inspiration for exceptionally great art: the late Sir John Tavener's setting of the "thanksgiving akathist" being perhaps the most significant case in point.

Memories of the way we were...

Greg Pavlik - Sat, 2014-05-31 16:13
The fascinating thing about Hadoop is the obviousness of its evolutionary needs. For example, MapReduce coupled with reliable scale-out storage had a powerful - even revolutionary - effect for organizations with both large volumes of data and multi-structured data. Out of the gate, Hadoop unlocked data "applications" that were for all intents and purposes unimplementable. At the same time, it didn't take much imagination to see that separating the compute model from resource management would be essential for future applications that did not fit well with MapReduce itself. It took a lot of work and care to get YARN defined, implemented and hardened, but the need for YARN itself was fairly obvious. Now it is here and Hadoop is no longer about "batch" data processing.

Note, however, it takes a lot of work to make the evolutionary changes available. In some cases, bolt-on solutions have emerged to fill the gap. For key-value data management, HBase is a perfect example. Several years ago, Eric Baldeschwieler was pointing out that HDFS could have filled that role. I think he was right, but the time it would take to get "HBase-type" functionality implemented via HDFS would have been a very long path indeed. In that case, the community filled the gap with HBase, and it is being "back integrated" into Hadoop via YARN in a way that will make for a happier co-existence.

Right now we are seeing multiple new bolt-on attempts to add functionality to Hadoop. For example, there are projects to add MPP databases on top of Hadoop itself. It's pretty obvious that this is at best a stopgap again - and one that comes at a pretty high price. I don't know of anyone who seriously thinks that a bolt-on MPP is ultimately the right model for the Hadoop ecosystem. Since the open source alternatives look to be several years away from being "production ready", that raises an interesting question: is Hadoop evolution moving ahead at a similar or even more rapid rate to provide a native solution - a solution that will be more scalable, more adaptive and more open to a wider range of use cases and applications - including alternative declarative languages and compute models?

I think the answer is yes: while SQL on Hadoop via Hive is really the only open source game in town for production use cases - and it's gotten some amazing performance gains in the first major iteration on Tez that we'll talk more about in the coming days - it's clear that the Apache communities are beginning to deliver a new series of building blocks for data management at scale and speed: Optiq's cost-based optimizer; Tez for structuring multi-node operator execution; ORC and vectorization for optimal storage and compute; HCat for DDL. But what's missing? Memory management. And man has it ever been missing - that should have been obvious as well (and it was - one reason that so many people are interested in Spark for efficient algorithm development).
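As a rough illustration of how those building blocks surface to an application, here is a minimal sketch over the standard Hive JDBC driver that switches a session to Tez execution with vectorization and lands data in an ORC-backed table. The HiveServer2 endpoint, credentials and the web_logs table name are my assumptions, not anything from the posts referenced here.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveOrcTezSketch {
    public static void main(String[] args) throws Exception {
        // Standard Hive JDBC driver; host, port and credentials are placeholders.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "hive", "");
        Statement stmt = conn.createStatement();

        // Run this session on Tez with vectorized execution enabled.
        stmt.execute("SET hive.execution.engine=tez");
        stmt.execute("SET hive.vectorized.execution.enabled=true");

        // ORC storage so the vectorized readers have something to work with.
        stmt.execute("CREATE TABLE IF NOT EXISTS web_logs_orc "
                + "(ip STRING, ts STRING, url STRING) STORED AS ORC");
        stmt.execute("INSERT INTO TABLE web_logs_orc "
                + "SELECT ip, ts, url FROM web_logs");

        ResultSet rs = stmt.executeQuery(
                "SELECT url, COUNT(*) FROM web_logs_orc GROUP BY url");
        while (rs.next()) {
            System.out.println(rs.getString(1) + " -> " + rs.getLong(2));
        }
        conn.close();
    }
}
```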

What we've seen so far has been two extremes when it comes to supporting memory management (especially for SQL) - all disk and all memory. An obvious point here is that neither is ultimately right for Hadoop. This is a long-winded intro to point to two interrelated pieces by Julian Hyde and Sanjay Radia unveiling a model that is being introduced across multiple components, called Discardable In-memory Materialized Query (DIMMQ). Once you see this model, it becomes obvious that the future of Hadoop for SQL - and not just SQL - is being implemented in real time. Check out both blog posts:

http://hortonworks.com/blog/dmmq/

http://hortonworks.com/blog/ddm/


Do animals have souls?

FeuerThoughts - Thu, 2014-05-29 09:04
OK, first of all, don't tell me your answer to this question. That would make the rest of this post seem a bit rude.

Here is one of the dumbest questions I can ever imagine a person asking, much less answering:

Do animals have souls?

How utterly ridiculous.

No one knows what a soul is. No one knows what it looks like, what it means, whether or not it really exists.

Furthermore, we certainly have no idea - please allow me to repeat that because I think it is so fundamental to accept this as fact: we have no idea at all - of what is going on inside an animal’s head. Clearly, a whole lot is going on, if you take the time to pay attention to animals and think about what it takes to do what they do. But many of the things humans blithely state as fact regarding animals, such as “They don’t know the difference between right and wrong,” are fundamentally meaningless, because we simply cannot know what is going on inside another creature’s mind. We just make the assumption that they are really super different from us in all the ways that matter - to us.

We are intelligent, moral, sentient. We are smart and they are dumb, brute animals. We are conscious, we have history, philosophy, nuclear power. What do animals have? Nothing!

Oh really? How do we know what animals have? Or even what “have” means to a butterfly or a snake or a black bear? Again, we really have no idea whatsoever what animals have, what they want, or how they would feel about killing others just to make themselves comfortable (something that we humans do every second of every day).

So we make the most self-serving assumption imaginable. We simply outright declare that other creatures have no souls, are not sentient. They are food or threat or benign, but they are not like us.

We will continue to reject the evidence of our senses, the clear demonstrations of sentience, of complex social structures, in other animals. That way we don’t have to feel bad about enslaving them and killing them. Think for just a moment about how smart pigs are, and then think about pig farms in which tens of thousands of these poor creatures live short miserable lives - brought into this world for the very purpose of slaughtering them for bacon. And then later a dam bursts and an entire town is swamped with pig feces from the refuse lake at the farm. Go, humans, go!

I sure am glad there wasn’t and isn’t a species of creature on this planet that's three times our size, extremely powerful and licking its lips at the prospect of a nicely smoked human torso. 

We do not know what goes on inside a pig’s head, but it sure seems like they can feel and express terror. 

So, yes, humans will keep on keeping on, keep on consuming, reproducing, and assuming. But that doesn't mean we can’t try to recover a shred, a mere shred, of our individual dignity by at least acknowledging what we are doing, and taking at least one step, no matter how small, to help heal our planet and our co-inhabitants.

We can start by acknowledging, accepting, that the thing that we believe makes us unique and special among all living things is simply an unknowable assumption we make. It is an arbitrary, self-serving action - and brings into question the very idea that humans can be considered moral creatures. 
Categories: Development

Configure Coherence HotCache

Edwin Biemond - Tue, 2014-02-04 22:29
Coherence can really accelerate and improve your application because it's fast, highly available, easy to set up and scalable. And when you use it together with the JCache framework of Java 8 or the new Coherence Adapter in Oracle SOA Suite and OSB 12c, it becomes even easier to use Coherence as your main HA cache.  Before Coherence 12.1.2, when you want to use Coherence together with
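To give a taste of the JCache angle the post points at, here is a minimal JSR-107 sketch; the cache name and the assumption that a Coherence JCache provider is on the classpath are mine, not the post's.

```java
import javax.cache.Cache;
import javax.cache.CacheManager;
import javax.cache.Caching;
import javax.cache.configuration.MutableConfiguration;
import javax.cache.spi.CachingProvider;

public class JCacheSketch {
    public static void main(String[] args) {
        // Picks up whatever JCache provider is on the classpath
        // (assumed here to be the Coherence JCache implementation).
        CachingProvider provider = Caching.getCachingProvider();
        CacheManager manager = provider.getCacheManager();

        // Typed, store-by-value cache configuration.
        MutableConfiguration<String, String> config =
                new MutableConfiguration<String, String>()
                        .setTypes(String.class, String.class)
                        .setStoreByValue(true);

        Cache<String, String> orders = manager.createCache("orders", config);
        orders.put("order-1", "10 widgets");
        System.out.println(orders.get("order-1"));

        manager.close();
    }
}
```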

REST, SSE or WebSockets on WebLogic 10.3.6

Edwin Biemond - Wed, 2014-01-15 14:10
WebLogic 10.3.6 comes with Jersey 1.9 and has no support for Server-Sent Events or WebSockets. But for one of our projects we are making an HTML5 / AngularJS application, which needs to invoke some RESTful services, and we also want to make use of SSE or WebSockets. Of course we could use WebLogic 12.1.2, but we already have an OSB / SOA Suite WebLogic 10.3.6 environment. So when you want to pimp your
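For context, the kind of RESTful endpoint the AngularJS front end would call looks roughly like this with the standard JAX-RS annotations supported by the bundled Jersey 1.9; the resource path and payload here are made up for illustration.

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

// Plain JAX-RS resource, deployable on the Jersey 1.9 that ships with WebLogic 10.3.6.
@Path("/status")
public class StatusResource {

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public String getStatus() {
        // Hand-built JSON to keep the sketch dependency-free.
        return "{\"status\":\"up\"}";
    }
}
```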

new Puppet 3 Weblogic provisioning module

Edwin Biemond - Sun, 2013-11-24 14:22
The last few weeks I was busy rewriting my Puppet WLS module so it fully supports the power of Puppet 3 (thanks for more than 4000 downloads on Puppet Forge and all the GitHub downloads). With Puppet 3 we can now use Hiera, iterations and lambda expressions. This does not sound like a big change, but with Hiera and the new Puppet language features I can define big WebLogic domains without

Creating your own Virtualbox Development Images

Edwin Biemond - Sat, 2013-11-16 15:11
For my Oracle Puppet provisioning development I can't do without these image creation tools: Packer and Vagrant, in combination with Oracle VirtualBox or VMware.  In this blogpost I will explain what these tools can do for you and how you can make your own images and use Puppet as the provisioning tool. With Vagrant you can create your own virtual images and it can start Puppet or Chef to do all the

The road ahead for WebLogic 12c

Edwin Biemond - Sat, 2013-10-19 13:07
Before we can describe all the new features of WebLogic 12.1.3 & 12.1.4 and compare them to the 12.1.2 version, we should first take a look at the 10.3.6 version. WebLogic 10.3.6 is still the latest 11g version, but Oracle will support 10.3.6 till 2018, with extended support till 2021. So Oracle’s Fusion Apps, and the rest of us, have enough time to migrate to WebLogic 12.1.X. Oracle also promised that the upgrade

Custom Jersey WADL generation

Edwin Biemond - Thu, 2013-08-22 14:46
I had a situation where the auto-generated WADL did not match my REST services. The first difference was that the response was presented as an object instead of a collection of objects, and the second was that it could not handle JSONWithPadding as a response.  Because I use this WADL in my REST client generation, I needed to fix these issues. Lucky for me, Jersey JAX-RS allows us to provide
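To give an idea of the hook Jersey 1.x offers, a custom WADL is typically wired in through a WadlGeneratorConfig subclass along these lines; the documentation file name is a placeholder and this is my sketch, not the post's code.

```java
import java.util.List;

import com.sun.jersey.api.wadl.config.WadlGeneratorConfig;
import com.sun.jersey.api.wadl.config.WadlGeneratorDescription;
import com.sun.jersey.server.wadl.generators.WadlGeneratorApplicationDoc;

// Registered with the Jersey servlet via the
// com.sun.jersey.config.property.WadlGeneratorConfig init-param in web.xml.
public class MyWadlGeneratorConfig extends WadlGeneratorConfig {

    @Override
    public List<WadlGeneratorDescription> configure() {
        // "application-doc.xml" is a placeholder for an extra documentation resource
        // that enriches the generated WADL.
        return generator(WadlGeneratorApplicationDoc.class)
                .prop("applicationDocsStream", "application-doc.xml")
                .descriptions();
    }
}
```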

Coherence 12.1.2 Rest application build with OEPE

Edwin Biemond - Sun, 2013-08-11 16:20
With WebLogic 12.1.2 Oracle also released a new version of Coherence and OEPE. The 12.1.2 release contains many new Coherence features like WebLogic Managed Coherence Servers and the Coherence Grid Archive (GAR), which can be included in a normal EAR. Coherence also has some nice new REST features like direct & named queries, custom query engines and new security options. Plus with OEPE you can

JAX-WS SOAP over JMS

Edwin Biemond - Thu, 2013-08-01 07:39
With WebLogic 12.1.2 Oracle now also supports JAX-WS SOAP over JMS. Before 12.1.2 we had to use JAX-RPC, without any JDeveloper support, and we needed to use ANT to generate all the web service code. See this blogpost for all the details. In this blogpost I will show you all the necessary JDeveloper steps to create a SOAP over JMS JAX-WS web service (bottom-up approach) and generate a web
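The bottom-up starting point is an ordinary JAX-WS service class like the sketch below; the JMS transport itself is layered on by WebLogic-specific metadata that the post walks through in JDeveloper. The class, operation and comments here are my illustration, not the post's code.

```java
import javax.jws.WebMethod;
import javax.jws.WebService;

// Plain bottom-up JAX-WS service class. In WebLogic 12.1.2 the SOAP-over-JMS
// transport is switched on by WebLogic-specific configuration generated on top
// of this class (I am assuming a JMS transport annotation from the
// com.oracle.webservices.api.jms package -- check what JDeveloper actually generates).
@WebService(serviceName = "OrderService")
public class OrderService {

    @WebMethod
    public String placeOrder(String item, int quantity) {
        return "Ordered " + quantity + " x " + item;
    }
}
```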

Maven support in WebLogic & JDeveloper 12.1.2

Edwin Biemond - Fri, 2013-07-26 09:32
In the 12.1.2 release of JDeveloper and WebLogic, Oracle really improved the support for Maven as a build and provisioning tool. Oracle did this on multiple levels: a utility to synchronize all the Oracle Middleware jars to a local (.m2/repository) or a shared repository like Nexus or Artifactory; an ojmake Maven plugin for just building JDeveloper projects; an updated WebLogic plugin for

JDeveloper 12.1.2 EJB & Java Service Facade Datacontrol

Edwin Biemond - Thu, 2013-07-25 07:13
With JDeveloper 12c (12.1.2) Oracle added some great updates to the ADF EJB / Java Facade Datacontrol. Oracle already added the Java Facade and Named Criteria support in previous versions of JDeveloper, but now also added the much-wanted List of Values feature, just like we have in ADF BC.  In the next part I will show you all the new features and some LOV bugs (not everything is

Custom OSB Reporting Provider

Edwin Biemond - Tue, 2013-06-11 15:53
With the OSB Report Action we can add some tracing and logging to an OSB Proxy. This works OK, especially when you add some report keys for single-Proxy projects, but when you have projects with many Proxies that invoke other JMS or local Proxies, the default reporting tables (WLI_QS_REPORT_DATA, WLI_QS_REPORT_ATTRIBUTE) in the SOA Suite soainfra schema are not so handy. I want to

Build and Deploy OSB projects with Maven

Edwin Biemond - Tue, 2013-04-30 12:31
Two years ago I already did the same with ANT, and now I have migrated these scripts to Maven. These Maven poms can still do the same as my ANT scripts: build and deploy an OSB OEPE workspace; build one OSB project; export OSB projects from an OSB server and generate a customization plan. Here you can find my code https://github.com/biemond/soa_tools/tree/master/maven_osb_ps5 or the PS6 version https:

Offline Oracle Service Bus Configuration export

Edwin Biemond - Tue, 2013-04-30 12:24
With Oracle Service Bus PS6 or 11.1.1.7 we finally have a new offline build tool which does not require Eclipse (OEPE).  With this OSB configjar tool (located in the OSB home /tools/configjar/ folder) you can make an OSB export sbconfig jar based on one or more OSB projects, or even more than one OSB Configuration project, plus have total control over what to include or exclude. In this blogpost I

Token Configurations in Oracle SOA Suite PS6 ( 11.1.1.7.0)

Edwin Biemond - Sat, 2013-04-20 13:11
Oracle SOA Suite PatchSet 6 or 11.1.1.7.0 now has support for token configurations, which can really help administrators in configuring or overriding external Web Service Reference parameters like the protocol (http, oramds or https), the remote host etc. More importantly, if you do it right you can kick out all those SOA Suite deployment plans :-) Important to know: this only works on

Testing Activiti BPM on WebLogic 12c

Edwin Biemond - Fri, 2013-03-29 15:33
Activiti is a great open source workflow + BPM platform, which you can use in your own Java application (embedded) or test in the provided REST or web demo applications. Activiti also provides an Eclipse designer plugin which you can use to create your own BPMN 2.0 definitions and export them to the Activiti applications. In this blogpost I will show you the steps to get this working on the
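For a sense of the embedded usage mentioned above, a minimal Activiti 5 sketch looks like the following; the process key and BPMN file name are placeholders of mine, and a real setup would point the engine at a proper database rather than an in-memory one.

```java
import java.util.List;

import org.activiti.engine.ProcessEngine;
import org.activiti.engine.ProcessEngineConfiguration;
import org.activiti.engine.RuntimeService;
import org.activiti.engine.TaskService;
import org.activiti.engine.runtime.ProcessInstance;
import org.activiti.engine.task.Task;

public class ActivitiEmbeddedSketch {
    public static void main(String[] args) {
        // In-memory engine, fine for a local test.
        ProcessEngine engine = ProcessEngineConfiguration
                .createStandaloneInMemProcessEngineConfiguration()
                .buildProcessEngine();

        // Deploy a BPMN 2.0 definition from the classpath (placeholder file name).
        engine.getRepositoryService()
                .createDeployment()
                .addClasspathResource("orderProcess.bpmn20.xml")
                .deploy();

        // Start an instance and list its open user tasks.
        RuntimeService runtimeService = engine.getRuntimeService();
        ProcessInstance instance = runtimeService.startProcessInstanceByKey("orderProcess");

        TaskService taskService = engine.getTaskService();
        List<Task> tasks = taskService.createTaskQuery()
                .processInstanceId(instance.getId())
                .list();
        System.out.println("Open tasks: " + tasks.size());

        engine.close();
    }
}
```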
