Vgo Software

Entries in adf (5)


The importance of naming your cookies

Cookies are old hat for many of us and in use in all types of applications.  What you might not realize is that many Java frameworks, including ADF, will by default use the same cookie name for session identification.  This is all well and good if only one such application is using a domain, but if multiple ADF applications share the same domain, they can clobber each other's sessions.

One client encountered this when users started reporting that their application sessions were timing out after what seemed like very short periods of time.  It turned out that two ADF applications were using the default name for their session cookies (JSESSIONID).

Thankfully, in WebLogic, this is easy to fix.  In JDeveloper, open the weblogic.xml file in your ViewController project and click on the Session section on the left-hand side.  Open the Cookies panel, provide a Cookie Name, save the file, and you are all set.

The source would look like this:
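Something along these lines, with a hypothetical cookie name (any value unique to the application will do):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<weblogic-web-app xmlns="http://xmlns.oracle.com/weblogic/weblogic-web-app">
  <!-- MYAPP_SESSIONID is a placeholder; pick a name unique to this app -->
  <session-descriptor>
    <cookie-name>MYAPP_SESSIONID</cookie-name>
  </session-descriptor>
</weblogic-web-app>
```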
That's all there is to it.

BTW, we encountered this again when deploying a couple of older ADF applications in a clustered environment.  When we used the Apache WebLogic plugin to do load balancing, we ran into some strange issues.  It turns out that unless you supply a cookie name for the plugin, it also uses JSESSIONID, and one of the older applications that hadn't been updated with a cookie name was overwriting it, causing our session to be lost; the browser could no longer reach the application and kept receiving timeout notices.  Adding a cookie name to the plugin configuration fixed the problem.
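On the plugin side, the fix is a sketch like this in the Apache httpd configuration (the context path and server addresses are hypothetical; the WLCookieName value should match the cookie name configured in the application's weblogic.xml):

```apache
<Location /myapp>
  SetHandler weblogic-handler
  WebLogicCluster server1:7001,server2:7001
  # Without this, the plugin defaults to JSESSIONID
  WLCookieName MYAPP_SESSIONID
</Location>
```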

Lesson learned: never underestimate the value of naming your cookies!



Oracle's Java as a Service

I've been in San Francisco at Oracle Open World 2013 for the past few days.  It's been a great conference so far with some interesting sessions, some America's Cup racing, and lots of exhibitors to see.

One of the most interesting items announced by Oracle at Open World this year was their new Cloud offerings.  They've been talking a lot about the cloud for the past couple of years, but it seems this year they are finally releasing some interesting cloud-based products.

Not only do they now have an Infrastructure as a Service offering, much like Amazon's EC2 or Microsoft Azure, but they also have a Database as a Service offering and a Java as a Service offering.

This being a Java-focused blog, the Java as a Service product is what I found most interesting, and I was actually able to attend a hands-on lab that demonstrated the service.  Usage is very simple: it basically exposes a managed WebLogic Server for you to deploy to.  If you want fine-grained control over the environment, this isn't for you, as the only control you have is what you can accomplish via the WebLogic Management Console and the interface to the JaaS itself.

As far as quickly getting an application up there and running, however, it couldn't be simpler.  The lab demonstrated three different ways to do it: one with an IDE plugin (we used Eclipse; I believe there are others), one via the command line, and one via Maven.  You can also deploy and undeploy using the WebLogic console itself.

You will most likely need more than a JaaS instance to do anything worthwhile; for instance, you will probably want a database for your application, and you will need to use their DBaaS for that.  But it does let you easily deploy and run Java applications (including ADF applications!) in the cloud.

Check it out here!


ADF View Objects with Dynamic Where Clauses

While working on a recent modernization project, I came across a scenario in which a View Object used by a page needed a different where clause depending on one of the values passed in.  All in all, not too big a deal, but because the where clause used a number of parameters, and which parameters were used changed along with the where clause, it introduced some complications.  To save you the trouble, I thought I'd write about a couple of them here.

First of all, it is easy enough to create a method in an Application Module implementation class that adjusts the where clause to what you need using the ViewObject.setWhereClause(String whereClause) method.  To set the parameters, you use the ViewObject.setNamedWhereClauseParam(String paramName, Object paramValue) method.  What I found was that the parameters had to be defined as Bind Variables in the View Object with their Required property set to true; otherwise they are not really bind variables.

What that means is that to execute the query, it must contain all of the bind variables even if they are not being used, which means putting them in the where clause all the time.  To accomplish this for bind variables that were not in use, I just appended something like "and :par1 is null" and set the value to null using the method above.
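As a sketch, the Application Module method looks something like this (the method name, View Object name, and bind variable names are all hypothetical, and the real scenario involved more parameters):

```java
// Hypothetical Application Module method; "MyView", par1, and par2
// are placeholder names, not from the original project.
public void applyDynamicFilter(String mode, String value) {
    ViewObjectImpl vo = (ViewObjectImpl) findViewObject("MyView");
    if ("BY_NAME".equals(mode)) {
        // Every bind variable appears in the clause, even the unused one;
        // otherwise execution fails because a required bind is missing.
        vo.setWhereClause("NAME = :par1 and :par2 is null");
        vo.setNamedWhereClauseParam("par1", value);
        vo.setNamedWhereClauseParam("par2", null);
    } else {
        vo.setWhereClause("CODE = :par2 and :par1 is null");
        vo.setNamedWhereClauseParam("par1", null);
        vo.setNamedWhereClauseParam("par2", value);
    }
    vo.executeQuery();
}
```

Note the ":par2 is null" predicate: with the value set to null it evaluates to true and the unused bind variable has no effect on the result.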

This works fine, except that since the bind variables are not in the original query definition, another problem appeared.  As soon as the task flow opened, the first thing it was supposed to do was run the method I defined in the Application Module; unfortunately, that did not happen.  Instead, the View Object executed first and returned an error because the bind variables were not in the query.

Apparently, in order for my method to work, I needed a page definition file that allowed my method to access my View Object's iterator.  Because this iterator was in the page definition file, ADF executed the query up front even though nothing that used it had been called yet.  To fix this, I set the Refresh property of the iterator in the page definition to "never".  This means the iterator will not execute unless refresh() or execute() is called programmatically on the View Object, which of course happens in the Application Module method once the where clause has been defined with the parameters in it.
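In the page definition source, the iterator entry would look something like this (the id, Binds, and DataControl values are hypothetical):

```xml
<!-- Refresh="never" keeps ADF from executing the query up front -->
<iterator id="MyViewIterator" Binds="MyView" RangeSize="25"
          DataControl="AppModuleDataControl" Refresh="never"/>
```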

That's it, simple enough once you realize what is happening, but a little frustrating before that light bulb goes on!


ADF Release 2 - Problem Clearing Cache Resolved

I found this exception after upgrading to Release 2 of ADF with an application that was previously working in Release 1.

This application is based on a previously existing Forms application and, as such, depends upon a lot of stored procedures and a lot of database triggers.  We had a situation where, when a new record was created, triggers in the database created a bunch of rows in other tables, and those records then needed to be displayed on an edit screen for the newly created object.  Once the Entities and ADF View Objects were all set up, everything worked smoothly as long as we could get the newly created records from the DB.  In order for that to work, the application was calling this.getDBTransaction().clearEntityCache(null) in the Application Module.

The clearEntityCache(String arg) method is supposed to clear the cache for the entity with the given name, or all of the cache if the argument passed in is null.  In Release 2 of ADF this was causing an exception to be thrown: a NullPointerException in the sendOrQueueEventForSharedTrans method of DBTransactionImpl.

The fix that was implemented was to call this.getDBTransaction().setClearCacheOnCommit(true); before the commit was performed and remove the clearEntityCache(null) call that came after the commit.
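In code, the change amounts to something like this (a sketch; the surrounding commit logic is simplified from what the real Application Module did):

```java
// Old approach: worked in Release 1, throws a NullPointerException
// from DBTransactionImpl.sendOrQueueEventForSharedTrans in Release 2
// getDBTransaction().commit();
// getDBTransaction().clearEntityCache(null);

// New approach: have the transaction clear the entity cache as part
// of the commit, so the trigger-created rows are re-fetched afterward
DBTransaction txn = getDBTransaction();
txn.setClearCacheOnCommit(true);
txn.commit();
```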


Another JBO-25030 Problem

Chris Muir has a good article explaining a typical "JBO-25030: Failed to find or invalidate owning entity" problem.  Recently, however, I came across another, atypical reason for the same error.

In this particular instance, while converting the logic from an existing form to an ADF application, there was a master-detail relationship on the page.  When the user created a new master record and then attempted to create a new detail record, the JBO-25030 exception would be thrown.  What I found when investigating was that if the user saved the master record first and then added a detail record, everything worked as expected.

Investigating further revealed that the primary key on the master record was not updateable by the user; it was generated by a trigger on the database table.  Since the primary key could not be validated when the child record was to be created, the exception was thrown.

This type of issue is easily overcome by putting the logic to get the new primary key (in this case from a sequence) in the ADF code and removing the table's trigger.
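A sketch of that approach, overriding create() in the master record's Entity implementation class (the entity attribute and sequence names are hypothetical, not from the original project):

```java
// Hypothetical EntityImpl override; MASTER_SEQ and MasterId are placeholders.
@Override
protected void create(AttributeList attributeList) {
    super.create(attributeList);
    // Fetch the key from the sequence now, instead of letting a database
    // trigger assign it at commit time, so the new master row has a valid
    // primary key when detail rows are created against it.
    SequenceImpl seq = new SequenceImpl("MASTER_SEQ", getDBTransaction());
    setAttribute("MasterId", seq.getSequenceNumber());
}
```

With the key populated at row creation, the association can resolve the owning entity immediately and the JBO-25030 never fires.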