Saturday, February 27, 2010

How To Traverse ADF Tree

If you are wondering how to delete nodes from an ADF tree, there is a Frank Nimphius blog post - ADF Faces RC: Single row / Multi row delete from a tree component. However, Frank says his blog entry is still a raw diamond that needs some polishing. In his example, if you remove a root node, all of its child nodes still remain in the database, which leaves the tree hierarchy broken. I decided to polish that raw diamond and describe how you can traverse an ADF tree and remove all selected nodes together with their children.

Download the sample application - TreeTraversal.zip. This sample implements an ADF tree traversal algorithm without recursion. The ADF tree component provides a Java API to get the parent node of the current node, which makes it possible to avoid recursion.

I have defined a tree binding and created an action listener method, where the tree traversal code is implemented:


The tree traversal algorithm scans the selected nodes and walks through all of their child nodes. When it reaches the last child in a tree branch, it steps back up until it finds the next sibling. At the end it returns to the initial node where the traversal was started - the root node of the selection.
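
To give an idea of the shape of this listener, here is a minimal sketch, assuming the af:tree is bound to the bean as a RichTree and relying on the two helper methods described below (method and variable names are illustrative - the exact code is in the sample):

import java.util.Iterator;
import javax.faces.event.ActionEvent;
import oracle.adf.view.rich.component.rich.data.RichTree;
import oracle.jbo.uicli.binding.JUCtrlHierNodeBinding;
import org.apache.myfaces.trinidad.model.RowKeySet;

private RichTree tree; // af:tree component binding (assumed)
public void setTree(RichTree tree) { this.tree = tree; }
public RichTree getTree() { return tree; }

public void deleteSelectedNodes(ActionEvent actionEvent) {
    RowKeySet selection = getTree().getSelectedRowKeys();
    for (Iterator keys = selection.iterator(); keys.hasNext();) {
        getTree().setRowKey(keys.next());
        JUCtrlHierNodeBinding startNode = (JUCtrlHierNodeBinding) getTree().getRowData();
        // Descend to the leftmost leaf of the selected branch first
        JUCtrlHierNodeBinding node = startNode;
        while (getFirstChild(node) != null) {
            node = getFirstChild(node);
        }
        // Post-order walk: children are visited (and can be removed) before their parents
        while (node != null) {
            System.out.println("Traversed node: " + node.getRow().getAttribute(0));
            // node.getRow().remove(); - intentionally commented out, see below
            if (node == startNode) {
                node = null; // back at the initially selected node - done
            } else if (getNextSibling(node) != null) {
                node = getNextSibling(node);
                while (getFirstChild(node) != null) {
                    node = getFirstChild(node); // dive into the next sibling branch
                }
            } else {
                node = node.getParent(); // no more siblings - step back up
            }
        }
    }
}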

I'm using two helper methods, one to get the first child and another to get the next sibling. It's plain ADF tree Java API, nothing special:
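
A possible sketch of those two helpers, based on the JUCtrlHierNodeBinding API (the real implementation ships with the sample):

import java.util.List;
import oracle.jbo.uicli.binding.JUCtrlHierNodeBinding;

private JUCtrlHierNodeBinding getFirstChild(JUCtrlHierNodeBinding node) {
    List children = node.getChildren();
    if (children == null || children.isEmpty()) {
        return null; // leaf node
    }
    return (JUCtrlHierNodeBinding) children.get(0);
}

private JUCtrlHierNodeBinding getNextSibling(JUCtrlHierNodeBinding node) {
    JUCtrlHierNodeBinding parent = node.getParent();
    if (parent == null) {
        return null; // top of the hierarchy - no siblings
    }
    List siblings = parent.getChildren();
    int index = siblings.indexOf(node);
    if (index < 0 || index == siblings.size() - 1) {
        return null; // last child in this branch
    }
    return (JUCtrlHierNodeBinding) siblings.get(index + 1);
}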


Let's see how it works. We can select multiple nodes; the children of all selected nodes will be traversed:


I intentionally commented out node.getRow().remove() in my sample, just to prevent you from deleting all nodes by mistake :-) Traversed nodes are reported in the log:


It starts from the first selection and walks from 100 to 102, reports parent node 90, then moves to the second selection, walks through it and reports parent node 110.

You can also select a node that contains multiple branches:


Tree traversal works here as well - it enters the first branch, traverses it and moves to the next sibling branch. The whole report for the previous selection:


If you are planning to traverse large tree branches, you should increase the RangeSize property of the ADF tree binding. The default value is 25, which means only 25 nodes are kept in memory, so you will not be able to traverse the whole hierarchy. The default RangeSize also renders large tree hierarchies pretty slowly, so I recommend increasing it to at least 500:


This keeps 500 nodes in memory and allows tree operations such as Expand All Nodes and Expand All Nodes Below to perform faster.

Wednesday, February 24, 2010

Integration in Oracle ADF with ADF Task Flows and Dynamic Regions

In my last post I described the Dynamic Dashboard component, which allows you to integrate separate applications using ADF regions. If we have fragments, we can easily drag and drop them into the dashboard component as regions. You can read more about how to integrate ADF applications in my integration post series. Depending on your requirements, you may want to implement dynamic regions and integrate your ADF applications into a single main application. I had tried this before, but it was not working - until a fellow developer from sunny South Africa logged it (thanks to him) on Oracle Metalink (SR 3-1354209241) and got a reply. Today I will describe how you can use Dynamic Regions for ADF application integration.

Download the sample application - ADFIntegrationRegions.zip. This sample implements four separate applications, each containing JSF Fragments and ADF Task Flows. There is also one main application, where all four are integrated through ADF Regions included from ADF Libraries.

While talking with developers, I have often noticed that people don't understand how to include ADF JAR libraries into a project. So, if we have a list of libraries in the Resource Catalog:


It is enough to right-click the available library and choose the Add To Project option (of course, you should select the ViewController project first). Or you can simply drag and drop an ADF Task Flow onto a JSPX page and choose the Create Dynamic Region option. Once the libraries are added, you should see an ADF Library entry under the imported libraries. In this example, the ADF Library contains four JARs:


The main application doesn't have any Model implementation; however, because I'm using an ADF database connection, I must enable Business Components anyway. Just go to the Model project properties and enable it in the Business Components section:


Ok, now let's talk about the main thing - how to make it work with Dynamic Regions. When you drag and drop an ADF Task Flow and create a Dynamic Region, you are asked to create a managed bean, where the Dynamic Regions will be activated:


And here is the main problem - it creates this bean in BackingBean scope. However, Dynamic Regions do not work well in BackingBean scope (check the Oracle Metalink SR mentioned above):


In order to make it work, the managed bean must be declared in Page Flow scope:


Yes, it's that simple - just change the generated bean scope and it starts to work. The integration itself is pretty simple; we have one region declared on the page:


This region is initialized dynamically from the managed bean (Page Flow scope) and populated with Dynamic Regions based on the user's menu selection:
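
For completeness, the generated bean follows the standard dynamic region pattern - it stores the current task flow id and hands it to the region through TaskFlowId.parse(). A rough sketch, with illustrative task flow names (the real bean, registered in Page Flow scope, is in the sample):

import oracle.adf.controller.TaskFlowId;

public class MainRegionBean {
    // Task flow document paths are illustrative - use the ones packaged in your ADF Libraries
    private String taskFlowId =
        "/WEB-INF/departments-task-flow-definition.xml#departments-task-flow-definition";

    // The af:region taskFlowId property points to this getter
    public TaskFlowId getDynamicTaskFlowId() {
        return TaskFlowId.parse(taskFlowId);
    }

    // Menu actions simply switch the task flow rendered inside the single region
    public String showDepartments() {
        taskFlowId = "/WEB-INF/departments-task-flow-definition.xml#departments-task-flow-definition";
        return null;
    }

    public String showJobs() {
        taskFlowId = "/WEB-INF/jobs-task-flow-definition.xml#jobs-task-flow-definition";
        return null;
    }
}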


The menu lists four regions, included from the separate applications:


If Departments is selected, the Dynamic Region for Departments is opened:


The same works for Jobs:

Saturday, February 20, 2010

Dynamic Flying Dashboard UI Shell

Today's post is the next in my ADF integration series - Integration. The sample application I'm going to describe is based on the Dynamic Dashboard demo code from the Oracle ADF Faces Components demo (see Visual Designs).

In one of my projects, I got a design requirement to develop the entire application with regions, practically using only one JSPX page. The different regions were implemented in separate applications and integrated into the main application using ADF JAR libraries. Logically, I should not have to develop any Dynamic Dashboard UI Shell and could just use the ADF Dynamic Regions functionality. Unfortunately, in the current JDev 11g PS1 release there are problems with Dynamic Region rendering and usage - see the Oracle Metalink request SR 3-1354209241 logged by a fellow developer. So I decided to implement a UI Shell that renders static ADF Regions, but in a dynamic way.

Download the sample application - ADFIntegrationDashboard.zip. This sample contains the main application, called DynamicDashboard, where ADF Regions from separate applications are integrated. I developed four separate applications with JSF Fragments and packaged them into ADF JAR libraries:


Those libraries are imported into the DynamicDashboard application:


You can see that the Model project in DynamicDashboard is empty. That's correct, because I'm using it only to integrate content from other applications. The ViewController contains one JSPX page, where the Dynamic Dashboard is implemented, and a Toolbar declarative component with toolbar elements. You will find DemoDashboardBean there as well; it is where all the drag and drop logic is implemented. The same bean controls the minimize, maximize and restore operations applied to the Dynamic Dashboard elements.

I'm not going to describe all the technical specifics of the Dynamic Dashboard; its code is quite straightforward. I took the example from the Oracle ADF Faces Components demo and removed the things I didn't want, so now it is more lightweight and optimized for ADF Region integration. Instead, I will focus on functionality.

Here you can see the main Dynamic Dashboard screen - four integrated ADF Regions, a placeholder on the left to keep minimized regions, and New Job, Job Distribution, Assigned Jobs, Show All and Show None links at the top. These links allow showing predefined sets of regions.


If the user wants to hide any available region, he can just drag it and drop it on the left. The Locations and Employees regions are minimized now; only Departments and Jobs are displayed:


The user can maximize a displayed region and then restore it to the original view:


Depending on the use case, the Dynamic Dashboard can potentially replace a UI design based on different tabs.

Sunday, February 14, 2010

Optimizing Oracle ADF Application Pool

If you have read my previous post - Monitoring Data Source Connection Leaks, you already have a clear idea of how to monitor data source connection pool usage by an Oracle ADF application. Basically, an Oracle ADF application works with two pools - the Application pool and the Database pool. The first pool optimizes application work on the middle tier, while the second pool takes care of database connections. Today I will describe how you can control the JNDI data source connection pool.

As a reference, please check Steve Muench's article - Understanding Application Module Pooling Concepts and Configuration Parameters. This post is based mostly on the Application Pool Cleanup Parameters section. Download the sample application with optimized data source connection pool usage - DatabaseConnectionLeak.zip.

The main topic today is how to control data source connection pool growth. Most ADF 11g applications are developed using JSF Fragments, and usually separate JSF Fragments are based on separate Application Modules. This means that if the main page contains 10 JSF Fragments, 10 Application Modules will be involved during main page rendering. If there are 10 users, 10 x 10 = 100 connections will be created in the data source connection pool. I don't want to say it's bad to have many Application Modules - it's good, because the workload will be distributed across different Application Modules. What is not good is that, by default, data source connections are returned to the available pool only after a really long time. Systems with many concurrent users can run out of available data source connections.

The first thing a WebLogic administrator will do is enable Inactive Connection Timeout, hoping this will return inactive data source connections back to the available pool. I have set it to 60 seconds:


My system contains two Application Modules, both used from the same main page:


When a user opens the main page, the data source connection monitoring shows the following picture:


The Connection Pool Size increases to 2 connections - that's correct, because the main page involves two Application Modules. The Connections in Use pool contains two active connections. As mentioned above, Inactive Connection Timeout was set to 60 seconds, and after 60 seconds we can see that both reserved connections were removed from the Connections in Use pool. That's correct, but at the same time Connection Leaks were reported. This means we can't use the WebLogic Inactive Connection Timeout option to manage data source connections. The ADF application keeps the data source connection and WebLogic removes it by force - that's why leaked connections are reported. You can set Inactive Connection Timeout to be triggered once per day, to ensure any genuinely leaked connections are removed.

What we can do instead is tune the Application Pool parameters. The Connection Pool section is disabled, because we are using a JNDI Data Source defined on WebLogic:


I'm interested in three parameters from the Application Pool:

  1. Idle Instance Timeout (10 minutes default)
  2. Pool Polling Interval (10 minutes default)
  3. Maximum Instance Time to Live (available through the Properties tab, 1 hour default)
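
For reference, if you prefer to edit the Application Module configuration in bc4j.xcfg directly instead of through the editor, these three settings should correspond (if I remember the property names correctly) to jbo.ampool.maxinactiveage, jbo.ampool.monitorsleepinterval and jbo.ampool.timetolive, all specified in milliseconds.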

For systems with many concurrent users issuing short requests, the default values are probably not suitable. Let's change Idle Instance Timeout to 1 minute and Pool Polling Interval to 0.5 minute (as you can see on the screenshot above). This affects the Application Pool, but not the data source connection pool - the two reserved connections remain as Connections in Use, even when there are no active users online:


Such behavior is not acceptable; it will expand the data source connection pool very fast and eventually block your application.

There is the Maximum Instance Time to Live property, which controls how long an Application Module instance should stay available in the Application Pool:


The default value is 1 hour; while this is acceptable, in most systems it will be too long. While an Application Module instance is alive, it keeps an open database connection in use. Yes, new users will reuse Application Module instances available in the pool, and this prevents database connection growth. However, with many concurrent users and many Application Modules involved, 1 hour for Maximum Instance Time to Live can be too long.

You can test Maximum Instance Time to Live with a 2 minute value. I have enabled it only for one of my Application Modules. Now we see the following picture - 2 database connections reserved and 1 returned to the available connections pool after 2 minutes:


After 2 minutes, the user comes back to his screen and performs some action related to data refresh:


The second data source connection is reserved again and remains reserved for 2 minutes after the last user activity:


While Idle Instance Timeout and Pool Polling Interval help you optimize the Application Pool, the Maximum Instance Time to Live property helps optimize the entire application workload.

Wednesday, February 10, 2010

Monitoring Data Source Connection Leaks

When it comes to production, you can often face problems with a growing number of Data Source connections. I see this as a frequent problem - WebLogic admins complain that the ADF application is using 1000 Data Source connections, while ADF developers postpone code quality checks until after the deadline :-) And then they start to blame ADF. There is no need for this - just monitor your system and double check your code.

Today I will demonstrate how you can monitor Data Source usage with Oracle Enterprise Manager 11g. I will show a Data Source connection leak case; in my next post I will explain why you can get leaked Data Source connections in ADF (it will be an example of a development bad practice).

Among the many performance monitoring options available in Oracle Enterprise Manager 11g, you can monitor your application's Data Source. I prefer to check four parameters:

  1. Connection Pool Size changes
  2. Available Connections changes
  3. Number of Connections in Use
  4. Connection Leaks

You can see that the HrDSDS Data Source current Pool Size is 1, there is 1 connection available and 0 connections in use. There is no connection leaking at the moment either:


As the next step, I opened my ADF application. This is immediately reflected in the Data Source graph - 0 connections available, 1 connection in use:


In order to optimize Data Source usage, you need to enable the Inactive Connection Timeout option in the WebLogic Console. With this option enabled, WebLogic will try to return inactive reserved connections from the Connections in Use pool to the Available Connections pool. This prevents Connection Pool Size growth. Just set a positive number of seconds:


In my case, I set it to 30 seconds, because I'm monitoring the application for testing purposes. You should set it to a longer period, maybe 600 seconds.

Most importantly, if a developer forgets to close a Data Source connection, or because of an ADF development bad practice, WebLogic will report an exception when trying to move the leaked connection back to the Available Connections pool. The leaked connection will be returned to the Available Connections pool forcibly. In my example you can see that after 30 seconds a leaked Data Source connection was reported, but it was still removed from the Connections in Use pool and returned to the Available Connections pool - meaning it will be reused later:


How can you access the Data Source monitoring options? Just open the Performance Summary screen for the domain where the ADF application is deployed:


In the Metric Palette, expand the Datasource Metrics node, and there you will find the monitoring options for your Data Source:

Sunday, February 7, 2010

WebCenter Suite 11g and ADF 11g Consulting in Middle East

Recently I was busy in the Middle East, consulting on and deploying a production system based on WebCenter Suite 11g and ADF 11g technologies. This system will have public access - expect the URL to be posted soon! :-)

Nature is beautiful here - Red Sea coast:

Saturday, February 6, 2010

Custom Attribute to Pass ADF Button Key

I got a question based on my previous post - CRUD Operations in Oracle ADF 11g Table Using PopUp Component. A blog reader was asking if it is possible to identify an ADF button component in a Backing Bean without checking the component Id. The answer is yes, it is possible - you just need to use the JSF attribute component. You can declare a JSF attribute for the ADF button and pass a button key through this attribute:


The value passed through the JSF attribute can then be accessed in the Backing Bean. You just need to specify the JSF attribute name and value. In this example, I will use the insert value in order to identify the Insert button:


In the Backing Bean, you need to access the UI Component object (the ADF button in this case) and use the getAttributes() method to get a map of the available attributes. Retrieve the defined JSF attribute using its name:
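
To sketch the idea (the attribute name buttonKey and the listener name used here are illustrative - the actual names are in the sample):

import javax.faces.event.ActionEvent;
import oracle.adf.model.BindingContext;
import oracle.binding.BindingContainer;
import oracle.binding.OperationBinding;

public void dialogButtonListener(ActionEvent actionEvent) {
    // Read the custom JSF attribute declared on the pressed button
    Object buttonKey = actionEvent.getComponent().getAttributes().get("buttonKey");
    if ("insert".equals(buttonKey)) {
        // Insert button - create a new row through the CreateInsert operation binding
        BindingContainer bindings = BindingContext.getCurrent().getCurrentBindingsEntry();
        OperationBinding createInsert = bindings.getOperationBinding("CreateInsert");
        createInsert.execute();
    } else {
        // Any other button - keep the current row and open it for editing
    }
}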


If the Insert button is pressed, we check the attribute and invoke the CreateInsert operation:


Otherwise, the row is opened for editing:


Download the sample application - TableDialogEdit3.zip.

Thursday, February 4, 2010

Default Value for Date Field in Query Criteria

I got a question about how to assign a default value to a Date type field available in Query Criteria. The developer had a requirement to initialize the Date type query field with a date equal to 10 days before the current date. This requirement is very simple for ADF and can be implemented in 5 minutes without writing any code.

Download the sample application - DateSearch.zip. This sample automatically returns a default value for the HireDate attribute on page load:


In order to implement this requirement, open the View Criteria definition and assign a Bind Variable parameter to the HireDate attribute. Very important - make sure you uncheck the Ignore Null Values checkbox:


This initializes the Bind Variable with a Null value when the user searches with an empty HireDate value. Otherwise you would get a Missing IN or OUT parameter SQL error.

To get the correct default date value, you can use a Groovy expression for the Bind Variable default value:
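
The exact expression is shown on the screenshot, but it should be something along the lines of adf.currentDate - 10, which evaluates to a date 10 days before the current date.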

Monday, February 1, 2010

Installing ADF/WebCenter 11g on 64-bit Platforms

Today we were creating a WebLogic cluster on two machines for an ADF/WebCenter 11g application installation. The hardware is 64-bit, each machine with 8 processor cores and 32 GB RAM - sounds pretty powerful. There were no problems with clustering and deployment; however, we faced a problem while testing WebCenter Spaces - the welcome page just wasn't compiling and was throwing the following error when we tried to access it:

Servlet failed with Exception
java.lang.StringIndexOutOfBoundsException: String index out of range: -2

At first, we thought that maybe the WebCenter Suite 11g runtime does not work on 64-bit platforms. So we uninstalled all servers and did a fresh install using only the ADF 11g runtime libraries. We deployed a simple ADF 11g application and got the same exception. It became obvious that it was not a WebCenter Suite 11g problem.

After some brainstorming and a couple of tea/coffee breaks, we realized that the 64-bit JDK from Oracle (Sun) we were using was newer than the 32-bit JDK packaged with the Oracle WebLogic Server installation. We were using the latest jdk160_18, while Oracle WebLogic is packaged with the 32-bit jdk160_14 version. So we downloaded the 64-bit jdk160_14 and reinstalled the cluster. This time the ADF/WebCenter 11g deployment worked without any problems on the 64-bit cluster environment.

Lesson learned: when installing Oracle WebLogic on a 64-bit platform, you need to download the Oracle WebLogic installation that comes without Java. Make sure the 64-bit Java version you use is the same version as the 32-bit Java packaged with the standard Oracle WebLogic installs.