
How to create ADF components at runtime?

Quite easy, actually! There are two ways to do it: either we create the component in a managed bean and add it to the children of an existing component, or we declare it directly in our .jspx or .jsff page.

1) Creating the component in a managed bean.

In the code below we create a new ShowDetailItem, which we will add to an existing panelTabbed component.

private UIComponent createComponent() {
    UIComponent componentToAdd = null;

    // Create a new ShowDetailItem and set its properties.
    RichShowDetailItem item = new RichShowDetailItem();
    item.setDisclosed(true);
    item.setText("new tab");
    componentToAdd = item;

    // While we are at it, create a new inlineFrame which will be
    // nested inside the ShowDetailItem.
    RichInlineFrame frame = new RichInlineFrame();
    frame.setSource("http://oracle.com");

    // Add the inlineFrame to the children of the ShowDetailItem.
    componentToAdd.getChildren().add(frame);

    return componentToAdd;
}

Now call this method from the event listener in which you want to add the ShowDetailItem:
....
....
RichPanelTabbed mainPanel = getMainPanelTabbed();
UIComponent componentToAdd = createComponent();
mainPanel.getChildren().add(componentToAdd);
....

And we are done! Now whenever this event listener is invoked, you will see a new tab in your panelTabbed layout. If the change does not show up on the page, you may also need to mark the panel for partial refresh, for example via AdfFacesContext.getCurrentInstance().addPartialTarget(mainPanel).
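The ADF classes above only compile inside a JDeveloper/ADF project, so as a rough framework-free sketch of the same parent/child mechanics, here is a plain-Java mock. The MockComponent and RuntimeTreeSketch classes are invented for illustration and are not part of ADF; the point is that "creating a component at runtime" boils down to adding to a parent's live child list, exactly as getChildren().add(...) does above.

```java
import java.util.ArrayList;
import java.util.List;

// MockComponent stands in for UIComponent: each component exposes a
// mutable child list, and mutating that list changes the component tree.
class MockComponent {
    private final String name;
    private final List<MockComponent> children = new ArrayList<>();

    MockComponent(String name) { this.name = name; }

    List<MockComponent> getChildren() { return children; }
    String getName() { return name; }
}

public class RuntimeTreeSketch {
    // Mirrors createComponent(): build a tab with a nested frame.
    static MockComponent createComponent() {
        MockComponent tab = new MockComponent("showDetailItem");
        MockComponent frame = new MockComponent("inlineFrame");
        tab.getChildren().add(frame);
        return tab;
    }

    public static void main(String[] args) {
        MockComponent panelTabbed = new MockComponent("panelTabbed");
        // Mirrors the event listener: attach the new tab to the panel.
        panelTabbed.getChildren().add(createComponent());
        System.out.println(panelTabbed.getChildren().size());            // prints 1
        System.out.println(panelTabbed.getChildren().get(0).getName()); // prints showDetailItem
    }
}
```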

2) The second way is also very easy, and it is preferred when you have to create more than one component of the same type. So if I want to create multiple tabs at runtime for my panelTabbed component, I should use this method.

This is done by using <af:iterator> in your page. See the snippet below:

<af:iterator var="row"
             value="#{bindings.selectedFormsIterator.allRowsInRange}">
    <af:showDetailItem text="#{bindings.selectedFormsIterator.currentRow.dataProvider.formName}">
        <af:inlineFrame source="http://someurl.com/test.jsp?Form=#{bindings.selectedFormsIterator.currentRow.dataProvider.formName}"/>
    </af:showDetailItem>
</af:iterator>

Basically, af:iterator is a variant of af:forEach and is the suggested way to iterate when you are stamping out multiple components.

You just need to create the iterator binding in your page definition, point your iterator at it, and you are ready to go!
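For reference, the iterator binding lives in the page's pageDef file under executables. A minimal sketch might look like the fragment below; the view object name SelectedFormsVO1 and data control name AppModuleDataControl are assumptions for illustration, while selectedFormsIterator matches the EL used in the snippet above. RangeSize="-1" makes allRowsInRange return every row.

```xml
<executables>
    <iterator id="selectedFormsIterator"
              Binds="SelectedFormsVO1"
              DataControl="AppModuleDataControl"
              RangeSize="-1"/>
</executables>
```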

Simple isn't it?

