
ADF: Using Input Range Slider (RichInputRangeSlider)

The Input Range Slider is one of the coolest components in ADF: not only is it very intuitive for users, it also gives a 'cool' look to the page :)



So how do we get the minimum and maximum values selected on the slider?

Among the properties of this component are 'minimum' and 'maximum', and then there is 'value'. This causes some confusion about how we can get the values selected on the slider. It is not half as complex as it seems!

Let's focus first on 'value'. The value of this component is returned in the form of an oracle.adf.view.rich.model.NumberRange object:

        oracle.adf.view.rich.model.NumberRange minMax =
            (oracle.adf.view.rich.model.NumberRange) getInputRangeSlider().getValue();

You can extract both the minimum and maximum values from this object like this:


        int startValue = minMax.getMinimum().intValue();
        int endValue = minMax.getMaximum().intValue();

And that's it: you have both values returned by the slider.
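Putting it together, here is a minimal sketch of a valueChangeListener you could register on the slider. The method name rangeChanged and the component binding getInputRangeSlider() are assumptions (such a binding is typically generated for you when you bind the component to your backing bean):

        // Hypothetical listener sketch; assumes a component binding
        // 'inputRangeSlider' exists in this backing bean.
        public void rangeChanged(javax.faces.event.ValueChangeEvent event) {
            oracle.adf.view.rich.model.NumberRange minMax =
                (oracle.adf.view.rich.model.NumberRange) getInputRangeSlider().getValue();
            int startValue = minMax.getMinimum().intValue();
            int endValue = minMax.getMaximum().intValue();
            System.out.println("Selected range: " + startValue + " to " + endValue);
        }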

The 'minimum' and 'maximum' attributes define the minimum and maximum values that your slider will show. By default, minimum is 0 and maximum is 10, so the slider shows numbers from 0 to 10. You can also set your own values for both attributes, for example via EL expressions:


minimum="#{pageFlowScope.MyBackingBean.minimumValue}"
maximum="#{pageFlowScope. MyBackingBean.maximumValue}"


and at runtime your slider will show the values set by these EL expressions.
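For illustration, here is a minimal sketch of what the pageFlowScope bean behind those expressions might hold; the bean name MyBackingBean and the initial values 0 and 100 are just examples, but the property names must match the EL above:

        // Illustrative properties in MyBackingBean (pageFlowScope).
        private int minimumValue = 0;    // example lower bound
        private int maximumValue = 100;  // example upper bound

        public int getMinimumValue() {
            return minimumValue;
        }

        public int getMaximumValue() {
            return maximumValue;
        }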

Other common properties include 'orientation', which switches the slider from horizontal to vertical, and the increment-related properties, which let you change the minimum increment value that defines the smallest allowed difference between the selected minimum and maximum values.
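As a sketch, here is how the tag might look with a vertical orientation and a custom increment; the attribute names used here are assumed from the standard af:inputRangeSlider tag, so verify them against the tag documentation for your ADF version:

        <!-- Illustrative usage; rangeChanged is the hypothetical listener
             sketched earlier in this post. -->
        <af:inputRangeSlider id="irs1"
                             orientation="vertical"
                             minimum="#{pageFlowScope.MyBackingBean.minimumValue}"
                             maximum="#{pageFlowScope.MyBackingBean.maximumValue}"
                             minimumIncrement="5"
                             valueChangeListener="#{pageFlowScope.MyBackingBean.rangeChanged}"
                             autoSubmit="true"/>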

So go on. Take this range slider out for a spin!
