Java Design Patterns : Adapter Pattern

Whenever you read about the Adapter pattern, you will come across the example of electric adapters. Be it a 3-pin to 2-pin adapter, a round-pin to flat-pin adapter, or an AC-to-DC adapter, its purpose is to act as an intermediary between electrical equipment and the socket.


Above is a representation of a 3-pin to 2-pin adapter: it exposes a 3-pin socket that takes in a 3-pin plug and gives a 2-pin output. (These 2 pins can be plugged into any 2-pin socket.)

Note that the adapter contains a 3-pin plug inside, and the behavior of its 3 pins is altered to suit the behavior of the adapter pins, which fit into the 2-pin socket.

Now if we translate this into Java concepts, the 3-pin socket is an interface, and every 3-pin plug implements this interface. Let's call this the 'adaptee' interface. Similarly, every 2-pin socket is also an interface, implemented by every 2-pin plug. The 2-pin socket (which our adapter is trying to be compatible with) is our target interface.

Let's see some code:

This is our adaptee interface ThreePinSocket:

public interface ThreePinSocket{
 public double getPinOneOutput(); 
 public double getPinTwoOutput();
 public double getPinThreeOutput();
}

and ThreePinPlug implements it (assuming that the top pin gives 1/2 of the total input voltage and the two bottom pins give 1/4 each):


public class ThreePinPlug implements ThreePinSocket{
  int totalInputVoltage;
  
  public ThreePinPlug(int voltage){
    totalInputVoltage = voltage;
  }  
  public double getPinOneOutput(){
    return totalInputVoltage/2.0; // divide by 2.0, not 2, to avoid integer division for odd voltages
  }
  public double getPinTwoOutput(){
    return totalInputVoltage/4.0;
  }
  public double getPinThreeOutput(){
    return totalInputVoltage/4.0;
  }
}

This is our target interface TwoPinSocket:

public interface TwoPinSocket{
 public double getPinOneOutput(); 
 public double getPinTwoOutput();
}

implemented by TwoPinPlug (assuming that both pins give 1/2 of the total input voltage each):

public class TwoPinPlug implements TwoPinSocket{  
  int totalInputVoltage;
  
  public TwoPinPlug(int voltage){
    totalInputVoltage = voltage;
  }  
  public double getPinOneOutput(){
    return totalInputVoltage/2.0; // 2.0 avoids integer division for odd voltages
  }
  public double getPinTwoOutput(){
    return totalInputVoltage/2.0;
  }
}

Now let us write the code for our adaptor. Note that it holds a ThreePinSocket, and as we said earlier, we are altering the behavior of its pins: its second pin now gives 1/2 of the total voltage, just like any 2-pin plug.

public class ThreeToTwoPinAdaptor implements TwoPinSocket{
  ThreePinSocket threePinSocket; // the adaptee we are wrapping
  
  public ThreeToTwoPinAdaptor(ThreePinSocket threePinSocket){
    this.threePinSocket = threePinSocket;
  }
  
  public double getPinOneOutput(){
    return threePinSocket.getPinOneOutput(); // already 1/2 of total, pass it through
  }
  public double getPinTwoOutput(){
    return 2*(threePinSocket.getPinTwoOutput()); // 2 * (1/4 of total) = 1/2 of total
  }  
}

The last piece of this puzzle is the client.

public class AdaptorClient{
  public static void main(String args[]){
    int inputVoltage = 100;
    
    // create a 2-pin plug
    TwoPinSocket twoSock = new TwoPinPlug(inputVoltage);    
    getTwoPinOutput(twoSock);
    
    // now let's create a 3-pin plug
    ThreePinSocket threeSock = new ThreePinPlug(inputVoltage);
    System.out.println("Pin1 Output: "+threeSock.getPinOneOutput()+" Pin2 Output: "+threeSock.getPinTwoOutput()+" Pin3 Output: "+threeSock.getPinThreeOutput());
    
    // now we create an adaptor using the 3-pin plug created earlier
    ThreeToTwoPinAdaptor adaptor = new ThreeToTwoPinAdaptor(threeSock);

    // note this adaptor can easily be used in place of a 2-pin plug!
    getTwoPinOutput(adaptor);    
  }
  
  public static void getTwoPinOutput(TwoPinSocket twoPinSocket){
    System.out.println("Pin1 Output: "+twoPinSocket.getPinOneOutput()+" Pin2 Output: "+twoPinSocket.getPinTwoOutput());
  }
}

Output:

Pin1 Output: 50.0 Pin2 Output: 50.0
Pin1 Output: 50.0 Pin2 Output: 25.0 Pin3 Output: 25.0
Pin1 Output: 50.0 Pin2 Output: 50.0


Notice that getTwoPinOutput() expects a TwoPinSocket, and our adaptor fits in so well!

This was the Adapter pattern in action! Let's define it: the Adapter pattern converts the interface of a class into another interface that clients expect.
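The same pattern shows up in the JDK itself: the legacy Enumeration interface (the adaptee) can be adapted to the newer Iterator interface (the target) so that modern client code can consume old collections. Here is a minimal sketch; the class name EnumerationIteratorAdapter is my own, while Enumeration, Iterator, and Vector are standard JDK types:

```java
import java.util.Enumeration;
import java.util.Iterator;
import java.util.Vector;

// Adapter: implements the target interface (Iterator) and delegates
// each call to the wrapped adaptee (Enumeration), just like our
// ThreeToTwoPinAdaptor delegates to a ThreePinSocket.
public class EnumerationIteratorAdapter<E> implements Iterator<E> {
  private final Enumeration<E> enumeration; // the adaptee

  public EnumerationIteratorAdapter(Enumeration<E> enumeration) {
    this.enumeration = enumeration;
  }

  public boolean hasNext() {
    return enumeration.hasMoreElements();
  }

  public E next() {
    return enumeration.nextElement();
  }

  public static void main(String[] args) {
    Vector<String> legacy = new Vector<>();
    legacy.add("a");
    legacy.add("b");
    // legacy.elements() returns an Enumeration; the adapter lets us
    // treat it as an Iterator
    Iterator<String> it = new EnumerationIteratorAdapter<>(legacy.elements());
    while (it.hasNext()) {
      System.out.println(it.next());
    }
  }
}
```

As with the plug example, the client only sees the target interface and never knows an adaptee is doing the work underneath.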


Simple and pretty useful, isn't it?



