
Java implementation of Binary Heap (source code)

Binary heaps are one of the simplest and most popular ways to implement a priority queue. You can find plenty of tutorials explaining the algorithms for insertion, deletion of the maximum (or minimum) element, and the use of a heap to implement a priority queue.
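The code below stores the heap in a plain array with 0-based indexing: the children of the node at index i sit at indices 2*i + 1 and 2*i + 2, and its parent sits at (i - 1)/2. Here is a minimal sketch of that index arithmetic (the helper names are illustrative only, not the ones used in the full listing):

// Index arithmetic for a 0-based, array-backed binary heap (illustrative helper names).
static int parentOf(int i)     { return (i - 1) / 2; }  // parent of node i (for i > 0)
static int leftChildOf(int i)  { return 2 * i + 1; }    // left child of node i
static int rightChildOf(int i) { return 2 * i + 2; }    // right child of node i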

Here is the complete Java code for these operations:



import java.util.Arrays;

class BinaryHeapImpl {
  private int[] arr;
  private int heapSize = 0;
  
  BinaryHeapImpl(int size){
    arr = new int[size];
    //initialize the array with -1 to mark empty slots
    Arrays.fill(arr, -1);
  }
  
  private int getParentIndex(int index){
    System.out.println("finding parent of index "+index);
    if(index == 0)
      return 0;
    //for a 0-based array the parent of the node at index k is at (k-1)/2
    return (index%2 == 0)?((index/2) - 1):index/2;
  }
  
  public void insert(int element){
    if(heapSize == arr.length){
      throw new IllegalStateException("Heap is full");
    }
    arr[heapSize] = element;
    //compare the new element with its parent at (heapSize-1)/2 and swim it up if it is bigger
    int parentIndex = getParentIndex(heapSize);
    if(element > arr[parentIndex]){
      swim(heapSize, parentIndex);
    }
    heapSize++;
  }
  
  //move the element at 'index' up until it is no longer bigger than its parent
  private void swim(int index, int indexParent){
    System.out.println("Swim :"+ arr[index]+"("+index+")" +" parent " + arr[indexParent] +"("+indexParent+")");
    if(arr[index] > arr[indexParent]){
      exchange(index, indexParent);
      System.out.println("After exchange " +arr[index] +"  "+arr[indexParent]);
      if(indexParent != 0){
        swim(indexParent, getParentIndex(indexParent));
      }
    }
  }
  
  //move the element at 'index' down until it is no smaller than both of its children
  private void sink(int index){
    System.out.println("Sinking "+arr[index]+"("+index+")");
    int left = 2*index + 1;
    int right = 2*index + 2;
    if(left >= heapSize){
      return; //no children, nothing to do
    }
    //pick the bigger child (the right child may not exist)
    int changeIndex = (right < heapSize && arr[right] > arr[left]) ? right : left;
    if(arr[index] < arr[changeIndex]){
      //exchange the smaller parent with its bigger child and keep sinking
      exchange(index, changeIndex);
      sink(changeIndex);
    }
  }
  
  private void exchange(int index1, int index2){
    int temp = arr[index1];
    arr[index1] = arr[index2];
    arr[index2] = temp;
  }
  
  public int getHeapSize(){
    return heapSize;
  }
  
  public void showHeap(){
    System.out.print("\nShowing heap:");
    for(int i = 0; i< heapSize; i++){
      System.out.print(arr[i]+"  ");
    }
  }
  
  public int delMax(){
    if(heapSize == 0){
      throw new IllegalStateException("Heap is empty");
    }
    int max = arr[0]; //this is to be returned
    int lastIndex = heapSize - 1;
    
    //Move the last element to the top, shrink the heap and then sink the new root down
    arr[0] = arr[lastIndex];
    arr[lastIndex] = -1;
    heapSize--;
    sink(0);
    return max;
  }
  
}

public class BinaryHeap{
   
  public static void main(String args[]){
    //Create the heap.
    BinaryHeapImpl bh = new BinaryHeapImpl(10);
    bh.insert(5);
    bh.showHeap();
    bh.insert(4);
    bh.showHeap();
    bh.insert(3);
    bh.showHeap();
    bh.insert(6);
    bh.showHeap();
    bh.insert(2);
    bh.showHeap();
    bh.insert(7);
    bh.showHeap();
    bh.insert(8);
    bh.showHeap();
    bh.insert(1);
    bh.showHeap();
    bh.insert(9);
    bh.showHeap();
    //bh.insert(0);
    System.out.println("Current Heapsize = "+bh.getHeapSize());
    
    System.out.println(bh.delMax());
    bh.showHeap();
    System.out.println(bh.delMax());
    bh.showHeap();
    System.out.println(bh.delMax());
    bh.showHeap();
    System.out.println(bh.delMax());
    bh.showHeap();
    System.out.println(bh.delMax());
    bh.showHeap();
  }
  
  
}
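For comparison, the standard library already ships a heap-backed priority queue: java.util.PriorityQueue. It is a min-heap by default, so a reverse-order comparator is needed to get the max-first behaviour of the class above. A minimal sketch (the class name here is just for the demo):

import java.util.Collections;
import java.util.PriorityQueue;

public class PriorityQueueDemo {
  public static void main(String[] args) {
    // Reverse the natural ordering so poll() returns the largest element first.
    PriorityQueue<Integer> pq = new PriorityQueue<>(Collections.reverseOrder());
    for (int x : new int[]{5, 4, 3, 6, 2, 7, 8, 1, 9}) {
      pq.offer(x);
    }
    while (!pq.isEmpty()) {
      System.out.print(pq.poll() + "  "); // 9  8  7  6  5  4  3  2  1
    }
  }
}

The hand-rolled version above is still worth studying because it makes the swim and sink mechanics explicit, which PriorityQueue hides behind its API.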

Post in the comments if you need any clarification of this code.
