
Hadoop for beginners, by a beginner (Part 1) - Using Cloudera Quickstart VM

So you have read white papers, blogs, and even some books on what Big Data is and how it is transforming the world by giving us insights into data usage through advanced analytical strategies. You might also have read about Hadoop and MapReduce.

But now what? How do you begin? Theory is not going to cut it, right? You want to get your hands dirty, write some code, and set up some clusters, right? Right. So let's start.

Admittedly, Hadoop is intimidating. Apart from requiring a plethora of software (first of all, you need a Linux box!), you also need a 'cluster' of machines, because Hadoop running on a single machine is not what a real-life Hadoop installation looks like. As a beginner, you would rather quickly write a 'Hello World' of Hadoop than spend your time setting up an environment.
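For reference, the 'Hello World' of Hadoop is traditionally word count. Below is a minimal sketch of that logic in Python, written in the style of a Hadoop Streaming job (Streaming lets any program that reads stdin and writes stdout act as a mapper or reducer). This is just an illustration of the MapReduce idea; the function names and sample input are my own, and on a real cluster you would submit the mapper and reducer as separate scripts.

```python
from itertools import groupby

def mapper(lines):
    """Map phase: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Reduce phase: sum the counts per word; input must be sorted by word."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

def word_count(lines):
    # Hadoop sorts mapper output by key between the map and reduce
    # phases (the 'shuffle'); sorted() stands in for that step here.
    return dict(reducer(sorted(mapper(lines))))

if __name__ == "__main__":
    sample = ["hello hadoop", "hello world"]
    for word, total in sorted(word_count(sample).items()):
        print(f"{word}\t{total}")
```

Being able to run the map and reduce logic locally like this, before touching a cluster, is exactly why a beginner-friendly sandbox such as the Quickstart VM is so useful.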

The easiest way to start instantly is to use Cloudera's Quickstart VM for Hadoop (Cloudera is one of the three biggest Hadoop distributors). First, we need to install virtualization software such as Oracle VirtualBox or VMware. Cloudera's Quickstart VM gives you a CentOS (a Linux distribution) installation with the entire suite of Hadoop software, along with 'Cloudera Manager', a browser-based tool to manage this software.

Download and install Oracle VM VirtualBox from here. Once you are done, download Cloudera's Quickstart VM for Oracle VM VirtualBox from here.

Start VirtualBox, click on File -> Import Appliance, and select the Quickstart VM you just downloaded. It will ask you to assign RAM, hard disk space, processors, and a location for the snapshot (the C:\Users directory by default). Ideally, allocate 6-8 GB of RAM, 20-40 GB of disk space, and a minimum of 2 CPU cores.

It will take a few minutes. Once it is done, you will get this screen:


Click on 'Start' button and once the VM is ready you will see this.

Click on 'Launch Cloudera Express' and follow the instructions. Then open the URL for Cloudera Manager in your browser. Congratulations, you have set up your Hadoop 'learning' environment successfully!

You will see various services in a stopped state. Go ahead and start HDFS, YARN, and Hue, in this order (the others can wait until later).

Click on Hue, go to the Hue home page, and click on 'Hue Web UI'. You can now interact with various Hadoop services through a web interface (Hue stands for 'Hadoop User Experience'; it gives you a UI alternative to the command line).

Click on 'File Browser' at the top right of the page. Explore the file browser and try the various actions available. Play around. Give yourself a pat on the back!
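Under the hood, file browsing in Hue goes through HDFS, which also exposes a REST interface called WebHDFS. As a rough sketch, the helper below builds WebHDFS v1 URLs that you could then fetch with any HTTP client. The hostname and port here are assumptions: `quickstart.cloudera` is the Quickstart VM's usual hostname, and 50070 is the default NameNode HTTP port in this Hadoop generation, but your setup may differ.

```python
from urllib.parse import urlencode

def webhdfs_url(path, op, host="quickstart.cloudera", port=50070, **params):
    """Build a WebHDFS v1 URL for the given HDFS path and operation.

    host/port are assumed defaults for the Quickstart VM; adjust to
    match your own environment.
    """
    query = urlencode({"op": op, **params})
    return f"http://{host}:{port}/webhdfs/v1{path}?{query}"

if __name__ == "__main__":
    # List the contents of an HDFS directory (issued as an HTTP GET):
    print(webhdfs_url("/user/cloudera", "LISTSTATUS"))
    # First step of a file upload (WebHDFS CREATE, issued as a PUT):
    print(webhdfs_url("/user/cloudera/hello.txt", "CREATE", overwrite="true"))
```

Trying the same listing both in the File Browser and over WebHDFS is a nice way to convince yourself that Hue is just a friendly face on top of HDFS.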

I will be back with the next steps in Part 2.

