In my previous posts, we have already configured a Hadoop single-node cluster here and executed a Hello World program in MapReduce here. In this post, I am going to show you the data types used in a MapReduce job.

MapReduce is a programming model for processing Big Data and works on top of HDFS; one can say MapReduce is the heart of the Hadoop framework.
Hadoop MapReduce works on key-value pairs. The key can be anything; the only criterion is that the key class must implement the org.apache.hadoop.io.WritableComparable interface and the value class must implement the org.apache.hadoop.io.Writable interface.

Hadoop provides various built-in data types that can be used as keys and values. Apart from these, you can create your own key and value classes by implementing the Writable and WritableComparable interfaces. In this post, I am not going to create custom data types; I will only list the built-in data types used in MapReduce programming, along with a running example. For custom data types, you may follow my next article, Custom Data Type in Hadoop.
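To see what the Writable contract actually does, here is a minimal sketch of the serialize/deserialize round trip Hadoop performs when it moves a key or value between the map and reduce phases. It assumes the hadoop-common jar is on the classpath; the class name WritableRoundTrip is mine for illustration.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;

public class WritableRoundTrip {

 public static void main(String[] args) throws IOException {
  IntWritable original = new IntWritable(42);

  // write() serializes the value to raw bytes, as Hadoop does
  // when shuffling data between mappers and reducers
  ByteArrayOutputStream bytes = new ByteArrayOutputStream();
  original.write(new DataOutputStream(bytes));

  // readFields() rebuilds the value from those bytes into a fresh instance
  IntWritable copy = new IntWritable();
  copy.readFields(new DataInputStream(new ByteArrayInputStream(bytes.toByteArray())));
  System.out.println("round-trip value : " + copy.get()); // 42

  // WritableComparable additionally gives an ordering,
  // which Hadoop uses to sort keys during the shuffle phase
  System.out.println("1 compareTo 2 is negative : "
    + (new IntWritable(1).compareTo(new IntWritable(2)) < 0));
 }
}
```

This is why a key must be WritableComparable while a value only needs Writable: values are merely serialized, but keys are also sorted.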

Writable implementation classes (can be used as a value in MapReduce programming)
  1. ArrayWritable
  2. ArrayPrimitiveWritable
  3. TwoDArrayWritable
  4. AbstractMapWritable
  5. MapWritable
  6. SortedMapWritable
  7. EnumSetWritable
  8. CompressedWritable
  9. VersionedWritable
  10. ObjectWritable
  11. GenericWritable
WritableComparable implementation classes (can be used as a key in MapReduce programming)
  1. BooleanWritable
  2. ByteWritable
  3. ShortWritable
  4. IntWritable
  5. VIntWritable
  6. FloatWritable
  7. LongWritable
  8. VLongWritable
  9. DoubleWritable
  10. NullWritable
  11. Text
  12. BytesWritable
  13. MD5Hash
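To show where these classes actually appear in a job, here is a hedged sketch of a word-count mapper signature. The class name TokenMapper is mine for illustration; it assumes the standard hadoop-mapreduce-client jars are on the classpath.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Input key/value: LongWritable (byte offset), Text (the line).
// Output key/value: Text (the word, a WritableComparable key),
//                   IntWritable (the count, a Writable value).
public class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

 private static final IntWritable ONE = new IntWritable(1);
 private final Text word = new Text();

 @Override
 protected void map(LongWritable key, Text value, Context context)
   throws IOException, InterruptedException {
  // split the input line on whitespace and emit (word, 1) pairs
  for (String token : value.toString().split("\\s+")) {
   word.set(token);
   context.write(word, ONE);
  }
 }
}
```

Note that the Writable instances (word, ONE) are reused across calls instead of allocated per record; that is the idiomatic pattern, since Hadoop serializes the contents at write time.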
Example
package com.javamakeuse.datatype;

import org.apache.hadoop.io.ArrayWritable;
import org.apache.hadoop.io.BooleanWritable;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.MapWritable;
import org.apache.hadoop.io.Text;

public class HadoopDataTypeExample {

 public static void main(String[] args) {
  // String data type
  Text txtString = new Text("String Text");
  // Integer data type
  IntWritable intWrapper = new IntWritable();
  // setting value
  intWrapper.set(4);
  // Long data type
  LongWritable longWrapper = new LongWritable();
  longWrapper.set(10);
  // Boolean data type
  BooleanWritable booleanWrapper = new BooleanWritable();
  booleanWrapper.set(true);
  // Double data type
  DoubleWritable doubleWrapper = new DoubleWritable();
  doubleWrapper.set(99);
  System.out.println("WritableComparable example");
  System.out.println("Text : " + txtString);
  System.out.println("Integer : " + intWrapper);
  System.out.println("Long : " + longWrapper.get());
  System.out.println("Boolean : " + booleanWrapper);
  System.out.println("Double : " + doubleWrapper);

  System.out.println("Writable example");
  // ArrayWritable example
  ArrayWritable aw = new ArrayWritable(LongWritable.class);
  aw.set(new LongWritable[] { longWrapper, new LongWritable(5) });

  for (LongWritable longValues : (LongWritable[]) aw.get()) {
   System.out.println("long value  " + longValues.get());
  }

  // MapWritable example
  MapWritable mw = new MapWritable();
  mw.put(intWrapper, aw);

  System.out.println("Map size : " + mw.size());
 }
}
OUTPUT :
WritableComparable example
Text : String Text
Integer : 4
Long : 10
Boolean : true
Double : 99.0
Writable example
long value 10
long value 5
Map size : 1
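A few of the listed types have behavior worth calling out separately: NullWritable (a zero-byte singleton for when a key or value carries no data), VIntWritable (variable-length integer encoding, smaller on disk for small values), and the mutability of Text. A small sketch, again assuming hadoop-common on the classpath; the class name MoreTypesExample is mine.

```java
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.VIntWritable;

public class MoreTypesExample {

 public static void main(String[] args) {
  // NullWritable is a singleton that serializes to zero bytes,
  // useful when only the key (or only the value) matters
  NullWritable nothing = NullWritable.get();
  System.out.println("NullWritable singleton : " + (nothing == NullWritable.get()));

  // VIntWritable stores small integers in fewer bytes than IntWritable
  VIntWritable vint = new VIntWritable(100);
  System.out.println("VIntWritable : " + vint.get());

  // Text is mutable, unlike java.lang.String, so instances can be reused
  Text t = new Text("first");
  t.set("second");
  System.out.println("Text after set : " + t);
 }
}
```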

Done!! Happy Data Analytics