NoClassDefFoundError or ClassNotFoundException


In this post we will discuss the most frequent error messages, NoClassDefFoundError and ClassNotFoundException, in Hadoop MapReduce job execution, and possible solutions for them.

Error Scenario:

java.lang.ClassNotFoundException or java.lang.NoClassDefFoundError
Error starting MRAppMaster & Container exited with a non-zero exit code 1

When we encounter NoClassDefFoundError or ClassNotFoundException even though the required jar files are added to our build path in Eclipse, there is a shortcut to resolve this error at run time.

Generally in this situation, when the required jar files are added to the build path in Eclipse, we will not receive any compile errors, and when we run the same program from Eclipse itself it also runs without issue.

But this type of exception occurs when we create a jar file from Eclipse and run it from the command line. If we are running it with the java command and no -classpath is specified, we can pass the jar file containing the required classes in the -classpath argument.
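For example, a run that fails on a missing class can be fixed by naming both jars on the classpath explicitly (the jar names and main class here follow the post's example; the exact error text is the JVM's usual form):

```shell
# Running from the jar alone typically fails with something like:
#   java.lang.NoClassDefFoundError: org/json/JSONObject
# Listing both jars in -classpath fixes it (':' is the path separator
# on Linux/macOS; use ';' on Windows):
java -classpath "test.jar:json-20140107.jar" PartitionsByMultipleOutputs
```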

But when we are already running a program from a jar file, we cannot pass a second jar file in the -classpath argument. Consider the scenario where we submit a MapReduce program from a jar file using the hadoop command: in this case there is no -classpath argument through which to specify our required jar file.
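A typical submission looks like this (the main class name follows the post; the input and output paths are illustrative). Note there is nowhere to attach a -classpath flag for a second jar, since hadoop assembles the classpath itself:

```shell
# Submit the MapReduce job packaged in test.jar; any class missing from
# test.jar (and from Hadoop's own classpath) fails at run time.
hadoop jar test.jar PartitionsByMultipleOutputs /user/input /user/output
```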

In this case, if the JVM cannot find the required class definitions inside test.jar at run time, we get the kinds of error messages below. These particular errors occurred because Hadoop could not find the class definitions for the JSONObject and JSONException classes, which are imported by the program PartitionsByMultipleOutputs.java from the post Mapreduce multiple outputs.

The three variants are the ones listed in the error scenario above: java.lang.ClassNotFoundException, java.lang.NoClassDefFoundError, and the MRAppMaster startup failure where the container exits with a non-zero exit code.

Resolution:

In these scenarios, we can bundle the required class files into our jar file from the command line.

As shown above, the required classes org.json.JSONException and org.json.JSONObject are not available at run time. So we download the jar file that provides them and extract it. In our case the missing classes are the org.json.* files, which are located in the json jar and can be downloaded from the Maven repository.
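The download and extraction can be done as follows (the URL follows the standard Maven Central layout for the version used in the post):

```shell
# Fetch the jar that provides org.json.* from Maven Central:
curl -sSLO https://repo1.maven.org/maven2/org/json/json/20140107/json-20140107.jar
# Extracting it leaves an org/ directory in the working directory:
jar xf json-20140107.jar
```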

In the steps below:

  • We downloaded the json-20140107.jar file, copied it to the working directory, and extracted it, so the org/ folder from the extracted jar is now in the working directory.
  • We compiled our program PartitionsByMultipleOutputs.java with org/ present in the same directory; otherwise we would receive the compile error "package org.json does not exist". The class files produced by the compilation are placed in one directory (classes).
  • The org/ folder structure is then copied into the class files directory (classes).
  • We created a single jar from all the class files and folders (org/) present in the classes directory.
  • If we now use this jar to run PartitionsByMultipleOutputs, we no longer receive the class-definition-not-found error messages.
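The steps above can be sketched as a short shell session (file and class names follow the post; the input/output paths and the classes/ directory name are assumptions, and compiling a real MapReduce program would also need the Hadoop client jars on the javac classpath, omitted here for brevity):

```shell
# 1. Extract the downloaded jar so org/ sits in the working directory.
jar xf json-20140107.jar

# 2. Compile, writing the .class files into classes/.
mkdir -p classes
javac -classpath . -d classes PartitionsByMultipleOutputs.java

# 3. Copy the org/ tree into classes/ so the json classes travel with ours.
cp -r org classes/

# 4. Build one self-contained jar from everything under classes/.
jar cf test.jar -C classes .

# 5. Submit as before; the JVM now finds org.json.* inside test.jar.
hadoop jar test.jar PartitionsByMultipleOutputs /user/input /user/output
```

Copying the org/ tree into the classes directory before jarring is what makes the difference: the resulting jar carries the third-party classes alongside our own, so no external classpath entry is needed at submission time.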

Now we can use this jar file to run our programs.


About Siva

Senior Hadoop developer with 4 years of experience in designing and architecting solutions for the Big Data domain, and has been involved with several complex engagements. Technical strengths include Hadoop, YARN, Mapreduce, Hive, Sqoop, Flume, Pig, HBase, Phoenix, Oozie, Falcon, Kafka, Storm, Spark, MySQL and Java.


