Suppress Warning Messages in Hadoop


In Hadoop, by default, the logging level is determined by the HADOOP_ROOT_LOGGER configuration variable in the hadoop-env.sh file, or by the hadoop.root.logger property in the log4j.properties file.

The default logging configuration is INFO,console, i.e. the logging level is INFO and the logging destination is the console.
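For reference, this default corresponds to a line along the following lines in hadoop-env.sh (a typical stock setting):

```shell
# Default Hadoop logging configuration:
# INFO logging level, console as the destination (appender)
export HADOOP_ROOT_LOGGER="INFO,console"
```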

But sometimes this setting results in a lot of warning messages on the console, which makes it hard to find the messages we actually need. Below are sample warning messages which are not necessary on the console.

[Screenshot: sample Hadoop warning messages on the console]

So, in order to suppress warning messages on the console, we can override the default logging configuration by setting the logging level to WARN and the destination to DRFA.

In hadoop-env.sh:

Add the below two lines at the bottom of hadoop-env.sh file.
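The screenshot with the exact lines is not preserved here; based on the description above (WARN level, DRFA destination), the addition would look like this:

```shell
# Suppress INFO/WARN noise on the console: log at WARN level to the
# Daily Rolling File Appender (DRFA) instead of the console
export HADOOP_ROOT_LOGGER="WARN,DRFA"
```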

Now run the same ‘$ hadoop fs -ls /’ command and verify whether the warnings are suppressed or not.

In the below screen, the warning messages are suppressed for the second command.

[Screenshot: console output with warnings suppressed for the second command]

In log4j.properties file:

The same effect can be achieved by editing the hadoop.root.logger property to the below value:
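That is, in the log4j.properties file:

```properties
# Route root-logger output at WARN level to the DRFA appender
hadoop.root.logger=WARN,DRFA
```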

Note: Here DRFA denotes the Daily Rolling File Appender, a log-file appender to which these warning messages are appended. In the above case it is the hadoop.log file in Hadoop’s log directory.
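For context, the DRFA appender itself is defined in Hadoop’s stock log4j.properties along these lines (the exact layout and pattern may vary by Hadoop version):

```properties
log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}
# Roll the log file over once per day
log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
```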

So now we can see these warning messages appended to the hadoop.log file.

[Screenshot: warning messages appended to hadoop.log]



About Siva

Senior Hadoop developer with 4 years of experience in designing and architecting solutions for the Big Data domain, and has been involved in several complex engagements. Technical strengths include Hadoop, YARN, MapReduce, Hive, Sqoop, Flume, Pig, HBase, Phoenix, Oozie, Falcon, Kafka, Storm, Spark, MySQL and Java.
