In Hadoop, the logging level is determined by the HADOOP_ROOT_LOGGER environment variable in the hadoop-env.sh file, or by the hadoop.root.logger property in the log4j.properties file.
The default logging configuration is:
export HADOOP_ROOT_LOGGER="INFO,console"
hadoop.root.logger=INFO,console
That is, the logging level is INFO and the logging destination is the console.
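Hadoop's launcher scripts resolve this setting with an ordinary shell default-expansion, so an exported HADOOP_ROOT_LOGGER takes precedence over the built-in fallback. A minimal sketch of that precedence (simplified; the real bin/hadoop script passes the resolved value on to the JVM as -Dhadoop.root.logger):

```shell
# Sketch of how the launcher script resolves the logger setting:
# the exported variable wins; otherwise the built-in default applies.
effective_logger() {
  echo "${HADOOP_ROOT_LOGGER:-INFO,console}"
}

unset HADOOP_ROOT_LOGGER
echo "default : $(effective_logger)"    # prints INFO,console

export HADOOP_ROOT_LOGGER="WARN,DRFA"
echo "override: $(effective_logger)"    # prints WARN,DRFA
```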
Sometimes, however, this setting produces a lot of warning messages on the console, which makes it hard to find the output you actually need.
To suppress these warning messages on the console, we can override the default logging configuration by setting the logging level to WARN and the destination to DRFA.
In hadoop-env.sh:
Add the following two lines at the bottom of the hadoop-env.sh file.
export HADOOP_HOME_WARN_SUPPRESS=1
export HADOOP_ROOT_LOGGER="WARN,DRFA"
Now run the same ‘$ hadoop fs -ls /’ command and verify that the warnings are suppressed: the warning messages no longer appear on the console for the second run of the command.
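If you only want quieter output for a single command, there is no need to edit any files: prefixing the command with the variable overrides the logger for that one invocation. This is standard shell behaviour rather than anything Hadoop-specific; the hadoop line below assumes the hadoop command is on your PATH.

```shell
# One-off override: applies only to this invocation of hadoop
# (assumes hadoop is on PATH):
#   HADOOP_ROOT_LOGGER="ERROR,console" hadoop fs -ls /

# The prefix form sets the variable for the child process only;
# the parent shell's environment is left untouched:
HADOOP_ROOT_LOGGER="ERROR,console" sh -c 'echo "child sees: $HADOOP_ROOT_LOGGER"'
echo "parent sees: ${HADOOP_ROOT_LOGGER:-<unset>}"
```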
In log4j.properties file:
The same effect can be achieved by editing the hadoop.root.logger property as follows:
# Define some default values that can be overridden by system properties
hadoop.root.logger=WARN,DRFA
hadoop.log.dir=.
hadoop.log.file=hadoop.log
Note: Here DRFA denotes the Daily Rolling File Appender, a log4j appender that writes these warning messages to a log file. In the above case that file is hadoop.log in Hadoop's log directory.
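For DRFA to work as a destination, log4j.properties must also define the appender itself. Hadoop's stock log4j.properties ships a DRFA definition along these lines (shown here for reference; the exact layout pattern may differ slightly between versions):

```properties
# Daily Rolling File Appender: writes to ${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}

# Roll the log file over at midnight each day
log4j.appender.DRFA.DatePattern=.yyyy-MM-dd

# Entry format: timestamp, level, logger name, message
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
```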
So now we can see these warning messages appended to the hadoop.log file instead of cluttering the console.