3 thoughts on "Processing Logs in Hive"

  • vamsi

Could you please explain the purpose of the following part of the CREATE TABLE statement? (A sketch of the full statement follows the comments below.)
    WITH SERDEPROPERTIES (
      "input.regex" = "([^ ]*) ([^ ]*) ([^ ]*) (-|\\[[^\\]]*\\]) ([^ \"]*|\"[^\"]*\") (-|[0-9]*) (-|[0-9]*)",
      "output.format.string" = "%1$s %2$s %3$s %4$s %5$s %6$s %7$s"
    )

  • Gopal

    Awesome blog, and a very transparent showcase of a real-time example of log data processing in a Hadoop environment. Great work, and a mentor to many people.
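
Regarding the question above: a minimal sketch of how these two properties typically sit inside a full CREATE TABLE statement using the contrib RegexSerDe. The table name, column names, and LOCATION below are assumptions for illustration, not taken from the post. The input.regex splits each raw log line, with one capturing group per column, and output.format.string controls how a row is written back out as text.

    -- Minimal sketch; table name, columns, and LOCATION are assumed.
    -- The hive-contrib jar may need to be added (ADD JAR ...), depending on your Hive setup.
    CREATE EXTERNAL TABLE apache_access_log (
      host     STRING,
      identity STRING,
      username STRING,
      time     STRING,
      request  STRING,
      status   STRING,
      size     STRING
    )
    ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
    WITH SERDEPROPERTIES (
      -- Each capturing group in input.regex maps, in order, to one column above.
      "input.regex" = "([^ ]*) ([^ ]*) ([^ ]*) (-|\\[[^\\]]*\\]) ([^ \"]*|\"[^\"]*\") (-|[0-9]*) (-|[0-9]*)",
      -- output.format.string is used only when Hive serializes rows back to text
      -- (e.g. on INSERT into this table); %1$s..%7$s refer to the seven columns.
      "output.format.string" = "%1$s %2$s %3$s %4$s %5$s %6$s %7$s"
    )
    STORED AS TEXTFILE
    LOCATION '/user/hive/warehouse/apache_access_log';

Note that this SerDe requires all columns to be STRING; values can be cast to other types at query time.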

