Must Have Skills (Top 3 technical skills only)*:
1. Hadoop ecosystem (Hive, Spark, Kafka, HBase, Oozie, Sqoop, etc.)
2. Big Data tools such as Sqoop, Hive, Spark, Scala, HBase, and MapReduce
3. Java, Python, Scala

Detailed Job Description:
- 5+ years of experience in the Hadoop ecosystem.
- 3 to 5 years of hands-on experience architecting, designing, and implementing data ingestion pipelines for batch, real-time, and streaming workloads (see the sketch after this list).
- 3 to 5 years of hands-on experience, with a proven track record, building data lakes on the Azure HDInsight platform.
- 3 to 5 years of hands-on experience with Big Data tools such as Sqoop, Hive, Spark, Scala, HBase, and MapReduce.
- 1 to 3 years of experience in Java, Python, or Scala.
- Hands-on experience using Infoworks or NiFi will be an added advantage.
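As a rough illustration of the batch side of such an ingestion pipeline, the sketch below reads landed CSV files and appends them as partitioned Parquet to a data-lake location. This is a minimal sketch only, assuming Spark on HDInsight with ADLS Gen2 storage; the job name, container names, paths, and the ingest_date column are hypothetical placeholders, not details taken from this requisition.

```scala
// Minimal batch ingestion sketch: land raw CSV files as partitioned Parquet
// in an Azure data lake. All paths and names below are hypothetical placeholders.
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.functions.current_date

object BatchIngestJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("batch-ingest-orders") // hypothetical job name
      .getOrCreate()

    // Source and target locations on ADLS Gen2 (placeholders).
    val source = "abfss://landing@examplelake.dfs.core.windows.net/orders/"
    val target = "abfss://curated@examplelake.dfs.core.windows.net/orders/"

    spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv(source)
      .withColumn("ingest_date", current_date()) // simple audit/partition column
      .write
      .mode(SaveMode.Append)
      .partitionBy("ingest_date")
      .parquet(target)

    spark.stop()
  }
}
```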
Minimum years of experience*: 5+
Certifications Needed: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute*:
1. 5+ years of experience in the Hadoop ecosystem.
2. 3 to 5 years of hands-on experience architecting, designing, and implementing data ingestion pipelines for batch, real-time, and streaming workloads (see the streaming sketch below).
3. 3 to 5 years of hands-on experience, with a proven track record, building data lakes on the Azure HDInsight platform.

Interview Process (Is face to face required?): No
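For the real-time/streaming part of the same responsibility, a minimal sketch could consume a Kafka topic with Spark Structured Streaming and append raw events to the data lake. It assumes the spark-sql-kafka connector is on the classpath; the broker address, topic name, sink path, and checkpoint location are hypothetical placeholders.

```scala
// Minimal streaming ingestion sketch: read a Kafka topic and append the raw
// events to a data-lake path. All connection details are placeholders.
import org.apache.spark.sql.SparkSession

object StreamIngestJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("stream-ingest-events") // hypothetical job name
      .getOrCreate()

    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092") // placeholder broker
      .option("subscribe", "events")                      // placeholder topic
      .option("startingOffsets", "latest")
      .load()
      .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")

    val query = events.writeStream
      .format("parquet")
      .option("path", "abfss://raw@examplelake.dfs.core.windows.net/events/")
      .option("checkpointLocation",
        "abfss://raw@examplelake.dfs.core.windows.net/_checkpoints/events/")
      .start()

    query.awaitTermination()
  }
}
```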
Does this position require Visa independent candidates only? No