Develops, enhances, debugs, supports, maintains, and tests software applications that support business units or supporting functions. These application program solutions may involve diverse development platforms, software, hardware, technologies, and tools. Participates in the design, development, and implementation of complex applications, often using new technologies. May provide technical direction and system architecture for individual initiatives. Serves as a fully seasoned, proficient technical resource. Will not have direct reports but may lead projects and direct the activities of a team on special initiatives or operations. May have responsibility for a project and its budget. May collaborate with external programmers to coordinate delivery of software applications. Routine accountability is for technical knowledge and capabilities. Works under minimal supervision, with general guidance from more seasoned consultants. Typically requires 5-7 years of experience.
Mandatory Technical Skills:
- Solid understanding of OOP languages; must have working experience in C++, Core Java, and J2EE
- Good knowledge of Hadoop cluster architecture
- Hands-on experience with Hadoop and the Hadoop ecosystem required; proven experience with the Cloudera Hadoop ecosystem (MR1, MR2, HDFS, YARN, Hive, HBase, Sqoop, Pig, Hue, etc.)
- Design and implement Apache Spark-based real-time stream-processing data pipelines involving complex data processing
- Hands-on experience developing applications using Big Data technologies such as Kafka, Cassandra, Apache Storm, Apache Spark, and related areas
- Implement complex data-processing algorithms in real time in an optimized and efficient manner using Scala/Java
- Knowledge of at least one scripting language (e.g., Python, Unix shell scripting, or Perl) is essential for this position
- Excellent analytical and problem-solving skills; willingness to take ownership and resolve technical challenges
- Experience performing proofs of concept for new technologies
- Must have experience working with external vendors/partners such as Cloudera, Hortonworks, DataStax, etc.
- Strong communication and documentation skills, technology awareness, and the ability to interact with technology leaders are a must
- Good knowledge of Agile methodology and the Scrum process
Required Qualifications & Experience:
- 7+ years of industry experience
- Minimum of 4-5 years of Big Data experience
Good to Have:
- Experience in real-time streaming (Kafka)
- Experience with Big Data analytics and business intelligence using industry-standard tools integrated with the Hadoop ecosystem (R, Python)
- Knowledge of visual analytics tools (Tableau)
- Bachelor’s degree in Science or Engineering