Work Location - Pune
Experience - 3 to 7 years
- Expert-level knowledge of SQL and scripting, preferably UNIX shell scripting and Perl
- Working knowledge of data integration solutions
- Strong problem solving and logical reasoning ability.
- Inclination to work with open-source technologies
- The candidate should be well-versed in at least one ETL tool (Informatica / DataStage / Ab Initio) and relational databases, and have prior experience with Data Warehousing and Business Intelligence.
- Work with a team of Big Data, ETL, and database developers to deliver projects of varying complexity, enhancements, and bug fixes.
- Database development experience with a solid understanding of core database concepts, relational database design, ODS, and DWH.
- Should be able to provide development and maintenance support for BI/DWH programs in all environments.
- Must be highly motivated and a self-starter with the ability to learn quickly.
- Excellent understanding of all aspects of the Software Development Lifecycle.
- Excellent written and verbal communication skills.
- Must be a team player.
- Selected candidates will be gradually trained on Big Data technologies such as Hadoop, HBase, Cassandra, Pig, Hive, Sqoop, Flume, Mahout, Storm, Kafka, and Spark.
- Would get the chance to be part of enterprise-grade implementations of Big Data systems.
- Will play an active role in setting up a modern data platform based on Big Data technologies.
- Advanced analytics based on R, Mahout, and Spark MLlib.
- Would be part of teams with rich experience in various aspects of distributed systems.
Salary: Not Disclosed by Recruiter
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming
Role: Software Developer