Talend ETL Developer 8/26/2016
JOB DESCRIPTION: We are looking for a TALEND ETL DEVELOPER for our client in HARRISBURG, PA
JOB TITLE: TALEND ETL DEVELOPER
JOB LOCATION: HARRISBURG, PA
JOB TYPE: CONTRACT - 12 MONTHS / CONTRACT TO HIRE / DIRECT HIRE
Looking for an ETL Talend developer with Java and good interpersonal skills to work with our dynamic team in developing an integration product for loading Big Data databases. This individual will be responsible for creating Talend ETL processes for loading datasets into the Enterprise Data Hub from internal as well as external data sources.
* Hadoop (Cloudera), HDFS, Hive, IMPALA, Pig, Tableau, Oozie, TALEND, SSIS
* MySQL, MS SQL Server, Oracle
* 7+ Years of IT Experience
* 3+ years of experience with Hive/HQL and Pig.
* 3+ years of Big Data/Hadoop ETL experience using Hive/Pig/Oozie
* 3+ years of traditional ETL experience on RDBMS
* Expertise with Talend in Hadoop Environment is a MUST.
* Strong Linux shell scripting and Linux knowledge. Strong expertise in working with and understanding Big Data technologies, with a strong focus on Hortonworks.
* Strong knowledge of Flume, Sqoop, Hive, Pig, MapReduce, Oozie, and YARN applications with hands-on experience - all or most of these.
* Hands-on Talend development and integration experience using Talend Enterprise.
* Must have excellent and in-depth knowledge of SQL, PL/SQL, and stored procedures
* Must have experience in Software Design Patterns, Best Practices
* Ability to create normalized/de-normalized database schemas
* Ability to perform ETL operations/reporting from multiple internal and external sources using appropriate tools.
* Demonstrated ability to implement and troubleshoot backup solutions, database security, user management, and data maintenance tasks.
* Experience with Data Integration/Business Intelligence tools such as Pentaho, Talend, Informatica, etc.
* Experience analyzing text and streams with emerging Hadoop-based big data and NoSQL technologies.
* Hands-on experience running Pig and Hive queries.
* Analyzing data with Hive, Pig, and HBase; data scrubbing and processing with Oozie.
* Importing and exporting data using Sqoop between HDFS and relational database systems/mainframe.
* Loading data into HDFS and developing MapReduce programs to format the data.
SOFT SKILLS/OTHER SKILLS:
* Communication skills, team player, positive attitude
* Self-starter and self-organizer
* Ability to work in a high-pressure, fast-paced environment
* Code reviews / ensuring best practices are followed