Big Data Engineer

This job is no longer active.

POST DATE 9/2/2016
END DATE 10/18/2016

Modis Lone Tree, CO

Company
Modis
Job Classification
Full Time
Company Ref #
23173807.21601016
AJE Ref #
576068918
Location
Lone Tree, CO
Job Type
Regular

JOB DESCRIPTION

We are looking for a Big Data Engineer who has a passion for big data technologies and comes from a data warehousing background: someone with experience designing and coding batch as well as real-time ETL, who wants to be part of a team actively designing and implementing a big data lake and analytical architecture on Hadoop.

Job Responsibilities:
• Designing schemas, data models and data architecture for Hadoop and HBase environments
• Building and maintaining code for real-time data ingestion using Java, MapR Streams (Kafka) and Storm
• Implementing data flow scripts using Unix, Sqoop, HiveQL and Pig scripting
• Designing, building and supporting data processing pipelines that transform data using Hadoop technologies
• Designing and building data assets in MapR-DB (HBase) and Hive
• Developing and executing quality assurance and test scripts
• Working with business analysts to understand business requirements and use cases

Job Requirements:
• Hands-on experience in Java object-oriented programming (minimum 3 years) is required
• Hands-on experience with Hadoop, MapReduce, Hive, Pig, Sqoop, Flume, Storm, Spark, Kafka and HBase (at least 2 years) is required
• Understanding of Hadoop file formats and compression is required
• Familiarity with the MapR distribution of Hadoop is preferred
• Understanding of best practices for building a data lake and analytical architecture on Hadoop is required
• Strong scripting/programming skills in Unix, Java, Python, Scala, etc. are required
• Strong SQL experience, with the ability to develop, tune and debug complex SQL applications, is required
• Expertise in schema design and data modeling, with a proven ability to work with complex data, is required
• Experience in real-time data ingestion into Hadoop is required
• Proven experience working in large environments such as RDBMS, EDW, NoSQL, etc. is preferred
• Knowledge of big data ETL tools such as Informatica BDM and Talend is preferred
• Understanding of security, encryption and masking using Kerberos, MapR tickets, Vormetric and Voltage is preferred
• Experience with test-driven development and SCM tools such as Git and Jenkins is preferred
• Experience with graph databases is preferred