Java Hadoop Developer 8/31/2016

Comtec Dallas, TX

Company
Comtec
Job Classification
Full Time
Company Ref #
29459855
AJE Ref #
576046247
Location
Dallas, TX
Job Type
Regular

JOB DESCRIPTION

Greetings from ComTec Information Systems.

We are looking for a Java/Hadoop Developer for one of our clients located in Dallas, TX. Please go through the job description below and let me know if you would be interested.

JOB TITLE: JAVA/HADOOP DEVELOPER

DURATION: PERMANENT

LOCATION: DALLAS, TX

Interview Process - 3 steps

* Coding exercise (emailed from Hacker Rank - 1 hour)

* Phone or onsite

* Onsite for 3 hours (1.5 technical, 1 management, 30 min leadership)

Your Role:

* Design and develop code that consistently adheres to functional programming principles.

* Design, develop, and maintain HIGH VOLUME JAVA AND SCALA BASED DATA PROCESSING batch jobs using industry standard tools and frameworks in the Hadoop ecosystem, such as Spark, Kafka, Scalding, Cascading, Hive, Impala, Avro, Flume, Oozie, and Sqoop.

* Design and maintain schemas in the Vertica analytics database and write efficient SQL for loading and querying analytics data.

* Integrate data processing jobs and services with other applications using technologies such as RabbitMQ, Spring, MongoDB, ElasticSearch, Coherence, MySQL, etc.

* Write appropriate unit, integration and load tests using industry standard frameworks such as Specs2, ScalaTest, ScalaCheck, JMeter, JUnit, Cucumber, and Grinder.

* Live by Agile (particularly Scrum) principles and collaborate with team members using Agile techniques.

What Client is Looking For:

* Degree in Computer Science or Engineering, or an advanced degree in Math or Physics with significant exposure to high volume data processing, data mining, and distributed systems, is highly desirable.

* 3+ years of large scale server-side application development experience with 2 years designing and implementing HIGH VOLUME DATA PROCESSING JOBS.

* STRONG JAVA SKILLS AND WORKING KNOWLEDGE OF MAP-REDUCE IN HADOOP ARE REQUIRED.

* Functional programming experience and good working knowledge of Cascading, Scalding, or a similar Map-Reduce application framework is desired.

* Experience with Scala, Spark, and Kafka or interest in learning these technologies is desired.

* Excellent database development skills, including advanced SQL, a solid understanding of database technologies (both relational and NoSQL), and logical and physical data modeling.

* Strong analytical and problem-solving skills and an understanding of common mathematical principles used in statistical analysis.