Hadoop Engineer

This job is no longer active.

Post Date: 8/18/2016
End Date: 12/19/2016
Company: Verizon Wireless Inc.
Location: Basking Ridge, NJ
Job Classification: Full Time



* Big Data Dev Ops
* Bachelor's Degree required, Master's Degree or higher preferred


--Streamline and enhance the day-to-day operational workflow of our enterprise-level Hadoop environment.
--Continuously monitor, measure, and debug the performance of a system streaming gigabytes of data per day.
--Work closely with the Big Data Architect and Business Owners to ensure system performance is consistent with the intended design and business cases, while looking for ways to simplify processes; improve data ingestion, analysis, and delivery; and optimize resource use.
--Identify improvements, risks, challenges, and strategies for the platform as it develops and grows.
--Create and present reports, presentations, and visualizations to technical leads and executives demonstrating platform functionality and explaining operational behavior.

--Thorough knowledge of the Hadoop ecosystem and distributed computing, including but not limited to Hadoop, Hive, HBase, MapReduce, Zookeeper, YARN, Flume, Tez, Spark, Storm, Kafka, Ambari, Mahout, Flink, Talend, Sqoop, Oozie, and Zeppelin
--Expert at writing and debugging in multiple scripting languages (R, Python, Java, Pig, Oozie) for low-level processing, task scheduling, analytics, and similar work
--Deep understanding of multiple Linux distributions (RHEL required) running in the cloud, in containers, or on bare metal
--Expert in monitoring and debugging tools and practices, capable of surfacing performance metrics and other KPIs to leadership for operational summaries and checkpoints
--Knowledge of modern security best practices and techniques for encrypting data in transit and at rest, protecting data privacy without sacrificing performance or data analysis capabilities
--Knowledge of and experience with application servers and web servers such as Nginx, Redis, IBM WebSphere, Tomcat, and WebLogic
--Experience with ETL applications and techniques using Flume, Sqoop, Talend, Sybase, etc.
--Experience with virtualization and cloud technologies
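To make the scripting and monitoring expectations above concrete, here is a minimal sketch of the kind of Python task involved: aggregating hypothetical ingestion log lines into per-day throughput so the figure can be surfaced as a KPI. The log format (`"date bytes"`) and all names are illustrative assumptions, not part of the role description.

```python
from collections import defaultdict

def daily_throughput_gb(log_lines):
    """Sum hypothetical ingestion log lines of the form 'YYYY-MM-DD bytes'
    into gigabytes ingested per day."""
    totals = defaultdict(int)
    for line in log_lines:
        date, nbytes = line.split()
        totals[date] += int(nbytes)
    # Convert byte totals to GB for reporting
    return {day: total / 1e9 for day, total in totals.items()}

# Example usage with made-up log data
lines = [
    "2016-08-18 500000000",
    "2016-08-18 750000000",
    "2016-08-19 2000000000",
]
print(daily_throughput_gb(lines))
```

In practice a script like this would read from a log file or a monitoring pipeline rather than an in-memory list, but the aggregation pattern is the same.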

--Excellent interpersonal, oral, and written communication skills
--Highly motivated and success-driven with a strong sense of ownership
--Comfortable working in a fast-paced, Agile, competitive environment
--Ability to work independently and in group environments
--Ability to solve problems effectively and efficiently

--Bachelor's or Master's degree in Computer Science, Computer Engineering, or related field
--6+ years of experience performing DevOps, primarily on Hadoop ecosystems using the stack elements above, having owned/maintained two or more distinct production systems in that period
--6+ years of scripting experience with Python, R, Scala, Pig, Oozie, Java, or similar
--3+ years of recent experience designing or maintaining secured environments using Kerberos, PKI, ACLs, etc.
--2+ years of ETL experience with tools like Flume, Sqoop, Talend, or similar