Hadoop Lead / Architect
POST DATE 8/16/2016
END DATE 12/19/2016
JOB DESCRIPTION
POSITION: HADOOP LEAD/ARCHITECT
LOCATION: ENGLEWOOD, CO
* Responsible for defining the overall technical architecture for the Big Data platform and setting up the required infrastructure to support the functional use cases
* Provide technical leadership for design and development of large scale cluster data processing systems
* Establish the reference architecture, processes, standards, and frameworks
* Identify solution architecture design alternatives based on the business's objectives, taking into consideration the technology value system (performance, scalability, reliability, operability, flexibility, affordability, and auditability) and the high-level business and technical objectives for the solution
* Propose solutions that deliver the most value to the business given its constraints, balanced against the value system above.
* Ensure the solution integrates with the client's existing BI systems.
* Assess the readiness of the technical environment (including the systems, tools, technologies, processes, and resource requirements) to implement the architecture.
* Propose, recommend or facilitate the selection of appropriate tools, techniques, and resources for all technical components of the solution.
* Review all deliverables for technical completeness and to ensure the performance, flexibility, operability, maintainability and scalability of the proposed technical solution
* Ensure compatibility with overall IT infrastructure and compliance with all IT policies.
* Provide expertise and direction around operations, data governance and parallel run data validation activities
* Ensure proper selection of appropriate project software, tools and development techniques for the different components of the Big Data Platform
* Be able to work in a fast-paced agile development environment.
* Recent hands-on experience designing and developing Java-based, secure, high-availability, enterprise-level platforms/products
* Direct experience with architectural and design patterns such as n-tier and lambda
* Hands-on experience with Hadoop ecosystem technologies such as HBase, MapReduce, Spark, Pig, Hive, Flume, Sqoop, Cloudera Impala, ZooKeeper, Oozie, Hue, and Kafka
* Experience in capacity planning, cluster design and deployment, and cluster troubleshooting and tuning
* Exposure to high-availability configurations, Hadoop cluster connectivity and tuning, and Hadoop security configurations
* Strong experience with Cloudera/Hortonworks/MapR distributions, along with monitoring/alerting tools (Nagios, Ambari, Cloudera Manager)
* Experience in Hadoop cluster migrations and upgrades
* Strong knowledge in RESTful Web Services
* Strong Python and shell scripting skills
* Deep knowledge of and experience with the J2EE platform, Hibernate, and Spring
* Strong knowledge of SQL and databases such as MS SQL Server, Oracle, MySQL, or PostgreSQL
* Knowledge of hybrid cloud solutions (Azure, Google Compute Engine, and Amazon Cloud), with hands-on experience in direct API integration, is an advantage
* Must have a track record of delivering products and experience with the complete software lifecycle, including software requirements and design specification, definition, and creation/verification of engineering test procedures.