Big Data Modeler/Architect
POST DATE 9/11/2016
END DATE 10/11/2016
JOB DESCRIPTION
Big Data Modeler/Architect
Global Visa and Relocation Specifications: K-C will not provide support for relocation of the chosen candidate for this role; if relocation is necessary, it will be at the candidate's own expense. However, K-C will provide assistance and financial support to obtain immigration visas and work authorization for the chosen candidate, if needed.
Job Summary
The Big Data Modeler/Architect is responsible for defining and delivering on various Kimberly-Clark data platforms, from both a discipline and a technology standpoint, to support Kimberly-Clark's global BI environment. The Big Data Modeler/Architect sets project technical direction and provides oversight and checkpoints to enforce appropriate standards and quality of solutions. The Big Data Modeler/Architect is responsible for collaborating with Kimberly-Clark's internal business and technology partners, such as the SIA EDM&A Enterprise Architect and other stakeholders, to ensure the correct technology is in place to deliver timely and accurate information in support of enterprise decision-making.
The Big Data Modeler/Architect, in collaboration with the SIA EDM&A Enterprise Architect, researches new big data technologies, tools, and methodologies as they emerge. This person will be responsible for understanding traditional data modeling and its specific usage at Kimberly-Clark, as well as for designing and implementing strategies, architectures, and data ingestion and consumption processes for complex, large-volume, multi-variety, batch and real-time data sets used for modeling, data mining, dashboarding, and reporting purposes. The Big Data Modeler/Architect will define and refine disciplines, mentor team members on standards, and review all designs to ensure that project teams are meeting expectations for quality and conformity.
This role is also responsible for helping Kimberly-Clark develop a strong data platform team. As part of these responsibilities, this person will develop internal training curricula and provide coaching and mentoring to internal project team members, acting as the technical lead responsible for the team's ongoing development and its connection to Kimberly-Clark's Enterprise Architecture team, delivery partners, and other key stakeholders.
The Big Data Modeler/Architect is viewed as an expert in making sense of complex data environments, combining business data and process understanding with technical expertise. This person provides technical consulting on complex projects, acts as a source of direction, training, and guidance for other team members, and is knowledgeable in industry best practices in their area of expertise, applying available resources effectively.
* Provide leadership and guidance to project teams and other architects on all aspects of BI architecture (RDBMS (SQL, Netezza, etc.), Hadoop architecture (if applicable), SAP modeling (HANA or BW, if applicable), normalized and dimensional data modeling, ETL, reporting, etc.).
* Provide guidance and consultation on specific delivery methodologies for BI projects, such as technical requirements and model design review.
* Work with the Enterprise Data Modeler (Information Management team) to set strategy and oversee design for significant data modeling work, such as Enterprise Logical Models, Conformed Dimensions, Enterprise Hierarchy etc.
* Lead efforts to define/refine execution standards for all data warehouse layers (ETL, data modeling, MOLAP/ROLAP/OLAP, reporting, platform etc.).
* Participate in meetings to review the design of BI projects, including high-level design of the overall solution and detailed design of components as needed (data warehousing, ETL, user interface, analysis/reporting, etc.).
* Regularly interact with BI leadership on project work status, priority setting and resource allocations. Provide assistance to project teams as they go before change control boards to implement their projects into production.
* Research new tools and/or new architecture and review with project teams as applicable
* Work with support team to define methods for implementing solutions for performance measurement and monitoring
* Assist infrastructure leads and BI Delivery teams as needed with background and information on all technologies in use for projects such as new version upgrades, migration of hardware, production issues, etc.
* Provide leadership and guidance on setting up environments used by the BI team so that they are optimized for a leveraged, multi-tenant operation
* Design and implement data ingestion techniques for real-time and batch processes for structured and unstructured data sources into Hadoop ecosystems and HDFS clusters.
* Design strategies and programs to collect, store, analyze, and model data from internal and external sources. Maintain awareness and understanding of public data sets and the ability to ingest and integrate them.
* Develop and implement data design methods, data structures, and modeling standards. Implement industry-standard development policies, procedures, and standards.
* Leverage industry networking and contacts to benchmark with peer customers and share knowledge and best practices.
Required Qualifications
* Bachelor's degree (Master's preferred)
* 8+ years of overall experience in IT applications development or in consulting related to IT applications development
* 5+ years in technical development and leadership roles on BI and Data Warehouse projects with significant experience in the majority of these activities:
  o Designing star schema data models
  o Designing normalized data models for historical data warehouses
  o Designing and developing with ETL tools
  o Designing and developing with MOLAP/ROLAP/OLAP tools
  o Designing and developing with relational reporting tools
* Managing or coordinating the time and activities of other people and other groups
* ERwin or a similar data modeling tool
* Relational DBMS experience with appliance-like platforms, such as Netezza (preferred), Teradata, or others
* Experience with leading ETL tools, such as SAP Data Services (preferred), MS SSIS, Informatica, etc.
* Experience with leading OLAP/ROLAP tools (Business Objects, Microsoft, MicroStrategy, Hyperion Essbase, or Cognos)
* Experience with relational reporting tools (Microsoft, Business Objects, etc.)
Preferred Technical Skills
* Experience with major big data solutions such as Hadoop, MapReduce, Hive, Pig, and other tools in the Hadoop ecosystem
* Understanding of major programming/scripting languages such as Python and Scala
* Experience working with large data sets and distributed computing tools
* Knowledge of NoSQL platforms
* Basic knowledge of machine learning, statistics, optimization, or a related field is a plus
* Experience modeling data in SAP BW and SAP HANA
* Demonstrated experience and success managing projects
* Planning and administration of project tasks and dependencies
* Facilitating cross-functional requirement gathering meetings
* Facilitating team status meetings
* Developing project related documentation
* Developing relationships with internal Customers and Service Providers
* Understanding of Agile/Scrum methodologies
Kimberly-Clark and its well-known global brands are an indispensable part of life for people in more than 150 countries. Every day, 1.3 billion people - nearly a quarter of the world's population - trust K-C brands and the solutions they provide to enhance their health, hygiene, and well-being. With brands such as Kleenex, Scott, Huggies, Pull-Ups, Kotex, and Depend, Kimberly-Clark holds No. 1 or No. 2 share positions in more than 80 countries.