
Job Information

Cummins Inc. Data Engineer in Pune, India

DESCRIPTION

Job Summary:

Cummins is seeking a skilled Data Engineer to support the development, maintenance, and optimization of our enterprise data and analytics platform. This role requires hands-on experience in software development, ETL processes, and data warehousing, with strong exposure to tools such as Snowflake, OBIEE, and Power BI. The engineer will collaborate with cross-functional teams to transform data into actionable insights that enable business agility and scale.

Please Note: While the role is categorized as remote, it will follow a hybrid work model based out of our Pune office.

Key Responsibilities:

  • Design, develop, and maintain ETL pipelines using Snowflake and related data transformation tools.

  • Build and automate data integration workflows that extract, transform, and load data from various sources including Oracle EBS and other enterprise systems.

  • Analyze, monitor, and troubleshoot data quality and integrity issues using standardized tools and methods.

  • Develop and maintain dashboards and reports using OBIEE, Power BI, and other visualization tools for business stakeholders.

  • Work with IT and Business teams to gather reporting requirements and translate them into scalable technical solutions.

  • Participate in data modeling and storage architecture using star and snowflake schema designs.

  • Contribute to the implementation of data governance, metadata management, and access control mechanisms.

  • Maintain documentation for solutions and participate in testing and validation activities.

  • Support migration and replication of data using tools such as Qlik Replicate and contribute to cloud-based data architecture.

  • Apply agile and DevOps methodologies to continuously improve data delivery and quality assurance processes.

RESPONSIBILITIES

Competencies:

  • Data Extraction & Transformation – Ability to perform ETL activities from varied sources with high data accuracy.

  • Programming – Capable of writing and testing efficient code using industry standards and version control systems.

  • Data Quality Management – Detect and correct data issues for better decision-making.

  • Solution Documentation – Clearly document processes, models, and code for reuse and collaboration.

  • Solution Validation – Test and validate changes or solutions based on customer requirements.

  • Problem Solving – Address technical challenges systematically to ensure effective resolution and prevention.

  • Customer Focus – Understand business requirements and deliver user-centric data solutions.

  • Communication & Collaboration – Work effectively across teams to meet shared goals.

  • Values Differences – Promote inclusion by valuing diverse perspectives and backgrounds.

Education, Licenses, Certifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related technical discipline.

  • Certifications in data engineering or relevant tools (Snowflake, Power BI, etc.) are a plus.

Experience

Must-have skills:

  • 5–7 years of experience in data engineering or software development, preferably within a finance or enterprise IT environment.

  • Proficient in ETL tools, SQL, and data warehouse development.

  • Proficient in Snowflake, Power BI, and OBIEE, with hands-on implementation experience using these tools and technologies.

  • Strong understanding of data warehousing principles, including schema design (star/snowflake), ER modeling, and relational databases.

  • Working knowledge of Oracle databases and Oracle EBS structures.

Preferred Skills:

  • Experience with Qlik Replicate, data replication, or data migration tools.

  • Familiarity with data governance, data quality frameworks, and metadata management.

  • Exposure to cloud-based architectures, Big Data platforms (e.g., Spark, Hive, Kafka), and distributed storage systems (e.g., HBase, MongoDB).

  • Understanding of agile methodologies (Scrum, Kanban) and DevOps practices for continuous delivery and improvement.

QUALIFICATIONS

Why Join Cummins?

  • Opportunity to work with a global leader in power solutions and digital transformation.

  • Be part of a collaborative and inclusive team culture.

  • Access to cutting-edge data platforms and tools.

  • Exposure to enterprise-scale data challenges and finance domain expertise.

  • Drive impact through data innovation and process improvement.

Job: Systems/Information Technology

Organization: Cummins Inc.

Role Category: Remote

Job Type: Exempt - Experienced

ReqID: 2414992

Relocation Package: No
