Big Data Engineer
Unlock your potential with Dassault Systèmes, a global leader in Scientific Software Engineering, as a Big Data Engineer in Pune, Maharashtra!
Role Description & Responsibilities:
- Data Pipeline Development: Design, develop, and maintain robust ETL pipelines for batch and real-time data ingestion, processing, and transformation using Spark, Kafka, and Python.
- Data Architecture: Build and optimize scalable data architectures, including data lakes, data marts, and data warehouses, to support business intelligence, reporting, and machine learning.
- Data Governance: Ensure data reliability, integrity, and governance by enabling accurate, consistent, and trustworthy data for decision-making.
- Collaboration: Work closely with data analysts, data scientists, and business stakeholders to gather requirements, identify inefficiencies, and deliver scalable and impactful data solutions.
- Optimization: Develop efficient workflows to handle large-scale datasets, improving performance and minimizing downtime.
- Documentation: Create detailed documentation for data processes, pipelines, and architecture to support seamless collaboration and knowledge sharing.
- Innovation: Contribute to a thriving data engineering culture by introducing new tools, frameworks, and best practices to improve data processes across the organization.
Qualifications:
- Educational Background: Bachelor's degree in Computer Science, Engineering, or a related field.
- Professional Experience: 2–3 years of experience in data engineering, with expertise in designing and managing complex ETL pipelines.
- Technical Skills:
· Proficiency in Python, PySpark, and Spark SQL for distributed and real-time data processing.
· Deep understanding of real-time streaming systems using Kafka.
· Experience with data lake and data warehousing technologies (Hadoop, HDFS, Hive, Iceberg, Apache Spark).
· Strong knowledge of relational and non-relational databases (SQL, NoSQL).
· Experience building and managing data pipelines in both cloud and on-premises environments.
· Experience with ETL tools like SAP BODS or similar platforms.
· Knowledge of reporting tools like SAP BO for designing dashboards and reports.
· Hands-on experience building end-to-end data frameworks and working with data lakes.
- Analytical and Problem-Solving Skills: Ability to translate complex business requirements into scalable and efficient technical solutions.
- Collaboration and Communication: Strong communication skills and the ability to work with cross-functional teams, including analysts, scientists, and stakeholders.
- Location: Willingness to work from Pune (on-site).
What is in it for you?
· Work for one of the biggest software companies
· Work in a culture of collaboration and innovation
· Opportunities for personal development and career progression
· Chance to collaborate with internal users of Dassault Systèmes as well as stakeholders across internal and partner projects
Diversity Statement
Dassault Systèmes is an accelerator of human progress. We provide businesses and people with collaborative virtual environments in which to imagine sustainable innovations. Through the virtual twin experiences of the real world that they create with the 3DEXPERIENCE platform and its applications, Dassault Systèmes is a value creator, serving more than 350,000 customers of all sizes and in all industries, in more than 150 countries. Join our global community of more than 23,800 passionate people!