Big Data Engineer
Unlock your potential with Dassault Systèmes, a global leader in Scientific Software Engineering, as a Big Data Engineer in Pune, Maharashtra!
Role Description & Responsibilities:
- Data Pipeline Development: Design, develop, and maintain robust ETL pipelines for batch and real-time data ingestion, processing, and transformation using Spark, Kafka, and Python.
- Data Architecture: Build and optimize scalable data architectures, including data lakes, data marts, and data warehouses, to support business intelligence, reporting, and machine learning.
- Data Governance: Ensure data reliability, integrity, and governance by enabling accurate, consistent, and trustworthy data for decision-making.
- Collaboration: Work closely with data analysts, data scientists, and business stakeholders to gather requirements, identify inefficiencies, and deliver scalable and impactful data solutions.
- Optimization: Develop efficient workflows to handle large-scale datasets, improving performance and minimizing downtime.
- Documentation: Create detailed documentation for data processes, pipelines, and architecture to support seamless collaboration and knowledge sharing.
- Innovation: Contribute to a thriving data engineering culture by introducing new tools, frameworks, and best practices to improve data processes across the organization.
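The extract–transform–load pattern behind the pipeline work described above can be sketched in plain Python (a minimal, standard-library-only illustration with hypothetical data; a production pipeline at this scale would use Spark and Kafka as noted):

```python
import csv
import io
import sqlite3

# Hypothetical raw input; a real pipeline would ingest from a data lake or Kafka.
RAW_CSV = """order_id,region,amount
1,EMEA,120.50
2,APAC,89.99
3,EMEA,15.00
4,AMER,200.00
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> dict[str, float]:
    """Transform: aggregate order amounts per region."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    return totals

def load(totals: dict[str, float]) -> sqlite3.Connection:
    """Load: write the aggregates into a warehouse-style table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE region_totals (region TEXT PRIMARY KEY, total REAL)")
    conn.executemany("INSERT INTO region_totals VALUES (?, ?)", totals.items())
    conn.commit()
    return conn

conn = load(transform(extract(RAW_CSV)))
print(dict(conn.execute("SELECT region, total FROM region_totals ORDER BY region")))
```

The same three stages map directly onto a PySpark job, where `extract` becomes a DataFrame read, `transform` a `groupBy`/`agg`, and `load` a write to the warehouse.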
Qualifications:
- Educational Background: Bachelor's degree in Computer Science, Engineering, or a related field.
- Professional Experience: 2–3 years of experience in data engineering, with expertise in designing and managing complex ETL pipelines.
- Technical Skills:
· Proficiency in Python, PySpark, and Spark SQL for distributed and real-time data processing.
· Deep understanding of real-time streaming systems using Kafka.
· Experience with data lake and data warehousing technologies (Hadoop, HDFS, Hive, Iceberg, Apache Spark).
· Strong knowledge of relational and non-relational databases (SQL, NoSQL).
· Experience building and managing data pipelines in both cloud and on-premises environments.
· Experience with ETL tools like SAP BODS or similar platforms.
· Knowledge of reporting tools like SAP BO for designing dashboards and reports.
· Hands-on experience building end-to-end data frameworks and working with data lakes.
- Analytical and Problem-Solving Skills: Ability to translate complex business requirements into scalable and efficient technical solutions.
- Collaboration and Communication: Strong communication skills and the ability to work with cross-functional teams, including analysts, scientists, and stakeholders.
- Location: Willingness to work from Pune (on-site).
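The real-time streaming skills listed above center on windowed aggregation over event streams. A minimal pure-Python sketch of a tumbling-window sum (Kafka and Spark omitted; the event data is hypothetical) shows the core idea:

```python
from collections import defaultdict

# Hypothetical event stream: (timestamp_seconds, sensor_id, value).
EVENTS = [
    (0, "s1", 10.0),
    (3, "s2", 4.0),
    (7, "s1", 6.0),
    (12, "s1", 2.0),
    (14, "s2", 8.0),
]

WINDOW = 10  # tumbling-window size in seconds

def tumbling_window_sums(events, window=WINDOW):
    """Assign each event to a fixed, non-overlapping time window
    and sum values per (window_start, sensor_id)."""
    sums = defaultdict(float)
    for ts, sensor, value in events:
        window_start = (ts // window) * window  # e.g. ts=14 -> window [10, 20)
        sums[(window_start, sensor)] += value
    return dict(sums)

print(tumbling_window_sums(EVENTS))
# window [0, 10): s1 -> 16.0, s2 -> 4.0; window [10, 20): s1 -> 2.0, s2 -> 8.0
```

In Spark Structured Streaming the same logic would be expressed with a Kafka source plus a `groupBy(window(...), ...)` aggregation, with the engine handling late data and state.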
What is in it for you?
· Work for one of the biggest software companies
· Work in a culture of collaboration and innovation
· Opportunities for personal development and career progression
· Chance to collaborate with internal users of Dassault Systèmes as well as stakeholders of internal and partner projects
Inclusion statement
Dassault Systèmes is a catalyst for human progress. We provide businesses and people with collaborative virtual environments to imagine sustainable innovations. By creating virtual twins of the real world with our 3DEXPERIENCE platform and applications, we create experiences and deliver value to more than 350,000 customers of all sizes, in all industries, in more than 150 countries. Join our global community of more than 23,800 passionate people!
Want to know more?
Visit the other sections of our website to learn more.
Students and graduates
Become part of the future of our workforce: discover our internship and job opportunities.
Your selection process
Find out what your selection process will look like.
Our culture and values
Discover our culture and values.