Big Data Engineer
Unlock your potential with Dassault Systèmes, a global leader in scientific software engineering, as a Big Data Engineer in Pune, Maharashtra!
Role Description & Responsibilities:
- Data Pipeline Development: Design, develop, and maintain robust ETL pipelines for batch and real-time data ingestion, processing, and transformation using Spark, Kafka, and Python.
- Data Architecture: Build and optimize scalable data architectures, including data lakes, data marts, and data warehouses, to support business intelligence, reporting, and machine learning.
- Data Governance: Ensure data reliability, integrity, and governance by enabling accurate, consistent, and trustworthy data for decision-making.
- Collaboration: Work closely with data analysts, data scientists, and business stakeholders to gather requirements, identify inefficiencies, and deliver scalable and impactful data solutions.
- Optimization: Develop efficient workflows to handle large-scale datasets, improving performance and minimizing downtime.
- Documentation: Create detailed documentation for data processes, pipelines, and architecture to support seamless collaboration and knowledge sharing.
- Innovation: Contribute to a thriving data engineering culture by introducing new tools, frameworks, and best practices to improve data processes across the organization.
Qualifications:
- Educational Background: Bachelor's degree in Computer Science, Engineering, or a related field.
- Professional Experience: 2–3 years of experience in data engineering, with expertise in designing and managing complex ETL pipelines.
- Technical Skills:
· Proficiency in Python, PySpark, and Spark SQL for distributed and real-time data processing.
· Deep understanding of real-time streaming systems using Kafka.
· Experience with data lake and data warehousing technologies (Hadoop, HDFS, Hive, Iceberg, Apache Spark).
· Strong knowledge of relational and non-relational databases (SQL, NoSQL).
· Experience building and managing data pipelines in both cloud and on-premises environments.
· Experience with ETL tools like SAP BODS or similar platforms.
· Knowledge of reporting tools like SAP BO for designing dashboards and reports.
· Hands-on experience building end-to-end data frameworks and working with data lakes.
- Analytical and Problem-Solving Skills: Ability to translate complex business requirements into scalable and efficient technical solutions.
- Collaboration and Communication: Strong communication skills and the ability to work with cross-functional teams, including analysts, scientists, and stakeholders.
- Location: Willingness to work from Pune (on-site).
What is in it for you?
· Work for one of the world's largest software companies
· Work in a culture of collaboration and innovation
· Opportunities for personal development and career progression
· Opportunity to collaborate with internal users of Dassault Systèmes as well as stakeholders across internal and partner projects
Inclusion statement
Dassault Systèmes drives human progress. With our collaborative and virtual environments, we help companies and people bring sustainable innovations to life. Our 3DEXPERIENCE platform enables the creation of virtual twins of the real world, delivering value to more than 350,000 customers of all sizes and industries in over 150 countries. Join our global network of more than 23,800 dedicated employees!