Big Data Engineer
Unlock your potential as a Big Data Engineer with Dassault Systèmes, a global leader in scientific software engineering, in Pune, Maharashtra!
Role Description & Responsibilities:
- Data Pipeline Development: Design, develop, and maintain robust ETL pipelines for batch and real-time data ingestion, processing, and transformation using Spark, Kafka, and Python.
- Data Architecture: Build and optimize scalable data architectures, including data lakes, data marts, and data warehouses, to support business intelligence, reporting, and machine learning.
- Data Governance: Ensure data reliability, integrity, and governance by enabling accurate, consistent, and trustworthy data for decision-making.
- Collaboration: Work closely with data analysts, data scientists, and business stakeholders to gather requirements, identify inefficiencies, and deliver scalable and impactful data solutions.
- Optimization: Develop efficient workflows to handle large-scale datasets, improving performance and minimizing downtime.
- Documentation: Create detailed documentation for data processes, pipelines, and architecture to support seamless collaboration and knowledge sharing.
- Innovation: Contribute to a thriving data engineering culture by introducing new tools, frameworks, and best practices to improve data processes across the organization.
Qualifications:
- Educational Background: Bachelor's degree in Computer Science, Engineering, or a related field.
- Professional Experience: 2–3 years of experience in data engineering, with expertise in designing and managing complex ETL pipelines.
- Technical Skills:
· Proficiency in Python, PySpark, and Spark SQL for distributed and real-time data processing.
· Deep understanding of real-time streaming systems using Kafka.
· Experience with data lake and data warehousing technologies (Hadoop, HDFS, Hive, Iceberg, Apache Spark).
· Strong knowledge of relational and non-relational databases (SQL, NoSQL).
· Experience building and managing data pipelines in both cloud and on-premises environments.
· Experience with ETL tools like SAP BODS or similar platforms.
· Knowledge of reporting tools like SAP BO for designing dashboards and reports.
· Hands-on experience building end-to-end data frameworks and working with data lakes.
- Analytical and Problem-Solving Skills: Ability to translate complex business requirements into scalable and efficient technical solutions.
- Collaboration and Communication: Strong communication skills and the ability to work with cross-functional teams, including data analysts, data scientists, and business stakeholders.
- Location: Willingness to work from Pune (on-site).
What is in it for you?
· Work for one of the biggest software companies
· Work in a culture of collaboration and innovation
· Opportunities for personal development and career progression
· Chance to collaborate with internal users of Dassault Systèmes as well as stakeholders of internal and partner projects
Inclusion statement
Dassault Systèmes is a catalyst for human progress. We provide businesses and people with collaborative virtual environments to imagine sustainable innovations. By creating virtual twin experiences of the real world with our 3DEXPERIENCE platform and applications, we bring value to more than 350,000 customers of all sizes, across all industries, in more than 150 countries. Join our global community of more than 23,800 passionate people!
Want to know more?
Visit other sections of our website to learn more.
Students and graduates
Be part of our future! Check out our job openings and internship opportunities.
The path to being hired
Learn what your path to being hired will look like.
Our values and culture
Discover our values and our culture.