Big Data Engineer
Unlock your potential with Dassault Systèmes, a global leader in Scientific Software Engineering, as a Big Data Engineer in Pune, Maharashtra!
Role Description & Responsibilities:
- Data Pipeline Development: Design, develop, and maintain robust ETL pipelines for batch and real-time data ingestion, processing, and transformation using Spark, Kafka, and Python.
- Data Architecture: Build and optimize scalable data architectures, including data lakes, data marts, and data warehouses, to support business intelligence, reporting, and machine learning.
- Data Governance: Ensure data reliability, integrity, and governance by enabling accurate, consistent, and trustworthy data for decision-making.
- Collaboration: Work closely with data analysts, data scientists, and business stakeholders to gather requirements, identify inefficiencies, and deliver scalable and impactful data solutions.
- Optimization: Develop efficient workflows to handle large-scale datasets, improving performance and minimizing downtime.
- Documentation: Create detailed documentation for data processes, pipelines, and architecture to support seamless collaboration and knowledge sharing.
- Innovation: Contribute to a thriving data engineering culture by introducing new tools, frameworks, and best practices to improve data processes across the organization.
Qualifications:
- Educational Background: Bachelor's degree in Computer Science, Engineering, or a related field.
- Professional Experience: 2–3 years of experience in data engineering, with expertise in designing and managing complex ETL pipelines.
- Technical Skills:
· Proficiency in Python, PySpark, and Spark SQL for distributed and real-time data processing.
· Deep understanding of real-time streaming systems using Kafka.
· Experience with data lake and data warehousing technologies (Hadoop, HDFS, Hive, Iceberg, Apache Spark).
· Strong knowledge of relational and non-relational databases (SQL, NoSQL).
· Experience building and managing data pipelines in both cloud and on-premises environments.
· Experience with ETL tools like SAP BODS or similar platforms.
· Knowledge of reporting tools like SAP BO for designing dashboards and reports.
· Hands-on experience building end-to-end data frameworks and working with data lakes.
- Analytical and Problem-Solving Skills: Ability to translate complex business requirements into scalable and efficient technical solutions.
- Collaboration and Communication: Strong communication skills and the ability to work with cross-functional teams, including analysts, scientists, and stakeholders.
- Location: Willingness to work from Pune (on-site).
What is in it for you?
· Work for one of the biggest software companies
· Work in a culture of collaboration and innovation
· Opportunities for personal development and career progression
· Chance to collaborate with internal users across Dassault Systèmes as well as stakeholders of internal and partner projects
Inclusion statement
Join Dassault Systèmes, the 3DEXPERIENCE Company. With the 3DEXPERIENCE virtual universes from Dassault Systèmes, anything becomes possible! We serve 230,000 customers in 11 industries, from high tech and fashion to life sciences and transportation. We help businesses and people around the world create sustainable innovations for today and tomorrow. Join a fast-growing, leading company with around 20,000 talented professionals.