AI Engineer
About the Team:
The AI Engineer reports to the Senior Director, Data Management & Analytics, and works within the Information Systems/Enterprise IT & Strategy organization, bringing strong expertise in AWS cloud services, AI, and large-scale data engineering to lead the design, development, and deployment of AI-driven solutions. This is a unique opportunity for an engineer to work on the bleeding edge of Generative AI. In this role, you will move beyond simple chatbots to design, build, and deploy autonomous AI agents capable of complex reasoning and of executing tasks across our enterprise business applications. You will work exclusively within the AWS ecosystem, leveraging Amazon Bedrock for LLMs and orchestrating workflows using frameworks such as LangChain, LangGraph, and Strands.
Key Responsibilities:
- AI Agent Development: Design, develop, and iterate on stateful AI agents and complex RAG (Retrieval-Augmented Generation) pipelines using LangChain, LangGraph, or Strands.
- LLM Integration: Leverage Amazon Bedrock as the primary source of foundation models, optimizing prompt engineering and model selection for specific business use cases (a minimal invocation sketch follows this list).
- Data for AI: Heavily utilize Amazon Redshift/RDS (data warehousing) and Amazon S3 (data lake).
- NLP Implementation: Apply Natural Language Processing techniques to extract insights from application data and business documents.
- AWS Native Deployment: Move prototypes into production using AWS best practices. This involves containerizing applications (Docker/ECR), deploying on Amazon ECS, and utilizing serverless architecture (Lambda) for event-driven agent tasks (a Lambda handler sketch also follows this list).
- Security & Governance: Implement secure IAM roles and policies to ensure agents have appropriate least-privilege access to data and services.
- Collaboration: Work closely with senior engineers, managers, and business stakeholders to translate complex business requirements into technical AI solutions.
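To ground the agent and Bedrock items above, here is a minimal illustrative sketch, assuming the langchain-aws integration package, of calling a Bedrock-hosted Claude model through LangChain with a retrieval-style prompt. The model ID, region, and inline context are placeholders rather than anything specified by the role.

```python
# Minimal illustrative sketch: retrieval-style prompt against a Bedrock-hosted
# Claude model via LangChain. Model ID, region, and inline context are
# assumptions for illustration only.
from langchain_aws import ChatBedrock
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

# Bedrock-backed chat model (model choice and region are placeholders).
llm = ChatBedrock(
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name="us-east-1",
    model_kwargs={"temperature": 0.0, "max_tokens": 512},
)

# Prompt that constrains the model to the retrieved context.
prompt = ChatPromptTemplate.from_messages([
    ("system", "Answer using only the provided context."),
    ("human", "Context:\n{context}\n\nQuestion: {question}"),
])

# LCEL chain: prompt -> Bedrock model -> plain-text answer.
chain = prompt | llm | StrOutputParser()

answer = chain.invoke({
    "context": "Order 1234 shipped on 2024-05-01.",  # stand-in for retrieved docs
    "question": "When did order 1234 ship?",
})
print(answer)
```

In practice the inline context would be supplied by a retriever over the S3/Redshift-backed stores mentioned above.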
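The serverless deployment item in the same list could, as a rough sketch, look like the following minimal AWS Lambda handler for an event-driven agent task. The event shape and the run_agent helper are hypothetical placeholders.

```python
# Minimal illustrative sketch: an event-driven Lambda handler that kicks off an
# agent task. The event shape and run_agent helper are hypothetical.
import json


def run_agent(task: str) -> str:
    # Placeholder for a real agent/RAG invocation (e.g., a Bedrock-backed chain).
    return f"completed: {task}"


def lambda_handler(event, context):
    # Triggered, for example, by an SQS message or S3 event carrying a task payload.
    task = event.get("task", "unknown")
    result = run_agent(task)
    return {
        "statusCode": 200,
        "body": json.dumps({"result": result}),
    }
```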
Qualifications:
- Education: Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or equivalent practical experience.
- Python Proficiency: High proficiency in Python. You understand asynchronous programming, data structures, and how to build clean, modular APIs (e.g., FastAPI or Flask); see the sketch after this list.
- Database Expertise: Strong command of SQL for complex querying and data manipulation. Understanding of data modeling concepts for both relational (RDS) and analytical (Redshift) environments.
- GenAI Frameworks: Demonstrable hands-on experience building applications with LLM orchestration frameworks.
- AWS Fundamentals: A solid foundational understanding of core AWS services (Compute, Storage, Networking, Security), including hands-on experience with S3, Lambda, and IAM and how they interact.
- Communication Skills: Excellent verbal and written communication skills, with the ability to articulate technical challenges and solutions clearly to non-technical team members.
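As a reference point for the Python/API qualification above, a clean, modular async endpoint might look like the following minimal FastAPI sketch. The route, payload models, and echo behavior are purely illustrative.

```python
# Minimal illustrative sketch: a small async FastAPI service. Route, models,
# and echo behavior are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="agent-api")


class AskRequest(BaseModel):
    question: str


class AskResponse(BaseModel):
    answer: str


@app.post("/ask", response_model=AskResponse)
async def ask(req: AskRequest) -> AskResponse:
    # A real service would await an agent or RAG pipeline here; the echo keeps
    # the sketch self-contained.
    return AskResponse(answer=f"You asked: {req.question}")
```

Run locally with uvicorn, e.g. "uvicorn main:app --reload", where main is the module containing the app (module name assumed).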
Technology Stack:
- Languages: Python (primary), SQL.
- AI/Agent Frameworks: LangChain, LangGraph, Strands.
- LLMs: Amazon Bedrock (Anthropic Claude, Amazon Titan, etc.).
- Data Storage: Amazon Redshift, Amazon RDS, Amazon S3.
- Compute & Deployment: Amazon ECS, AWS Fargate, AWS Lambda, Amazon ECR.
- Security: AWS IAM.
Inclusion Statement:
Dassault Systèmes is a catalyst for human progress. We provide business and people with collaborative virtual environments to imagine sustainable innovations. By creating virtual twin experiences of the real world with our 3DEXPERIENCE platform and applications, we bring value to more than 350,000 customers of all sizes, in all industries, in more than 150 countries. Join our global community of more than 23,800 passionate individuals!