Senior Data Engineer
We are RAPP – world leaders in activating growth with precision and empathy at scale.
As a global, next-generation precision marketing agency we leverage data, creativity, technology, and empathy to foster client growth. We champion individuality in the marketing solutions we create, and in our workplace. We fight for solutions that adapt to the individual’s needs, beliefs, behaviors, and aspirations.
We foster an inclusive workplace that emphasizes personal well-being.
At RAPP, our fearless superconnectors help to create value from personal brand experiences by focusing on three key areas: connected data, connected content and connected decisioning.
Our data analysts identify who that person is, our strategists understand what they want, and our award-winning technologists and creatives know how to deliver it – ensuring we’re able to activate authentic customer connections for our clients.
Part of Omnicom’s Precision Marketing Group, RAPP comprises 2,000+ creatives, technologists, strategists, and data and marketing scientists across 15+ global markets.
We are looking for a Senior Data Engineer with deep expertise in building scalable, cloud-native data pipelines and platforms. The ideal candidate is highly skilled in Python, Apache Airflow, AWS Lambda, DynamoDB, and dbt, and has experience designing reliable data workflows that enable advanced analytics, reporting, and machine learning use cases. They will bring strong attention to detail, a passion for information management, and the ability to work collaboratively with creative teams to enhance the efficiency and scalability of our data workflows.
Responsibilities
Data Pipeline Development
- Design, build, and maintain robust ETL/ELT pipelines using Python and Airflow.
- Develop serverless workflows leveraging AWS Lambda for scalable event-driven data processing.
- Implement and optimize dbt models for analytics and transformations.

Data Architecture & Storage
- Design schemas and manage data in DynamoDB and other cloud-native storage solutions.
- Ensure high availability, scalability, and performance of data systems.
- Integrate structured, semi-structured, and unstructured data sources.

Automation & Orchestration
- Build workflow orchestration strategies using Airflow for scheduling and monitoring pipelines.
- Automate infrastructure deployment and CI/CD pipelines for data services.

Quality & Governance
- Implement data validation, testing, and monitoring frameworks.
- Ensure compliance with security, privacy, and governance standards.

Collaboration & Leadership
- Partner with analytics, product, and engineering teams to deliver reliable datasets.
- Mentor junior engineers and enforce best practices in data engineering.
- Actively contribute to improving team efficiency, scalability, and standards.
Requirements
- 5–8+ years of experience in data engineering, software engineering, or a related role.
- Strong expertise in Python for data engineering and automation.
- Hands-on experience with Apache Airflow for orchestration.
- Proficiency with AWS Lambda and serverless design patterns.
- Solid experience with DynamoDB (schema design, performance tuning, scaling).
- Strong knowledge of dbt for transformation and analytics modeling.
- Experience with cloud environments (AWS preferred).
- Familiarity with CI/CD workflows, Git, and DevOps practices.
- Strong problem-solving and communication skills.
Preferred Qualifications
- Experience with other AWS services (S3, Glue, Redshift, Kinesis).
- Familiarity with data warehouse and data lake architectures.
- Exposure to real-time streaming and event-driven data pipelines.
- Knowledge of containerization (Docker, Kubernetes).
- Exceptional attention to detail and organizational skills.
- Strong written and verbal communication skills, with the ability to explain complex data systems to non-technical users.
- Ability to work collaboratively and cross-functionally with creative, marketing, and IT teams.
- Proactive problem-solver who can identify issues and suggest improvements.
- Time management skills with the ability to prioritize and manage multiple tasks in a fast-paced environment.
RAPP's current hybrid model is designed to enable the in-person connection and collaboration that is core to our culture, while also supporting flexibility for all employees. As such, employees have the option to work from home two days per week.
As an EEO/Affirmative Action Employer, all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.
NOTE: This job description is not intended to be all-inclusive. Employee may perform other related duties as negotiated to meet the ongoing needs of the organization.
Listing Details
- First seen: March 26, 2026
- Last seen: April 25, 2026
