Sr. Data Engineer
Quick Summary
Senior data engineering role centered on Snowflake, Airflow, DBT, and AWS data services, requiring expertise in applying Python and SQL to execute complex data operations, customize ETL/ELT processes, and perform advanced data transformations.
Akido builds AI-powered doctors. Akido is the first AI-native care provider, combining cutting-edge technology with a nationwide medical network to address America’s physician shortage and make exceptional healthcare universal. Its AI empowers doctors to deliver faster, more accurate, and more compassionate care.
Serving 500K+ patients across California, Rhode Island, and New York, Akido offers primary and specialty care in 26 specialties—from serving unhoused communities in Los Angeles to ride-share drivers in New York.
Founded in 2015 (YC W15), Akido is expanding its risk-bearing care models and scaling ScopeAI, its breakthrough clinical AI platform. Read more about Akido’s $60M Series B. More info at Akidolabs.com.
Akido is hiring a Senior Data Engineer to design, build, and own our modern data platform.
Responsibilities
- Design, build, and own data pipelines using DBT, Airflow, and Snowflake
- Lead architectural design sessions for the modern data stack, focusing on solutions that integrate seamlessly with our technology stack: Snowflake, Airflow, DBT, and AWS data services
- Work with our data science and product management teams to design, rapidly prototype, and productize new data product ideas and capabilities
- Collaborate cross-functionally with product, data science, and engineering leadership
- Participate in code reviews to ensure code quality and distribute knowledge
- Mentor teammates and contribute to a culture of continuous learning
Requirements
- Minimum 5 years of professional software engineering experience; bachelor’s degree in computer science or a related field (or equivalent practical experience)
- 4+ years of experience in a data engineering role with deep exposure to modern data stacks such as Snowflake, Airflow, DBT, and AWS data services
- Expertise in applying Python and SQL to execute complex data operations, customize ETL/ELT processes, and perform advanced data transformations across the platform
- Experience partnering with analytics and business stakeholders to translate requirements into scalable data solutions
- Strong experience with version control systems (GitHub, GitLab)
- Demonstrated ability to effectively leverage AI coding agents (such as Cursor or Copilot) while maintaining high standards for code quality, security, and correctness
- Top-notch communication skills
- A mission-oriented mindset and a strong desire to continuously learn and improve
- AWS environment familiarity
Benefits
- Stock-options package
- Health benefits including medical, dental, and vision
- 401(k)
- Long-term disability
- Unlimited PTO
- Life insurance
- Paid Leave Program
Akido Labs, Inc. is an equal opportunity employer, and we encourage qualified applicants of every background, ability, and life experience to contact us about appropriate employment opportunities.
Listing Details
- Posted: April 10, 2026
- First seen: March 26, 2026
- Last seen: April 12, 2026
Posting Health
- Days active: 16
- Repost count: 0
- Trust Level: 83%
- Scored at: April 12, 2026