Data Engineer II
About Dimagi
Dimagi is an award-winning social enterprise and a certified B Corp and Benefit Corporation.
About the Role
The Data Engineer II will be part of Dimagi’s US Solutions Division Data & Analytics team, a group of engineers and data specialists responsible for building, maintaining, and evolving Dimagi’s Data Platform in support of current and future project work. The primary technologies used by the current data platform are Snowflake, Tableau, and various AWS cloud tools.

In this role, you will contribute hands-on to the design, implementation, and operation of data pipelines, warehouse transformations, data visualizations, and supporting infrastructure, while working closely with technical leadership to ensure platform reliability, scalability, and alignment with business needs. The data systems you help build and maintain will directly support public health and human services programs, enabling frontline teams and government partners to deliver care and services more effectively.
This position is well suited for someone who enjoys hands-on technical work in a small, collaborative environment. As a member of a lean team, you will be expected to work across functional areas, adapt quickly to new problem spaces, and contribute meaningfully to data systems that support real-world service delivery and decision-making. This role assumes comfort using AI-assisted tools to support analysis, documentation, troubleshooting, and learning in a complex technical environment.
Responsibilities
- Contribute to the technical integrity and evolution of the Data Platform tech stack, working closely with other Data Engineers, the Director of Technology, and the USS Tech Lead.
- Design and implement core features and enhancements within the Data Platform, including contributing to technical specifications, conducting targeted technical research, and translating requirements into production-ready solutions.
- Execute and maintain DevOps workflows supporting the Data Platform, including performance monitoring, platform upgrades, deployment frameworks, and operational improvements, with guidance and mentorship from more senior Data Engineers as needed.
- Use AI-assisted tools thoughtfully to accelerate development, debugging, documentation, and operational analysis, while understanding and validating outputs to ensure correctness, reliability, and security.
- Build and maintain robust data extraction, loading, and transformation processes for both Dimagi-managed (i.e., CommCare) and external data sources, using both SQL and Python scripting to enable efficient, reliable data pipelines and their long-term development and operation.
- Design and develop data warehouse transformations, using SQL-based approaches and supplementary tools such as dbt.
- Collaborate with internal teams and external partners on the design and implementation of enterprise data architectures based on industry standards and partner-specific analytics needs.
- Conduct ad hoc analyses and support the development of business intelligence outputs, including dashboards and visualizations using Tableau and other tools.
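As a rough illustration of the extract, load, and transform work described above, here is a minimal Python sketch using only the standard library's sqlite3 module. The sample rows, table schema, and `load_cases` helper are hypothetical stand-ins for illustration; they are not Dimagi's actual CommCare export format or pipeline code.

```python
import sqlite3

# Hypothetical sample of rows extracted from a form-data API; in a real
# pipeline these would come from an HTTP client or a bulk export file.
raw_cases = [
    {"case_id": "c1", "status": "open", "age": "34"},
    {"case_id": "c2", "status": "closed", "age": "41"},
    {"case_id": "c2", "status": "closed", "age": "41"},  # duplicate delivery
]

def load_cases(conn, rows):
    """Idempotent load: upsert on case_id so retried extracts don't duplicate rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS cases "
        "(case_id TEXT PRIMARY KEY, status TEXT, age INTEGER)"
    )
    conn.executemany(
        "INSERT INTO cases (case_id, status, age) "
        "VALUES (:case_id, :status, :age) "
        "ON CONFLICT(case_id) DO UPDATE SET "
        "status = excluded.status, age = excluded.age",
        # Light in-flight transform: cast the string export values to integers.
        [{**r, "age": int(r["age"])} for r in rows],
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load_cases(conn, raw_cases)
count = conn.execute("SELECT COUNT(*) FROM cases").fetchone()[0]
print(count)  # 2 — the duplicate delivery is collapsed by the upsert
```

The upsert keyed on `case_id` is one common way to make a load step safe to retry, a property that matters for the long-term operation the bullet above calls out.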
Requirements
- 2–5 years of experience in data engineering or a similar technical role, with a proven track record of designing and evolving scalable data systems.
- Experience building maintainable, long-term technical solutions using software development best practices (version control, testing, and iterative development).
- Hands-on expertise in building and managing production-grade pipelines using ETL/ELT tools (e.g., dbt, Airflow, Prefect, Fivetran, or Talend).
- Strong proficiency with cloud-based data platforms (AWS, Snowflake, etc.) and a diverse range of data ingestion, processing, and storage technologies.
- Expert-level SQL for complex data engineering and analysis, paired with proficiency in Python and associated data-oriented toolkits.
- A deep understanding of dimensional modeling concepts (e.g., OLAP cubes, star schemas, Kimball architecture vs. alternatives like Inmon).
- Proven ability to partner with technical stakeholders to clarify requirements and deliver effective, end-to-end data solutions.
- Proficiency in using AI-assisted tools for code generation, debugging, and optimization, with the ability to rapidly adapt to new schemas and tools in a fast-paced environment.
- Comfortable working "in the trenches" of production systems to test, iterate, and optimize operational workflows.
- Eligible to work in the United States
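For readers less familiar with the dimensional modeling concepts listed above, a star schema can be sketched in a few lines of SQL run from Python. The facility and visit tables below are invented for illustration only; the pattern (a narrow fact table joined to descriptive dimension tables) is the general idea.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension: one row per facility, holding descriptive attributes.
    CREATE TABLE dim_facility (
        facility_key  INTEGER PRIMARY KEY,
        facility_name TEXT,
        district      TEXT
    );
    -- Fact: one row per visit event, foreign-keyed to the dimension.
    -- Fact + surrounding dimensions is the "star" in a star schema.
    CREATE TABLE fct_visit (
        visit_id     INTEGER PRIMARY KEY,
        facility_key INTEGER REFERENCES dim_facility(facility_key),
        visit_date   TEXT
    );
""")
conn.executemany("INSERT INTO dim_facility VALUES (?, ?, ?)",
                 [(1, "Clinic A", "North"), (2, "Clinic B", "South")])
conn.executemany("INSERT INTO fct_visit VALUES (?, ?, ?)",
                 [(10, 1, "2026-04-01"), (11, 1, "2026-04-02"),
                  (12, 2, "2026-04-01")])

# Typical star-schema rollup: aggregate the fact table, sliced by a
# dimension attribute (here, visits per district).
rows = conn.execute("""
    SELECT d.district, COUNT(*) AS visits
    FROM fct_visit f
    JOIN dim_facility d USING (facility_key)
    GROUP BY d.district
    ORDER BY d.district
""").fetchall()
print(rows)  # [('North', 2), ('South', 1)]
```

Keeping measures in the fact table and descriptive attributes in dimensions is what makes this kind of group-by rollup simple and fast, which is the core argument for Kimball-style modeling.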
Nice to Have
- Experience in enterprise data architecture, service-oriented frameworks, data integration and harmonization, data strategy and governance, high-performing data lakes, data operations and delivery, and data ingestion frameworks supporting batch/real-time processing.
- Experience writing and maintaining production-ready code in a high-level programming language (Python, Java, C++, etc.).
- Experience with data analysis software (Jupyter Notebooks, R, etc.) and data visualization tools (Tableau, Power BI, Superset, etc.).
- Healthcare experience: either in healthcare data or public health data collection methodologies and workflows
- Experience and comfort working independently with partners for requirements gathering and solution development in an agile software development environment, using JIRA and Asana to manage tasks between technical and client-facing teams