Quick Summary
Partner with BI to design, build, and iterate on analytics models in Snowflake using dbt, and own the end-to-end lifecycle of data models, from intake and development to testing, deployment, and documentation.
GiveCampus is the world's leading fundraising platform for non-profit educational institutions. Trusted by millions of donors and 1,300+ colleges, universities, and K-12 schools, our mission is to help advance the quality, affordability, and accessibility of education. At our current pace, we will facilitate $100 billion in charitable giving over the next decade, enough money to send more than 1 million students to college, tuition-free.
GiveCampus is backed by leading investors including Y Combinator, but we're also practitioners of Sustainable Growth: we've made the Inc. 5000 list of America's fastest-growing private companies in each of the last five years, and we've been profitable in nine of the last 10. In 2025, we celebrated a $140 million growth investment that included a major liquidity event for GiveCampus employees, the second in less than three years.
Our purpose-driven team of 130+ is located in 30+ states across the US: team members work from anywhere they choose. We have a beautiful 12,000-square-foot office in Washington, DC that is available for people to use whenever they want, and we regularly organize team meet-ups, visit partner institutions, and host retreats in various locations.
While we operate at meaningful scale, we're still small relative to the commercial and social-good opportunities in front of us. Every GiveCampus employee plays a meaningful role in shaping what comes next, and we're growing the team in support of our ambitious plans, including a $100 million investment in AI product development. If you believe in the transformative power of education and want to join a fast-growing, mission-driven company, you'll fit right in.
Responsibilities
- Partnering with BI to design, build, and iterate on analytics models in Snowflake using dbt
- Owning the end-to-end lifecycle of data models, from intake and development to testing, deployment, and documentation
- Translating business requirements into clean, performant SQL and dbt models that enable self-serve reporting
- Maintaining and improving our dbt project structure, testing framework, and CI/CD practices
- Monitoring pipeline health and serving as a first responder for data quality and freshness issues across Airbyte, Fivetran, Prefect, and Snowflake
- Managing existing data integrations and building new pipelines using Prefect for orchestration
- Improving data observability and alerting to ensure reliability and adherence to SLAs for business-critical reporting
- Building and maintaining semantic models in Snowflake that power LLM-driven product features
- Developing evaluation pipelines (including LLM-as-judge patterns) to monitor output quality and prevent degradation
- Collaborating with Data Science and ML teams to ensure clean, well-modeled data is available for training and inference workloads
- Leveraging AI-assisted development tools to improve speed and efficiency, and identifying opportunities to automate repetitive data engineering tasks
Requirements
- Strong experience writing production-grade SQL and working with modern data warehouses (e.g., Snowflake)
- Hands-on experience with dbt for data modeling, testing, and documentation
- Familiarity with data pipeline and orchestration tools such as Prefect, Airbyte, or Fivetran
- Experience designing and maintaining reliable, scalable data systems with a focus on data quality and observability
- Ability to translate ambiguous business problems into structured data solutions
- Comfort working cross-functionally in a fast-paced, collaborative environment
- Experience supporting analytics, reporting, and/or machine learning use cases
- A proactive mindset with strong ownership and attention to detail
Nice to Have
- Experience building semantic layers or data models that support AI/LLM applications
- Familiarity with evaluation frameworks for LLM outputs (e.g., LLM-as-judge patterns)
- Experience implementing CI/CD workflows for data projects
- Exposure to data observability tools and best practices
- Experience in a SaaS or mission-driven organization
- Interest in leveraging AI tools to accelerate development and improve workflows
Be sure to keep an eye on your spam and promotions boxes in case our emails end up there!
At GiveCampus, we value diversity and we pledge to foster an environment of support, inclusivity, and learning, both on the job and throughout the application process. In this spirit, we encourage candidates of all backgrounds to apply.
GiveCampus is an Equal Opportunity Employer. Applicants and employees are not discriminated against because of race, color, creed, sex, sexual orientation, gender identity or expression, age, religion, national origin, citizenship status, disability, ancestry, marital status, veteran status, medical condition or any protected category prohibited by local, state or federal laws.
If you feel like you don't meet all of the requirements for this role, please apply anyway. We know confidence gaps and imposter syndrome often get in the way of connecting with incredible people, and we don't want them to prevent us from meeting you.
Listing Details
- Posted: April 27, 2026
- First seen: April 27, 2026
- Last seen: May 3, 2026