Senior Data Engineer
Quick Summary
Cybercrime is rising, reaching record highs in 2024: according to the FBI's IC3 report, total losses exceeded $16 billion, with investment fraud and BEC scams at the forefront.
CertifID is the wire fraud prevention platform protecting real estate closings. Every transaction we secure generates data: identity signals, verification events, behavioral patterns, payment flows. That data is how we detect fraud, how our customers measure risk, and how the business operates. You will own the systems that make it trustworthy, fast, and useful.
What You'll Do
- Design, build, and operate the core data infrastructure (data lake, warehouse, orchestration, observability, and governance) using declarative configuration and infrastructure as code (Terraform or equivalent) so the platform is reproducible and auditable
- Partner with platform and domain teams to design ingestion pipelines and implement declarative configuration for data sources across the stack
- Architect the transformation layer: dimensional models, aggregation strategies, and incremental materialization patterns that balance query performance against pipeline cost at scale
- Own streaming and near-real-time data flows for fraud signal propagation, transaction status events, and verification webhooks, with the reliability those flows demand
- Build for scale: partition strategies, clustering, late-arriving-data handling, and backfill patterns that hold up when data volume doubles
- Own the source-of-truth models for the metrics the business runs on: ARR, NRR, churn, transaction volume, fraud detection rates, customer health scores, and operational throughput
- Make the numbers defensible: when a business leader challenges a metric, walk them through exactly how it is calculated, what is excluded, and why
- Partner with Product, Finance, CS, and GTM to translate business questions into data models and help teams measure what actually matters
- Write production-grade Python and SQL: modular, tested, version-controlled, and reviewable by someone who was not in the room when you wrote it
- Implement CI/CD pipelines for data systems (automated testing, schema change detection, data contract validation, deployment gates) and treat cost optimization and performance tuning as ongoing practices, not one-time projects
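To make the "schema change detection, data contract validation, deployment gates" expectation concrete, here is a minimal sketch of the kind of check such a pipeline might run in CI. The contract shape (a column-name-to-type mapping) and the breaking/additive classification are illustrative assumptions, not CertifID's actual tooling:

```python
# Minimal data-contract check: compare a table's observed schema against a
# declared contract and classify differences. The contract format
# (column name -> type string) is an assumption for illustration.

def diff_schema(contract: dict, observed: dict) -> dict:
    """Return breaking and additive changes between a contract and an observed schema."""
    breaking = []
    for col, typ in contract.items():
        if col not in observed:
            breaking.append(f"missing column: {col}")
        elif observed[col] != typ:
            breaking.append(f"type change: {col} {typ} -> {observed[col]}")
    # New columns not in the contract are additive (non-breaking).
    additive = [c for c in observed if c not in contract]
    return {"breaking": breaking, "additive": additive}


def gate(contract: dict, observed: dict) -> bool:
    """Deployment gate: additive changes pass, any breaking change fails."""
    return not diff_schema(contract, observed)["breaking"]


if __name__ == "__main__":
    contract = {"txn_id": "STRING", "amount": "NUMERIC", "created_at": "TIMESTAMP"}
    observed = {"txn_id": "STRING", "amount": "FLOAT",
                "created_at": "TIMESTAMP", "fraud_score": "NUMERIC"}
    print(diff_schema(contract, observed))  # amount type change is breaking
```

In practice the observed schema would be pulled from the warehouse's information schema and the contract from version control, so a breaking change blocks the merge rather than surfacing in production.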
What You'll Bring
- 6+ years in data engineering with primary, end-to-end ownership of a production data platform, not a supporting role on a large team
- Direct experience designing and operating streaming or near-real-time pipelines (Kafka, Kinesis, Pub/Sub, Flink, or equivalent) at production scale, including debugging failures under load
- Hands-on production experience with cloud data platforms (Snowflake, BigQuery, Redshift, Databricks, or equivalent) and a production-grade orchestrator (Airflow, Dagster, Prefect, or equivalent)
- Expert-level SQL and distributed-systems knowledge: window functions, recursive CTEs, query plan analysis, query concurrency management, and optimization strategies that go beyond adding an index
- Strong Python for data engineering: production-quality pipeline code with error handling, idempotency, retry logic, and test coverage; Go is a meaningful plus
- Dimensional modeling mastery: you understand the tradeoffs between normalized and denormalized designs, when slowly changing dimensions (SCDs) are the right tool, and how incremental strategies affect downstream query semantics
- Event-driven architecture fundamentals: exactly-once semantics, consumer group management, backpressure handling, offset management, and the operational realities of keeping a streaming pipeline healthy
- Warehouse internals: clustering keys, materialized views, partition pruning, and cost optimization strategies that keep query costs from compounding as data volume grows
- You instrument, measure, and verify that your work produced the outcome it was supposed to
- You make architectural decisions independently, communicate them outward, and document the reasoning so the decision survives you
- You have joined teams where the data was a mess and shipped before the situation was fully resolved, because waiting for perfection was not an option
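The "error handling, idempotency, retry logic" bar in the Python requirement above amounts to patterns like the following sketch. The backoff parameters and the idempotency-key scheme are assumptions chosen for illustration, not a prescribed implementation:

```python
import hashlib
import time


def idempotency_key(record: dict) -> str:
    """Deterministic key so reprocessing the same record is a downstream no-op."""
    payload = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(payload.encode()).hexdigest()


def with_retries(fn, attempts=3, base_delay=0.01):
    """Retry a flaky callable with exponential backoff; re-raise once exhausted."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)


_seen = set()  # stand-in for a persistent dedup store (e.g., a warehouse table)


def load(record: dict, sink: list) -> bool:
    """Idempotent load: skip records whose key has already been written."""
    key = idempotency_key(record)
    if key in _seen:
        return False
    _seen.add(key)
    sink.append(record)
    return True
```

The point of the pair is that retries and idempotency only work together: a retried write is safe precisely because replaying the same record cannot double-count it.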
Location & Eligibility
Listing Details
- Posted: February 4, 2026
- First seen: April 7, 2026
- Last seen: April 27, 2026
