CertifID

Senior Data Engineer

United States · Remote · Full-time · Senior

Overview
Cybercrime is rising, reaching record highs in 2024. According to the FBI's IC3 report, total losses exceeded $16 billion. With investment fraud and BEC scams at the forefront, the message is clear: the real estate sector remains a lucrative target for cybercriminals. At CertifID, we take this threat seriously and provide a secure platform that verifies the identities of parties involved in transactions, authenticates wire transfer instructions, and detects potential fraud attempts. Our technology is designed to mitigate risks and ensure that every transaction is conducted with confidence and peace of mind.
 
We know we couldn’t take on this challenge without our incredible team. We have been recognized as one of the Best Startups to Work for in Austin, made the Inc. 5000 list, and won Best Culture by Purpose Jobs two years in a row. We are guided by our core values and our vision of a world without wire fraud. We offer a dynamic work environment where you can contribute to meaningful impact and be part of a team dedicated to enhancing security and fighting fraud.

CertifID is the wire fraud prevention platform protecting real estate closings. Every transaction we secure generates data: identity signals, verification events, behavioral patterns, payment flows. That data is how we detect fraud, how our customers measure risk, and how the business operates. You will own the systems that make it trustworthy, fast, and useful.

  • Design, build, and operate the core data infrastructure: data lake, warehouse, orchestration, observability, and governance, using declarative configuration and infrastructure as code (Terraform or equivalent) so the platform is reproducible and auditable

  • Partner with platform and domain teams to design ingestion pipelines and implement declarative configuration for data sources across the stack

  • Architect the transformation layer: dimensional models, aggregation strategies, and incremental materialization patterns that balance query performance against pipeline cost at scale

  • Own streaming and near-real-time data flows for fraud signal propagation, transaction status events, and verification webhooks, with the reliability expectations those require

  • Build for scale: partition strategies, clustering, late-arriving data handling, and backfill patterns that hold up when data volume doubles

  • Own the source-of-truth models for the metrics the business runs on: ARR, NRR, churn, transaction volume, fraud detection rates, customer health scores, and operational throughput

  • Make the numbers defensible: when a business leader challenges a metric, you can walk them through exactly how it is calculated, what is excluded, and why

  • Partner with Product, Finance, CS, and GTM to translate business questions into data models and help teams measure what actually matters 

  • Write production-grade Python and SQL: modular, tested, version-controlled, and reviewable by someone who was not in the room when you wrote it

  • Implement CI/CD pipelines for data systems: automated testing, schema change detection, data contract validation, and deployment gates, with cost optimization and performance tuning treated as ongoing practice, not one-time projects
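As one illustration of the data-contract validation mentioned in the responsibilities above, a minimal CI gate might compare a warehouse table's live schema against a declared contract. This is a hedged sketch: the column names, types, and the shape of the live-schema dict are hypothetical stand-ins, not CertifID's actual models or tooling.

```python
# Sketch of a data-contract validation gate, as might run in CI before a
# pipeline deploy. EXPECTED_CONTRACT and the live_schema dict are hypothetical;
# in practice the live schema would come from the warehouse's information schema.

EXPECTED_CONTRACT = {
    "transaction_id": "STRING",
    "verified_at": "TIMESTAMP",
    "wire_amount_usd": "NUMERIC",
    "fraud_score": "FLOAT",
}


def validate_contract(live_schema: dict, contract: dict) -> list:
    """Return human-readable violations; an empty list means the gate passes."""
    violations = []
    for column, expected_type in contract.items():
        actual_type = live_schema.get(column)
        if actual_type is None:
            violations.append(f"missing column: {column}")
        elif actual_type != expected_type:
            violations.append(
                f"type drift on {column}: {actual_type} != {expected_type}"
            )
    # Columns added outside the contract are surfaced as warnings, not failures:
    # downstream consumers should only depend on contracted columns.
    for column in live_schema.keys() - contract.keys():
        violations.append(f"warning: uncontracted column {column}")
    return violations
```

A deployment gate would then fail the pipeline if any non-warning violation is returned, which is one way schema change detection becomes an automated check rather than a manual review step.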

  • 6+ years in data engineering with primary, end-to-end ownership of a production data platform, not a supporting role on a large team

  • Direct experience designing and operating streaming or near-real-time pipelines (Kafka, Kinesis, Pub/Sub, Flink, or equivalent) at production scale, including debugging failures under load

  • Hands-on production experience with cloud-based data platforms (Snowflake, BigQuery, Redshift, Databricks, or equivalent) and a production-grade orchestrator (Airflow, Dagster, Prefect, or equivalent)

  • Expert-level SQL and distributed-systems skills: window functions, recursive CTEs, query plan analysis, query concurrency management, and optimization strategies that go beyond adding an index

  • Strong Python for data engineering: production-quality pipeline code with error handling, idempotency, retry logic, and test coverage; Go is a meaningful plus

  • Dimensional modeling mastery: you understand the tradeoffs between normalized and denormalized designs, when SCDs are the right tool, and how incremental strategies affect downstream query semantics

  • Event-driven architecture fundamentals: exactly-once semantics, consumer group management, backpressure handling, offset management, and the operational realities of keeping a streaming pipeline healthy

  • Warehouse internals: clustering keys, materialized views, partition pruning, and cost optimization strategies that keep query costs from compounding as data volume grows

  • You instrument, measure, and verify that your work produced the outcome it was supposed to

  • You make architectural decisions independently, communicate them clearly, and document the reasoning so the decision survives you

  • You have joined teams where the data was a mess, and you shipped before the situation was fully resolved, because waiting for perfection was not an option
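The idempotency and retry expectations listed above can be sketched in a few lines. This is an illustrative assumption, not a prescribed implementation: `run_with_retries` is a hypothetical helper, and the step it wraps is assumed to already be idempotent (for example, a MERGE keyed on a partition date, or a write to a deterministic staging path), so a retry after a partial failure cannot double-load data.

```python
import time


def run_with_retries(step, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Run a pipeline step, retrying transient failures with exponential backoff.

    The step itself must be idempotent: re-running it after a partial failure
    must produce the same end state as running it once. The retry wrapper only
    handles *when* to re-run, never *whether* re-running is safe.
    """
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == attempts:
                raise  # exhausted retries: surface the failure to the orchestrator
            sleep(base_delay * 2 ** (attempt - 1))  # back off: 1s, 2s, 4s, ...
```

The `sleep` parameter is injected so the backoff behavior can be unit-tested without real delays, which is in the spirit of the "modular, tested, reviewable" code standard described earlier in the posting.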

  • Flexible vacation
  • 12 company-paid holidays
  • 10 paid sick days
  • No work on your birthday
  • Health, dental, and vision insurance (including a $0 option)
  • 401(k) with matching, and no waiting period
  • Equity
  • Life insurance
  • Generous parental paid leave
  • Wellness reimbursement of $300/year
  • Remote worker reimbursement of $300/year
  • Professional development reimbursement
  • Competitive pay
  • An award-winning culture
Location & Eligibility

    Where is the job: United States (remote within one country)
    Who can apply: US
    Listed under: United States

    Listing Details

    Posted: February 4, 2026
    First seen: April 7, 2026
    Last seen: April 27, 2026

    CertifID

    Employees: 125
    Founded: 2017
