Senior Data Engineer (Remote, Full-Time) [AS233]



About Smart Working
At Smart Working, we believe your job should not only look right on paper but also feel right every day. This isn’t just another remote opportunity - it’s about finding where you truly belong, no matter where you are. From day one, you’re welcomed into a genuine community that values your growth and well-being.

Our mission is simple: to break down geographic barriers and connect skilled professionals with outstanding global teams and products for full-time, long-term roles. We help you discover meaningful work with teams that invest in your success, where you’re empowered to grow personally and professionally.

Join one of the highest-rated workplaces on Glassdoor and experience what it means to thrive in a truly remote-first world.

About the Role
We're looking for a Senior Data Engineer to become a cornerstone of our data platform team. This is a long-term, strategic role, not a short sprint. You'll be embedded in a collaborative engineering and analytics team, working across the full data lifecycle: ingestion, transformation, modelling, and surfacing insights through Looker. You'll work closely with stakeholders across commercial, product, and marketing to ensure data is reliable, scalable, and meaningful.

You'll be given real ownership. This is a role for someone who wants to shape standards, improve the architecture, and grow with a brand that takes its data seriously.

Key Responsibilities

  • Design, build, and maintain robust ETL/ELT pipelines that move data from source systems into Google BigQuery, ensuring reliability, scalability, and observability at every stage.
  • Develop and enforce data models and schema standards using best-practice SQL and dimensional modelling principles, with a focus on clarity, reuse, and performance.
  • Own the Google BigQuery environment, optimising queries, managing costs, enforcing data governance, and ensuring the platform scales alongside the business.
  • Build and maintain Looker explores, LookML models, and dashboards that translate complex datasets into clear, actionable business intelligence for non-technical stakeholders.
  • Work across the full Google Cloud Platform stack, including Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and Composer, to architect end-to-end data solutions.
  • Partner with analytics, engineering, and commercial teams to understand data requirements and translate business problems into scalable technical solutions.
  • Champion data quality and testing frameworks, implementing monitoring and alerting so that issues are caught early and resolved quickly.
  • Contribute to documentation, coding standards, and architectural decision records so the team can move fast with confidence.
  • Mentor junior data team members and set the bar for engineering rigour across the data function.
  • Stay current with developments in the modern data stack and proactively recommend tooling or process improvements where appropriate.
Requirements

  • 5+ years of experience in SQL and data modelling, with strong command of dimensional modelling, star schemas, and performance optimisation.
  • 3+ years working with Google BigQuery in a production environment.
  • 3+ years hands-on experience with Google Cloud Platform (Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, Composer).
  • 3+ years building and maintaining ETL/ELT pipelines at scale.
  • 1+ year working with Looker and LookML to deliver business-facing dashboards and data products.
  • Demonstrable experience leading at least one data project end-to-end, from scoping through to delivery.
  • Able to communicate clearly with non-technical stakeholders about data limitations, timelines, and trade-offs.
  • Comfortable making pragmatic architecture decisions in a cloud-native, modern data stack environment.
Nice to Have

  • Experience with dbt (Data Build Tool) for transformation layer management and testing.
  • Familiarity with orchestration tools such as Apache Airflow or Cloud Composer.
  • Python skills for pipeline scripting, data validation, or automation.
  • Background in retail, ecommerce, or fashion, with an understanding of how data flows across commercial and digital channels.
  • Exposure to real-time or streaming data pipelines using Pub/Sub or Dataflow.
  • Experience with Terraform or Infrastructure-as-Code practices in a GCP context.
  • Familiarity with data governance frameworks, cataloguing, and lineage tracking.
Benefits

  • Fixed Shifts: 12:00 PM - 9:30 PM IST (Summer) | 1:00 PM - 10:30 PM IST (Winter)
  • No Weekend Work: Real work-life balance, not just words
  • Day 1 Benefits: Laptop and full medical insurance provided
  • Support That Matters: Mentorship, community, and forums where ideas are shared
  • True Belonging: A long-term career where your contributions are valued
Location & Eligibility

  • Location: India (remote within one country)
  • Who can apply: Candidates based in India

Listing Details

  Posted: April 22, 2026
  Last seen: May 5, 2026
