You will be part of the team responsible for measuring the quality of our Global Data Cloud. As a Data Quality Engineer, you will take a lead role in executing data quality strategies, automating data quality processes, and collaborating closely with cross-functional teams to ensure the accuracy, consistency, and reliability of our data assets. Your technical skillset will be instrumental in managing the team's pipelines for data quality monitoring.
Execute a comprehensive data quality monitoring strategy which aligns with the organization's Data Quality Standards and business objectives.
Develop a strong understanding of Dun & Bradstreet’s inventory data.
Perform baseline data quality monitoring to proactively identify data quality issues.
Employ advanced data analysis and profiling techniques.
Liaise with business stakeholders to ensure requirements are clear and documented.
Automate data quality monitoring solutions and internal processes.
Create or update data models to ensure that data is stored in an organized structure.
Utilize Power BI and/or Looker to design, create, connect, and administer dashboards which derive insights from data quality monitoring results.
Implement a robust data validation framework with automated testing processes.
Communicate with globally distributed stakeholders using JIRA and Confluence.
Capture requirements accurately and develop a strong understanding of use cases.
Recommend improvements to the data quality team's internal processes.
Generate regular reports on data quality metrics.
Review data to identify patterns or trends that may indicate errors in processing.
Develop comprehensive documentation of data quality processes, procedures, and findings, and ensure junior members document their work.
Comply with data governance policies and procedures.
Stay current with industry best practices and technologies related to data quality.
Provide guidance and mentorship to junior data quality engineers, fostering their growth and development.
Bachelor’s degree in Business Analytics, Computer Science, Information Technology, or a related field.
8+ years of experience and demonstrated in-depth knowledge of data analysis, querying languages, data modelling, and the software development life cycle.
Expertise in SQL (preferably BigQuery).
Proficiency in Python.
Familiarity with Airflow, GCP Cloud Composer, and Terraform.
Agile mindset and deep understanding of agile project management (Scrum/Kanban).
Experience in Database design, modelling, and best practices.
Experience with cloud computing technologies (preferably GCP).
Experience with Firebase Studio or other application development platforms.
Experience using AI tools such as Copilot Studio, Gemini Code Assist or Claude Code.
Ability to mentor and provide guidance to less experienced members of the team.
Strong analytical, process-improvement, and problem-solving skills.
Strong communication skills and the ability to articulate data issues and solutions.
Commitment to meet deadlines and uphold the release schedule.
Experience collaborating across time zones as part of a global team.
Experience with the Microsoft Office suite, including Excel, Word, Outlook, and Teams.
Experience with DevOps best practices including CI/CD, automation, monitoring, observability, agile project management, version control, and continuous feedback.
Experience with data observability tools such as Acceldata or Informatica DQ.
Experience with XML and JSON data structures.
Understanding of ETL processes and their impact on data quality.
Knowledge of Machine Learning, specifically anomaly detection.
Experience developing agents and/or agentic systems.