
Data QA Engineer III

Preferred Travel Group
Full-time
Remote
United States
$100,000 - $130,000 USD yearly

About Us

At Preferred Travel Group, we care deeply about our people, nurture independence, and celebrate individuality. Family values inspire us, and we believe that change creates opportunity. We are committed listeners and deliberate storytellers in hospitality. We engineer potential, foster trust, and co-create brighter futures. Our culture values collaboration, adaptability, and precision—qualities essential to every role. We are forever curious, guided by the Pineapple as our global symbol of hospitality. We believe the business of hospitality is borderless, and we proudly embrace that spirit every day.

We believe that every team member brings unique strengths to the table, and we’re committed to creating an environment where those strengths can thrive. 

________________________________________

Position Summary

We are seeking an experienced, analytical, detail-oriented, and quality-driven Data QA Engineer III. The Data QA Engineer will design and execute automated tests, implement data quality monitoring and alerting, and perform targeted manual verifications to ensure the accuracy, reliability, and trustworthiness of data engineering deliverables. This role focuses on validating data pipelines, transformations, data models, and BI/report outputs (not UI test automation) across ETL/ELT pipelines, reports, dashboards, and data management applications, so the business can confidently use data to drive decisions. Under the general supervision of the Manager of QA, the Data QA Engineer works closely with developers on the Data Engineering team.

________________________________________

Key Responsibilities

Planning

  • Collaborate with the engineering team to refine work requests in an agile development system, translating requirements into testable data quality criteria.
  • Develop comprehensive test plans and test cases as part of project planning, including test strategy for validation, reconciliation, regression, and monitoring.
  • Provide accurate estimates of effort and duration of QA tasks.

Testing

  • Create and maintain automated tests to validate pipeline requirements, transformation logic, and downstream analytics/report outputs.
  • Write and optimize SQL queries for automated validations (e.g., row counts, uniqueness, referential integrity, reconciliation, and business-rule checks).
  • Build regression suites for critical datasets and dashboards to ensure consistent numbers across releases and backfills.
  • Create and maintain deterministic test datasets (fixtures) and “golden” expected results for repeatable validation.
  • Assist with the verification and recreation of user-reported data issues, including data lineage/traceback from report to source.
  • File detailed and actionable defect reports, including reproduction steps, expected outcomes, and evidence (queries, sample records, screenshots of report values when relevant).
  • Work collaboratively with engineers to troubleshoot defects, validate fixes, and prevent recurrence via new tests and monitoring.
  • Continuously improve QA processes, frameworks, and tools for data testing and validation to align with best practices.
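The SQL-based validations listed above can be sketched roughly as follows. This is a minimal illustration only: the table names, columns, and fixture data are hypothetical, and an in-memory SQLite database stands in for a real warehouse.

```python
import sqlite3

# Hypothetical in-memory fixture standing in for a warehouse table;
# all names and values here are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_hotel (hotel_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_booking (booking_id INTEGER, hotel_id INTEGER, amount REAL);
    INSERT INTO dim_hotel VALUES (1, 'Alpha'), (2, 'Beta');
    INSERT INTO fact_booking VALUES (10, 1, 250.0), (11, 2, 120.0), (12, 2, 80.0);
""")

def check_row_count(table, expected):
    # Reconciliation-style check: actual row count vs. a known expectation.
    actual = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return actual == expected

def check_unique(table, column):
    # Uniqueness check: no duplicate values in a key column.
    dupes = conn.execute(
        f"SELECT {column} FROM {table} GROUP BY {column} HAVING COUNT(*) > 1"
    ).fetchall()
    return len(dupes) == 0

def check_referential_integrity(child, fk, parent, pk):
    # Referential integrity: every foreign key must exist in the parent table.
    orphans = conn.execute(
        f"SELECT COUNT(*) FROM {child} c LEFT JOIN {parent} p ON c.{fk} = p.{pk} "
        f"WHERE p.{pk} IS NULL"
    ).fetchone()[0]
    return orphans == 0

results = {
    "row_count": check_row_count("fact_booking", 3),
    "unique_booking_id": check_unique("fact_booking", "booking_id"),
    "fk_hotel": check_referential_integrity(
        "fact_booking", "hotel_id", "dim_hotel", "hotel_id"
    ),
}
print(results)  # all three checks pass on this fixture
```

In practice, checks like these would run against the real warehouse and be wired into a framework such as Great Expectations rather than hand-rolled.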

Integration & Monitoring

  • Integrate test automation with deployment automation, work tracking, and test tracking systems to enforce automated quality gates.
  • Schedule and manage automated test runs (PR/CI, nightly, and post-deploy), ensuring consistent and reliable execution.
  • Implement data observability checks and alerting for freshness, volume, distribution/anomaly detection, and schema drift; tune alert thresholds to reduce noise.
  • Collect, consolidate, and analyze test and monitoring results to identify trends, systemic issues, and opportunities to improve data reliability.
  • Define and develop key performance indicators (KPIs) for measuring test effectiveness (coverage, escaped defects, time-to-detection, time-to-resolution).
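As a rough illustration of the freshness and volume checks described above, the sketch below flags stale loads and abnormal row counts. The thresholds, timestamps, and counts are hypothetical; real thresholds would be tuned per dataset to reduce alert noise.

```python
from datetime import datetime, timedelta, timezone

# Illustrative thresholds; in practice these are tuned per dataset.
FRESHNESS_SLA = timedelta(hours=24)   # data must have landed within the last day
VOLUME_TOLERANCE = 0.5                # alert if volume deviates >50% from the trailing average

def freshness_alert(last_loaded_at, now=None):
    """Return True if the dataset is stale relative to its freshness SLA."""
    now = now or datetime.now(timezone.utc)
    return (now - last_loaded_at) > FRESHNESS_SLA

def volume_alert(todays_rows, trailing_counts):
    """Return True if today's row count deviates too far from the trailing average."""
    baseline = sum(trailing_counts) / len(trailing_counts)
    return abs(todays_rows - baseline) / baseline > VOLUME_TOLERANCE

# Hypothetical example: a load from two days ago breaches a 24-hour SLA,
# and 2600 rows is far above a ~1025-row trailing average.
now = datetime(2024, 1, 8, 12, 0, tzinfo=timezone.utc)
stale = freshness_alert(datetime(2024, 1, 6, 9, 0, tzinfo=timezone.utc), now=now)
spike = volume_alert(todays_rows=2600, trailing_counts=[1000, 1100, 950, 1050])
print(stale, spike)  # True True
```

A scheduled job (nightly or post-deploy) would evaluate checks like these and route any True results to an alerting channel.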

Collaboration

  • Manage and prioritize work using the ticketing system while maintaining regular communication in stand-ups and stakeholder meetings.
  • Conduct code reviews of test code, SQL validation logic, and monitoring rules to ensure adherence to best practices and high-quality deliverables.
  • Partner with Data Engineering and BI stakeholders to validate semantic models and report logic (e.g., dataset/model measures, transformations, refresh behavior).
  • Contribute to technical documentation of processes, tools, workflows, and standards.

Leadership & Mentorship

  • Mentor other team members by sharing knowledge, conducting training sessions, and providing guidance on best practices for data testing and quality.
  • Take ownership of complex or high-impact initiatives (e.g., establishing regression strategy, monitoring standards), ensuring timely delivery and alignment with business objectives.

________________________________________

Required Experience/Qualifications

This role is classified as a senior‑level (Tier III) position and requires substantial prior relevant experience, in addition to:

  • Bachelor’s degree in Computer Science, Information Systems, or another relevant field, or equivalent professional experience.
  • Demonstrated experience writing automated QA tests, especially for back-end systems without a UI.
  • Expert knowledge of relevant languages, such as SQL, Python, JavaScript, and/or C#.
  • Expert knowledge of test automation tools such as Great Expectations and Cypress.
  • Excellent analytical and problem-solving skills with a high level of attention to detail.
  • Strong communication and collaboration skills to work effectively with cross-functional teams.
  • Experience with CI/CD pipelines and version control systems (e.g., Jenkins, Git).
  • Knowledge of business intelligence tools such as Power BI or Tableau.
  • Understanding of Agile development processes.

________________________________________

Typical Behaviors & Working Style

The ideal candidate will demonstrate the following behavioral traits:

  • Versatile and adaptable, flexing to meet the needs of the situation.
  • People-oriented, even if reserved in nature; helpful and service-oriented, with a strong focus on repeatable, high-quality results.
  • Makes decisions collaboratively but meticulously, weighing facts, established procedures, and proven processes, and drawing on existing knowledge and training.
  • Communicates based on the task or technical need at hand, defining clear team roles; collaboration is minimal but focused on specific tasks or problems.
  • Leads according to specialty or expertise, acting with conviction to uphold quality standards; delegates rarely, and follows up closely when delegation is required.

________________________________________

Preferred Working Environment & Job Characteristics

This role is best suited to someone who thrives in a:

  • Experienced data quality environment, contributing to the validation of pipelines, transformations, and analytics outputs with a strong sense of ownership and care.
  • Quality‑focused setting where accuracy, reliability, and trust in data are highly valued and directly support business decision‑making.
  • Dynamic and collaborative workflow, balancing planned QA initiatives, monitoring, and issue investigation alongside Data Engineering and BI partners.

The ideal candidate will find great satisfaction in:

  • Building and improving automated tests, monitoring, and alerting, helping ensure data issues are identified early and addressed effectively.
  • Exploring and resolving complex data challenges, working across pipelines, models, and reports to understand root causes and improve data quality.
  • Sharing knowledge and leading through expertise, supporting teammates, contributing to standards, and taking ownership of meaningful QA initiatives.

________________________________________

What success in this role looks like

  • Data pipelines, models, and analytics outputs are accurate, reliable, and widely trusted, enabling confident decision‑making across the business.
  • Data quality issues are identified and resolved proactively, through automated testing, monitoring, and efficient root‑cause analysis, with minimal disruption.
  • The data quality function grows stronger over time, through robust QA frameworks, clear documentation, knowledge‑sharing, and ownership of high‑impact initiatives.

________________________________________

Working Conditions

This is a work-from-home position. All required technology will be provided.

________________________________________

Training

  • Orientation via a mix of live remote and pre-recorded video sessions
  • IT security training
  • Internal development process and procedures
  • Company-approved AI technology

________________________________________

Disclaimer 

The above information is designed to indicate the general nature and level of work performed. It is not intended to contain or be interpreted as a comprehensive inventory of all duties, responsibilities, and qualifications required of employees assigned to this job.

________________________________________

Salary

$100,000 - $130,000 USD yearly. Actual compensation within this range will be determined by multiple factors, including candidate experience and expertise.