Maya Rivera
Senior SDET
Austin, TX • maya.rivera@gmail.com • +1 (512) 555-0142
Profile Summary
- Senior SDET / QA Automation Engineer with 7+ years of experience designing test frameworks across SaaS, fintech, and developer tooling, specializing in shift-left automation, performance engineering, and quality signals at scale.
- Comfortable across the testing stack with UI automation (Playwright, Cypress, Selenium), API testing (REST Assured, Karate, Postman/Newman), performance testing (k6, JMeter, Gatling), contract testing (Pact, WireMock), and CI/CD orchestration (GitHub Actions, Jenkins, Docker, Testcontainers).
- Strong grounding in test architecture, contract-first integration, page-object plus fixture patterns, and the test pyramid; applies risk-based prioritization and shift-left automation to drive shorter feedback loops, lower flakiness, and stronger release confidence at scale.
- Works cross-functionally with backend, frontend, DevOps, and product teams in a fully agile environment, contributing to release planning, on-call coverage, and post-mortems.
- Biased toward reliable test signals over raw volume; mentors junior SDETs and developers learning to write better tests, and contributes regularly to internal testing playbooks and quality engineering forums.
Technical Skills
- Languages:
- Java, TypeScript, Python, Kotlin, SQL, Bash, Groovy
- UI & E2E Automation:
- Playwright, Cypress, Selenium WebDriver, WebdriverIO, Appium, Espresso, XCUITest, Percy
- API & Contract Testing:
- REST Assured, Karate, Postman/Newman, Pact, Spring Cloud Contract, gRPC testing, schema validation
- Performance & Load Testing:
- k6, Gatling, JMeter, Locust, Artillery, soak and stress profiles, SLO baselining
- CI/CD & Containers:
- GitHub Actions, Jenkins, GitLab CI, CircleCI, Docker, Testcontainers, Kubernetes, parallel sharding
- Mocking & Service Virtualization:
- WireMock, MockServer, Mountebank, Hoverfly, stub-runner, golden-file replay
- Test Data & Environments:
- Faker + factory builders, ephemeral Kubernetes namespaces, anonymized snapshots, seed scripts, on-demand previews
- Quality Engineering Method:
- shift-left automation, test pyramid and risk-based prioritization, contract-first integration, flaky-test triage, escape-rate analysis, Datadog and Grafana dashboards
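The risk-based prioritization listed above can be made concrete with a small scoring sketch. This is an illustrative model, not a specific framework's API: the weights, field names, and `risk_score`/`prioritize` helpers are all hypothetical.

```python
# Hypothetical risk-based test prioritization: rank tests by historical
# failure rate and proximity to changed code. Weights and field names
# are illustrative assumptions, not taken from any real tool.

def risk_score(failure_rate: float, touches_changed_code: bool,
               minutes_since_last_run: int) -> float:
    """Higher score = run earlier in the pipeline."""
    score = 10.0 * failure_rate               # historically failing tests first
    if touches_changed_code:
        score += 5.0                          # shift-left: cover the diff early
    score += min(minutes_since_last_run / 60.0, 3.0)  # stale signal gains urgency
    return score

def prioritize(tests: list) -> list:
    """Return test names ordered from highest to lowest risk."""
    return [t["name"] for t in sorted(
        tests,
        key=lambda t: risk_score(t["failure_rate"], t["touches_changed_code"],
                                 t["minutes_since_last_run"]),
        reverse=True,
    )]
```

In practice a scheduler like this feeds the parallel-sharding layer, so the riskiest shards report first and a red build fails fast.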
Education
Work Experience
- Owned the payment platform's test automation framework, supporting 150+ microservices plus mobile and web checkout flows; led end-to-end design and operation of the framework architecture, test orchestration, and quality signals in a Kubernetes-based deployment.
- Architected a Java + TypeScript test framework on top of Playwright, REST Assured, and Testcontainers, applying parallel sharding and container-per-test isolation to bring full-pipeline regression from 38 minutes to ~12 minutes across 210 services.
- Drove end-to-end UI coverage for the checkout, dashboard, and merchant onboarding flows with Playwright and visual diffing via Percy, applying a page-object-plus-fixtures pattern to cut the flaky-test rate from 9.2% to ~1.4% over two quarters.
- Built API-layer test suites with REST Assured and Karate against 150+ gRPC and REST endpoints, using schema validation, contract assertions, and golden-file replay to catch ~78% of backend regressions before they reached integration.
- Designed load, stress, and soak tests with k6 and Gatling for the authorization API and webhooks pipeline, baselining against 2,500 RPS and p99 < 180 ms SLOs and surfacing four scaling bottlenecks during pre-launch hardening.
- Stood up service virtualization with WireMock and contract testing via Pact between the payments service and 6 downstream consumers, decoupling release cycles and cutting cross-team blocker tickets by ~52%.
- Built quality dashboards in Datadog and Grafana tracking flakiness, escape rate, MTTR-to-test-fix, and coverage, partnering with 8 squads to drive flaky tests down 65% and cut the escape rate from 3.4% to 0.9% in three quarters.
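The flaky-rate and escape-rate signals on dashboards like those above reduce to simple ratios over raw run data. A minimal sketch, assuming hypothetical record shapes (not a real Datadog or Grafana schema):

```python
# Illustrative quality-signal math behind flaky-rate and escape-rate
# dashboards. Record fields ("test", "commit", "passed") are assumed.

def flaky_rate(runs: list) -> float:
    """Share of tests that both passed and failed on the same commit."""
    by_test = {}
    for r in runs:  # r = {"test": ..., "commit": ..., "passed": bool}
        by_test.setdefault((r["test"], r["commit"]), set()).add(r["passed"])
    all_tests = {test for (test, _) in by_test}
    flaky = {test for (test, _), outcomes in by_test.items() if len(outcomes) == 2}
    return len(flaky) / len(all_tests) if all_tests else 0.0

def escape_rate(bugs_found_in_prod: int, bugs_found_total: int) -> float:
    """Defects that escaped to production as a share of all defects found."""
    return bugs_found_in_prod / bugs_found_total if bugs_found_total else 0.0
```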
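SLO baselining of the kind used for the authorization API boils down to comparing observed latency percentiles against a target, the way a k6 or Gatling threshold would. A pure-Python stand-in (not k6's actual API; function names are illustrative):

```python
# Illustrative SLO baseline check: compute p99 from latency samples and
# compare against a target threshold. Not k6's real API.

def percentile(samples: list, pct: float) -> float:
    """Nearest-rank percentile; sufficient for pass/fail threshold checks."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def meets_slo(latencies_ms: list, p99_target_ms: float = 180.0) -> bool:
    """True when observed p99 latency is within the SLO target."""
    return percentile(latencies_ms, 99) <= p99_target_ms
```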
- Defined the test strategy across the Jira mobile team's unit, integration, contract, and E2E layers, shifting ~60% of test investment into the integration and contract tiers and shortening release-candidate validation from 5 days to ~2 days.
- Built E2E coverage for Android and iOS Jira mobile with Appium and WebdriverIO, applying device-farm parallelization on BrowserStack to take regression runtime from 2 hours to ~35 minutes across 48 device-OS combinations.
- Wrote API contract tests with Pact and Postman/Newman against 40+ Atlassian Cloud APIs, applying provider verification in CI to catch breaking changes before they shipped, eliminating ~7 hotfix releases per quarter.
- Triaged and root-caused 600+ flaky-test failures across the mobile suite, partnering with feature teams to drive the flaky rate from 12% to ~3% and embedding a shift-left review checklist into PR templates.
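Triage at the 600-failure scale described above usually starts by bucketing failures by error signature so hundreds of raw messages collapse into a handful of root-cause groups. A hedged first-pass sketch; the normalization rules are illustrative assumptions:

```python
import re
from collections import Counter

# First-pass flaky-failure triage: collapse failure messages into
# signatures so many raw failures reduce to a few root-cause buckets.
# The normalization rules below are illustrative, not exhaustive.

def signature(message: str) -> str:
    sig = message.strip().lower()
    sig = re.sub(r"0x[0-9a-f]+", "<addr>", sig)  # mask pointer/hex values
    sig = re.sub(r"\d+", "<n>", sig)             # mask ids, ports, timings
    return sig

def triage(failures: list, top: int = 5) -> list:
    """Return the most common failure signatures, largest bucket first."""
    return Counter(signature(f) for f in failures).most_common(top)
```

Buckets with high counts are usually environmental (timeouts, port clashes); singletons are more often genuine product or test-logic bugs.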
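The contract tests above follow the consumer-driven pattern: the consumer records the response shape it depends on, and provider verification in CI asserts each new build still satisfies it. A minimal sketch in that spirit; the contract format and `verify_contract` helper are assumptions, not Pact's real API:

```python
# Minimal consumer-driven contract check in the spirit of Pact provider
# verification: assert the provider's actual response still satisfies
# the shape the consumer recorded. Contract format here is assumed.

def verify_contract(contract: dict, actual: dict) -> list:
    """Return a list of violations; an empty list means the contract holds."""
    problems = []
    for field, expected_type in contract["response_fields"].items():
        if field not in actual:
            problems.append(f"missing field: {field}")
        elif not isinstance(actual[field], expected_type):
            problems.append(f"wrong type for {field}: {type(actual[field]).__name__}")
    return problems
```

Running a check like this per provider build is what lets breaking changes fail CI instead of shipping as hotfixes.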