Contract: Senior Data Engineer
Upwork · Remote - Latin America · Posted 11 February 2026
Job Description
<p>Upwork ($UPWK) is the world’s work marketplace. We serve everyone from one-person startups to over 30% of the Fortune 100 with a powerful, trust-driven platform that enables companies and talent to work together in new ways that unlock their potential.</p>
<p>Last year, more than $3.8 billion of work was done through Upwork by skilled professionals who are gaining more control by finding work they are passionate about and innovating their careers.</p>
<p>This is an engagement through Upwork’s Hybrid Workforce Solutions (HWS) Team, a global group of professionals, located all over the world, who support Upwork’s business.</p>
<hr>
<p>This hybrid engagement will help build and operate Data Platform as a Service capabilities for internal teams. The role focuses on enabling scalable, secure, reliable, and well-governed data products through platform engineering practices—CI/CD for data, data mesh enablement, automation, observability, and self-service workflows. This engineer will partner closely with data engineering, analytics, and AI teams to improve platform reliability, developer experience, and time-to-delivery.</p>
<h3><strong>Work/Project Scope:</strong></h3>
<ul>
<li>Build and operate platform services that enable teams to deliver data products reliably (pipelines, transformations, orchestration, metadata, governance).</li>
<li>Design and implement CI/CD for data (tests, deployments, promotion workflows, rollback strategies, versioning).</li>
<li>Improve data platform reliability through observability, SLAs/SLOs, alerting, incident response, and runbooks.</li>
<li>Enable data mesh patterns: domain ownership, standardized interfaces, reusable templates, and paved paths.</li>
<li>Develop internal tooling and automation for onboarding datasets, creating standardized pipelines, and enforcing best practices (quality, security, lineage).</li>
<li>Implement or enhance data quality and validation frameworks (contract testing, reconciliation, anomaly detection).</li>
<li>Optimize platform performance and cost (warehouse optimization, job efficiency, resource scaling).</li>
<li>Collaborate with Security/Compliance to ensure encryption, access control, auditability, and least-privilege practices.</li>
<li>Partner with AI teams to ensure data products are fit for AI/ML workloads (feature readiness, dataset versioning, reproducibility, governance).</li>
<li>Improve and maintain Airflow orchestration, including DAG design, dependency management, and operational reliability for dbt and analytics workflows.</li>
</ul>
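<p>For a concrete sense of the validation work described above (contract testing and source-to-target reconciliation), here is a minimal Python sketch. It is illustrative only: the column contract and function names are hypothetical, not part of Upwork’s actual framework.</p>

```python
# Illustrative data-quality sketch: a simple column contract check plus a
# row-count reconciliation. All names and thresholds here are hypothetical.

# Contract: each row must contain these columns with these types.
REQUIRED_COLUMNS = {"user_id": int, "event_ts": str, "amount": float}


def validate_rows(rows):
    """Return the rows that violate the contract (missing or mistyped fields)."""
    bad = []
    for row in rows:
        for col, expected_type in REQUIRED_COLUMNS.items():
            if col not in row or not isinstance(row[col], expected_type):
                bad.append(row)
                break  # one violation is enough to flag the row
    return bad


def reconcile_counts(source_count, target_count, tolerance=0.0):
    """Source-vs-target reconciliation: counts must match within a relative tolerance."""
    if source_count == 0:
        return target_count == 0
    return abs(source_count - target_count) / source_count <= tolerance
```

<p>In practice, checks like these would run as steps in the orchestrated pipeline (e.g., as tasks gating promotion in the CI/CD workflow), failing the run and alerting on-call when a contract or reconciliation check does not pass.</p>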
<h3><strong>Must Haves (Required Skills):</strong></h3>
<ul>
<li>Strong software engineering foundation building production systems (Python and/or Rust preferred; strong APIs/services mindset).</li>
<li>Proven experience in data platform engineering (not just building pipelines—building platforms for others).</li>
<li>Hands-on experience with CI/CD, Infrastructure as Code, and automation.</li>
<li>Experience with observability and reliability engineering (metrics, logs, tracing, SLOs, on-call readiness).</li>
<li>Strong knowledge of modern data ecosystem patterns (data modeling, orchestration, warehousing/lakehouse concepts).</li>
<li>Practical experience enabling data mesh or self-service platform capabilities.</li>
<li>Ability to navigate ambiguity, drive delivery, and influence standards.</li>
</ul>
<h3><strong>Preferred Background/Experience:</strong></h3>
<ul>
<li>Experience with Snowflake and modern orchestration/testing patterns (dbt/SQLMesh-like workflows, St ... (truncated, view full listing at source)