
Data Engineer

UST
United States, Pennsylvania, Coraopolis
345 Rouser Road
Jan 16, 2026
Role description

Who we are:

At UST, we help the world's best organizations grow and succeed through transformation. Bringing together the right talent, tools, and ideas, we work with our clients to co-create lasting change. Together, with over 30,000 employees in 30+ countries, we build for boundless impact, touching billions of lives in the process.

Summary:

UST is looking for a Senior Data Engineer / Platform Engineer with strong expertise in GCP, Airflow, PySpark, and cloud-native data platforms to build, support, and optimize scalable data pipelines and infrastructure.

The Opportunity:

* Design and develop ETL pipelines for multiple data sources
* Automate and manage data workflows using Airflow, PySpark, and Dataproc on GCP (see the sketch after this list)
* Manage GCP resources including Dataproc clusters, serverless batches, Vertex AI, and GCS
* Provide platform and pipeline support; troubleshoot Spark, BigQuery, and Airflow issues
* Collaborate with data scientists, analysts, and internal stakeholders
* Optimize data platforms for performance, reliability, and cost efficiency
* Perform root cause analysis and implement preventive solutions
* Design, deploy, and maintain CI/CD pipelines using GCP and Airflow
* Set up, monitor, and maintain GCP infrastructure with on-call support
* Plan and execute data migrations and resolve performance bottlenecks
* Administer containerized workloads using Docker, Kubernetes, and Helm
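For illustration only, and not part of the role description: a minimal sketch of the kind of orchestration named above, an Airflow DAG that submits a PySpark ETL job to a Dataproc cluster on GCP using the Google provider operators. The project, region, bucket, cluster, and script names are placeholders.

# Minimal sketch: Airflow DAG submitting a PySpark ETL job to Dataproc on GCP.
# All names (project, cluster, bucket, region, script) are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PYSPARK_JOB = {
    "reference": {"project_id": "example-project"},        # placeholder GCP project
    "placement": {"cluster_name": "example-cluster"},      # existing Dataproc cluster
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/etl.py"},
}

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",  # run the ETL once per day
    catchup=False,
) as dag:
    submit_etl = DataprocSubmitJobOperator(
        task_id="submit_etl",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="example-project",
    )

In practice a DAG like this would be extended with sensors, retries, and alerting, and the same pattern can target Dataproc serverless batches instead of a long-running cluster.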



What you need:

* Strong experience in cloud-based data engineering and platform support
* Ability to independently manage data pipelines, infrastructure, and production issues
* Experience with DevOps, SRE practices, and cloud security standards



Required Skills:

* Programming: Python, SQL
* GCP: BigQuery, Dataproc, Vertex AI, Pub/Sub, Cloud Functions, GCS
* Data Engineering: Apache Airflow, PySpark, SparkSQL, data modeling
* DevOps: CI/CD pipelines, Linux/UNIX administration
* Containers: Docker, Kubernetes, Helm
* Monitoring: Prometheus, Grafana, Splunk



Desired Skills:

* Experience with AWS or Azure in addition to GCP
* Exposure to Agentic AI concepts
* Background in DevOps or Site Reliability Engineering roles

Qualification:

* Bachelor's degree in Engineering, Computer Science, or equivalent

What we believe:

We're proud to embrace the same values that have shaped UST since the beginning. Since day one, we've been building enduring relationships and a culture of integrity. And today, it's those same values that are inspiring us to encourage innovation from everyone, to champion diversity and inclusion and to place people at the centre of everything we do.

Humility:

We will listen, learn, be empathetic and help selflessly in our interactions with everyone.

Humanity:

Through business, we will better the lives of those less fortunate than ourselves.

Integrity:

We honour our commitments and act with responsibility in all our relationships.

Equal Employment Opportunity Statement

UST is an Equal Opportunity Employer. We believe that no one should be discriminated against because of their differences, such as age, disability, ethnicity, gender, gender identity and expression, religion, or sexual orientation.

All employment decisions shall be made without regard to age, race, creed, colour, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.


UST reserves the right to periodically redefine your roles and responsibilities based on the requirements of the organization and/or your performance.

* Support and promote the values of UST.

* Comply with all Company policies and procedures.


Skills

PySpark, GCP, Airflow, Dataproc, ETL, BigQuery, SparkSQL, data modeling, DevOps
