
Cloud Engineer / Remote / Full-Time Hire

Full Time • Fully Remote - US
Hiring on behalf of a client for the role of Cloud Engineer.
We are seeking skilled Data Engineers to support a high-impact enterprise data migration initiative: migrating data warehouse assets and ETL pipelines from Teradata to Google Cloud Platform (GCP). The role involves hands-on development, testing, and optimization of data pipelines and warehouse structures in GCP, ensuring minimal disruption and maximum performance.

Key Responsibilities:

• Lead and execute the migration of data and ETL workflows from Teradata to GCP services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).
• Analyze and map existing Teradata workloads to the appropriate GCP equivalents.
• Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery).
• Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.
• Develop automated workflows for data movement and transformation using GCP-native tools and/or custom Python scripts (a minimal extract-and-load sketch follows this list).
• Optimize data storage, query performance, and costs in the cloud environment.
• Implement monitoring, logging, and alerting for all migration pipelines and production workloads.
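For illustration, here is a minimal sketch of one such extract-and-load step. It assumes the teradatasql driver and the google-cloud-bigquery client library; the hostname, credentials, and table names are placeholders, and a production pipeline would handle schemas, batching, and retries explicitly.

import csv
import io

import teradatasql                  # official Teradata Python driver
from google.cloud import bigquery   # GCP BigQuery client library

TD_HOST, TD_USER, TD_PASS = "td-prod.example.com", "migrator", "********"  # placeholders
BQ_TABLE = "my-project.analytics.orders"                                   # placeholder

def copy_table(td_table: str, bq_table: str = BQ_TABLE) -> None:
    """Copy one Teradata table into BigQuery via an in-memory CSV buffer."""
    buf = io.StringIO()
    writer = csv.writer(buf)

    # Extract: read rows from Teradata through the standard DB-API cursor.
    with teradatasql.connect(host=TD_HOST, user=TD_USER, password=TD_PASS) as con:
        with con.cursor() as cur:
            cur.execute(f"SELECT * FROM {td_table}")
            writer.writerows(cur.fetchall())

    # Load: push the CSV buffer into BigQuery.
    client = bigquery.Client()
    job = client.load_table_from_file(
        io.BytesIO(buf.getvalue().encode("utf-8")),
        bq_table,
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            autodetect=True,  # a real migration would map Teradata DDL to an explicit schema
            write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
        ),
    )
    job.result()  # block until the load job finishes

if __name__ == "__main__":
    copy_table("sales.orders")  # hypothetical source table

In practice, a step like this would typically run inside a Composer (Airflow) DAG, or be replaced by a Dataflow job for large tables, rather than ship as a standalone script.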

Required Skills:

• 4 to 6+ years of experience in Data Engineering, with at least 2 years on GCP.
• Strong hands-on experience with Teradata data warehousing, BTEQ, and complex SQL.
• Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
• Experience building ETL/ELT pipelines with custom scripting (Python/Java).
• Proven ability to refactor and translate legacy Teradata logic to GCP.
• Familiarity with CI/CD, Git, Argo CD, and DevOps practices in cloud data environments.
• Strong analytical, troubleshooting, and communication skills.

Preferred Qualifications:

• GCP certification (preferred: Professional Data Engineer).
• Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
• Experience working in the healthcare domain.
• Knowledge of data governance, security, and compliance in cloud ecosystems.

Behavioral Skills:

• Problem-solving mindset
• Attention to detail
• Accountability and ownership
• Curiosity and a commitment to staying current with evolving GCP services

This is a remote position.

Compensation: $90,000.00 per year



