Principal GCP Data Engineer

Job Expired
Location: Remote
Compensation: To Be Discussed
Reviewed: Wed, Jun 18, 2025

Job Summary

A company is looking for a Principal GCP Data Engineer to support data initiatives and build data pipelines.

Key Responsibilities
  • Design and build data ingestion and ETL pipelines from scratch using SnapLogic, Python, SQL, Dataflow, and Spark
  • Migrate data to GCP and build out the GCP BigQuery warehouse while sunsetting legacy ETL processes
  • Support data modeling and orchestration, focusing on the development of new DAGs using Airflow or Cloud Composer

Required Qualifications
  • 5-6 years of experience in data engineering with a focus on building data pipelines
  • Experience with GCP technologies, particularly BigQuery, as well as the SnapLogic integration platform
  • Proficiency in orchestration tools such as Airflow or Cloud Composer
  • Familiarity with data warehousing fundamentals and data modeling
  • Experience with additional technologies such as Kafka, Java, Apache Beam, or Alteryx is a plus
