GCP Data Engineer

Location: Remote
Compensation: To Be Discussed
Reviewed: Fri, Jan 16, 2026
This job expires in: 30 days

Job Summary

A company is looking for a GCP Data Engineer (Snowflake, Airflow, Agent Development) - Remote.

Key Responsibilities
  • Develop an understanding of the data environment through profiling and analysis to enhance data quality
  • Build Python-based solutions for data extraction, cleansing, transformation, and validation to support data migration
  • Document data integration processes, ensure traceability, and develop data monitoring solutions
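The extraction, cleansing, transformation, and validation steps listed above can be sketched as a small Python pipeline. This is a minimal illustration using only the standard library; the field names and sample data are hypothetical, not taken from the posting.

```python
# Minimal extract-cleanse-transform-validate sketch (hypothetical fields).
import csv
import io

RAW = """id,amount,region
1, 100 ,us-east
2,,eu-west
3,250,us-east
"""

def extract(text):
    # Extraction: parse delimited text into row dicts.
    return list(csv.DictReader(io.StringIO(text)))

def cleanse(rows):
    # Cleansing: strip whitespace and drop rows missing an amount.
    cleaned = []
    for row in rows:
        amount = (row["amount"] or "").strip()
        if amount:
            cleaned.append({"id": row["id"].strip(),
                            "amount": amount,
                            "region": row["region"].strip()})
    return cleaned

def transform(rows):
    # Transformation: cast amounts to integers.
    return [{**row, "amount": int(row["amount"])} for row in rows]

def validate(rows):
    # Validation: assert basic invariants before loading downstream.
    assert all(row["amount"] > 0 for row in rows)
    return rows

records = validate(transform(cleanse(extract(RAW))))
```

In practice each stage would typically run as its own task in an orchestrator such as Airflow, with the validation step emitting metrics for data-quality monitoring.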

Required Qualifications
  • Bachelor's Degree in Computer Science or equivalent experience preferred
  • 4 years of experience in software engineering with a strong focus on Python and data engineering
  • 3 years of development experience and proficiency with Relational Databases, NoSQL, and/or Data Lakehouses
  • Intermediate proficiency in cloud technologies, including Google Cloud Platform, required
  • Experience with workflow orchestration tools such as Airflow, Dagster, or Prefect
