Data Engineer

Location: Remote
Compensation: Piece Work
Reviewed: Wed, Mar 04, 2026
This job expires in: 29 days

Job Summary

A company is looking for a Data Engineer to support a post-merger integration between two tier-1 financial institutions.

Key Responsibilities
  • Design and implement low-latency streaming pipelines using Apache Flink and Kafka for banking transactions
  • Manage data extraction from legacy systems using Postgres WAL-based Change Data Capture (CDC) to ensure data consistency
  • Build and optimize Lakehouse architectures using Databricks and Snowflake, ensuring compliance with financial regulations

Required Qualifications
  • Minimum 5 years of professional experience in Data Engineering, preferably in Financial Services or Fintech
  • Expert-level proficiency in Apache Flink, Kafka, and Databricks
  • Strong hands-on experience with Postgres (CDC) and Snowflake data warehousing
  • Deep understanding of Delta Lake and Lakehouse design principles
  • Proficiency in Python, Scala, or Java for streaming applications

COMPLETE JOB DESCRIPTION

The complete job description is available to Virtual Vocations subscribers.