Spark Engineer

Location: Remote
Compensation: To Be Discussed
Reviewed: Mon, Mar 16, 2026
This job expires in: 30 days

Job Summary

A company is looking for a Spark Engineer to join their team remotely.

Key Responsibilities:
  • Develop and optimize scalable analytics platforms using Apache Spark and related technologies
  • Design and implement ETL pipelines and manage data processing workflows
  • Collaborate with cross-functional teams to enhance performance and efficiency of data systems

Required Qualifications:
  • 3 to 5 years of hands-on experience with Apache Spark, PySpark, and Scala/Python
  • Proficiency in Hadoop ecosystem components, including HDFS and YARN
  • Experience with cloud platforms such as AWS EMR or Azure Databricks
  • Familiarity with CI/CD pipelines and version control systems like Git
  • Ability to pass a coding test to demonstrate technical proficiency
