Spark Engineer
Location: Remote
Compensation: To Be Discussed
Reviewed: Mon, Mar 16, 2026
This job expires in: 30 days
Job Summary
A company is looking for a Spark Engineer to join their team remotely.
Key Responsibilities
- Develop and optimize scalable analytics platforms using Apache Spark and related technologies
- Build and maintain ETL pipelines and manage data processing workflows
- Collaborate with cross-functional teams to enhance data solutions and tune performance
Required Qualifications
- 3 to 5 years of hands-on experience with Apache Spark, PySpark, and Spark SQL
- Proficient in Scala or Python and familiar with the Hadoop ecosystem
- Experience with cloud platforms such as AWS EMR or Azure Databricks
- Knowledge of CI/CD pipelines and version control using Git
- Ability to pass a coding test as part of the application process