Hadoop Developer

Location: Remote
Compensation: Salary
Reviewed: Sat, May 16, 2026

Job Summary

The Hadoop Developer is a full-time, fully remote position requiring more than five years of experience. The role is responsible for designing and operating large-scale data processing pipelines and analytics platforms on Hadoop.

Key Responsibilities
  • Design, develop, and operate end-to-end big-data pipelines on Hadoop, ingesting data from various sources
  • Build robust ETL/ELT workflows using Apache Spark, Hive, Pig, and Sqoop, ensuring data quality and error handling
  • Develop high-throughput streaming data pipelines and optimize Spark and MapReduce jobs for performance and cost-effectiveness
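To make the batch side of these responsibilities concrete, here is a minimal sketch of the map/shuffle/reduce model that Hadoop MapReduce (and, at a higher level, Spark) distributes across a cluster, shown single-process in plain Python; the function names and sample input are hypothetical, not part of the posting.

```python
from collections import defaultdict

def map_phase(records):
    # Emit (word, 1) pairs, like a Mapper's map() call per input record.
    for line in records:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Group values by key, as the framework's shuffle/sort step does.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum the grouped counts, like a Reducer's reduce() call per key.
    return {key: sum(values) for key, values in grouped.items()}

lines = ["big data on Hadoop", "Spark on Hadoop"]
counts = reduce_phase(shuffle(map_phase(lines)))
# e.g. counts["hadoop"] == 2
```

In a real pipeline the same three stages run in parallel across HDFS blocks; optimizing that distribution (partitioning, combiners, shuffle volume) is the performance work the last bullet refers to.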

Required Qualifications
  • Bachelor's degree in Computer Science, Engineering, or a related technical discipline
  • Five or more years of professional experience designing and operating big-data pipelines on Hadoop
  • Strong hands-on expertise with Apache Spark (Scala, Python, or Java) in production environments
  • Solid experience with Hive, HDFS, Sqoop, HBase, and the broader Hadoop ecosystem
  • Hands-on experience with streaming data platforms such as Kafka, Spark Streaming, or Flink
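The streaming qualification above centers on windowed aggregation over an unbounded event stream, which platforms such as Kafka with Spark Streaming or Flink perform at scale. A minimal single-process sketch of a tumbling-window count, with hypothetical timestamps and window size, might look like:

```python
from collections import Counter

WINDOW_SECONDS = 10  # hypothetical tumbling-window width

def tumbling_window_counts(events):
    # events: iterable of (epoch_seconds, key) pairs.
    # Assign each event to the window containing its timestamp and
    # count keys per window, as a streaming engine's window operator does.
    windows = {}
    for ts, key in events:
        window_start = ts - (ts % WINDOW_SECONDS)
        windows.setdefault(window_start, Counter())[key] += 1
    return windows

events = [(1, "click"), (4, "click"), (9, "view"), (12, "click")]
result = tumbling_window_counts(events)
# window [0,10): {"click": 2, "view": 1}; window [10,20): {"click": 1}
```

Production engines add what this sketch omits: out-of-order events, watermarks, state checkpointing, and exactly-once delivery from Kafka.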
