Hadoop Developer
Location: Remote
Compensation: Salary
Reviewed: Sun, May 17, 2026
This job expires in: 30 days
Job Summary
We are seeking an experienced Hadoop Developer to design and operate large-scale data processing pipelines in a full-time, remote position.
Key Responsibilities
- Design, develop, and operate end-to-end big-data pipelines on Hadoop
- Build robust ETL/ELT workflows and high-throughput streaming data pipelines
- Implement data governance, monitoring, and logging strategies for big-data pipelines
Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related technical discipline
- Five or more years of experience designing and operating big-data pipelines on Hadoop
- Strong expertise with Apache Spark and the broader Hadoop ecosystem
- Hands-on experience with streaming data technologies such as Kafka or Spark Streaming
- Solid understanding of distributed systems concepts and strong scripting skills in Python or Shell