Senior Data Collection Engineer

Location: Remote
Compensation: To Be Discussed
Reviewed: Fri, May 09, 2025
This job expires in: 18 days
Skills: Scrapy, CI/CD, Git, AWS

Job Summary

A company is looking for a Senior Data Collection Engineer.

Key Responsibilities
  • Design and build robust web crawlers for high-scale data extraction using Scrapy
  • Enhance and maintain infrastructure for automated testing, deployment, and monitoring of spiders
  • Ensure data integrity and accuracy through robust validation mechanisms and collaboration with internal teams

Required Qualifications
  • Experience with Git workflows, code reviews, and CI/CD pipelines
  • Familiarity with cloud infrastructure, preferably AWS
  • Knowledge of web environment standards and networking protocols
  • Proficient in developing scalable web crawlers and data pipelines using Python and Scrapy
  • Prior experience mentoring or leading junior developers is a plus