Job Summary
A technology company that specializes in cybersecurity has an open position for a Remote Senior Big Data Engineer.
Core Responsibilities of this position include:
- Writing PySpark jobs to process billions of events per day
- Fine-tuning existing Hadoop/Spark clusters
- Rewriting some existing Apache Pig jobs in PySpark
Candidates must meet the following requirements for consideration:
- BS degree in Computer Science or related field
- 7+ years of relevant work experience
- Experience in building data pipelines at scale
- Good knowledge of Hadoop, Spark, Apache Kafka, PySpark, Python, and AWS
- Strong programming skills in Python
- Operational experience tuning clusters for optimal data processing