Job Summary
A blogging and content syndication platform has an open position for a remote (telecommute) Senior Data Engineer.
Must be able to:
- Work on high impact projects that improve data availability and quality, and provide reliable access to data
- Design, architect, and support new and existing data and ETL pipelines, and recommend improvements and modifications
- Create optimal data pipeline architecture and systems
Skills and Requirements Include:
- 5 years of experience implementing complex ETL pipelines, preferably with Hadoop or Spark
- Experience writing complex SQL and ETL processes
- Exceptional coding and design skills, particularly in Java/Scala and Python
- Experience working with large data volumes, including processing, transforming, and transporting data at scale
- Hands-on experience with AWS services such as EC2, SQS, SNS, RDS, Cache, etc.
- Strong understanding and practical use of algorithms and data structures