Filters Applied
- Permanent (219)
- Temporary (1)
- Independent Contractor (24)
Information Technology / Permanent Remote Python Data Jobs
Senior DataOps Engineer
- Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, IT, or related field
- Experience with AWS services and big data tools
- Hands-on experience with Infrastructure as Code, Kubernetes, and CI/CD tools
- Strong SQL coding and data modeling skills
- Python programming skills
Senior Software Engineer
- …technical organization with impactful work
- Experience with Kubernetes APIs and frameworks, not just cluster operations
- BS in Computer Science, Engineering, Physics, Mathematics, or equivalent experience
- Proficiency in a systems programming language (Go, Python...) and understanding of data structures and algorithms
- Experience with large-scale production systems and software engineering principles
Senior Software Engineer
- …large-scale production systems
- Experience with bare metal hardware APIs and frameworks, preferably on GPU servers
- Possess a BS in Computer Science, Engineering, Physics, Mathematics, or a comparable degree
- Proficient in a systems programming language (Go, Python...) and understanding of data structures and algorithms
- Demonstrated impact from previous work in a highly technical organization
Data Scientist
- …of machine learning models and translate complex findings for non-technical stakeholders

Required Qualifications
- Bachelor's degree in a quantitative field such as statistics, mathematics, or computer science
- 1-4 years of related experience in a data role
- Strong skills in Python for data manipulation and modeling
- Expertise in SQL for querying and building data pipelines
- Experience with data visualization tools such as Power BI, Looker, or Tableau
Senior Data Engineer
Key Responsibilities
- Design, build, and maintain ETL pipelines to extract data from Hubspot and process it in AWS
- Work with large datasets using Python and raw SQL, ensuring data quality and performance
- Collaborate with teams to ensure data availability
Data Architect
- …equivalent experience
- 7+ years of experience in data warehousing and engineering
- 3+ years of experience in Data Modeling and ETL tools
- Hands-on experience with modern data platforms like Snowflake or AWS Redshift
- 3+ years of experience utilizing Python for data engineering solutions
Mid-Level Data Engineer
- …performance
- Collaborate with cross-functional teams to deliver datasets for financial models and product launches

Required Qualifications
- 3-6 years in a data engineering role, with experience in DeFi, fintech, or a related field
- Extensive experience with Python and SQL
- Experience with data warehousing solutions such as Snowflake, BigQuery, or Redshift
- Strong understanding of Google Cloud Platform and data governance standards
- Hands-on experience with blockchain or crypto data tools
Associate AI Engineer
- …degree in Computer Science or a related field
- 1-3 years of experience in AI, software engineering, MLOps, DevOps, or data engineering
- Familiarity with Agile methodologies and project management tools like Jira
- Experience with Azure DevOps, CI/CD, and Python
- Knowledge of Big Data technologies and open-source AI/ML frameworks
Data Engineer
Key Responsibilities
- Build and maintain robust ETL pipelines from scratch
- Work with large, complex datasets using Python and raw SQL
- Deliver data solutions that drive business outcomes for enterprise clients

Required Qualifications
- 3+ years of experience as a Data Engineer
- Strong Python and SQL skills
- Hands-on experience with PySpark, AWS, and Databricks
- Proven experience building ETL pipelines
Data Engineer
Key Responsibilities
- Build and maintain robust ETL pipelines from scratch
- Work with large, complex datasets using Python and raw SQL
- Deliver data solutions that drive business outcomes for enterprise clients

Required Qualifications
- 7+ years of experience as a Data Engineer
- Strong Python and SQL skills
- Hands-on experience with PySpark, AWS, and Databricks
- Proven experience building ETL pipelines