Permanent Remote Python Data Developer Jobs

  • Senior Python Developer

    A company is looking for a Senior Python Developer to develop and maintain infrastructure and data processing services. Key Responsibilities: develop and maintain Python-based infrastructure and data processing services; collaborate with C++ developers to ensure seamless integration and optimal performance. Required Qualifications: strong proficiency in Python, with experience…

    Python C++ Apache Kafka SQL
  • Data Visualization Developer

    A company is looking for a Data Visualization Developer II. …gather requirements and iterate on reporting solutions. Required Qualifications: Bachelor's degree in Computer Science, Computer Information Systems, Business Management, or a related field, or equivalent relevant experience; 2-4 years of experience in data…

    Power BI DAX Power Query Data Modeling
  • ETL Data Integration Developer

    A company is looking for a Talend / ETL Data Integration Developer II to work remotely.

    Talend ETL Microsoft SQL Java
  • Senior Master Data Management Developer

    A company is looking for a Senior Master Data Management Developer to provide full life cycle support for data resources development and enhancement.

    Informatica MDM IDD Configuration Oracle DB2 AS400
  • Senior Data Engineer

    …infrastructure. Required Qualifications: minimum 7 years of data engineering experience, with a strong background in pipelines, ingest, and ETL/ELT processes; minimum 5 years of cloud experience across multiple vendors (AWS, Azure, GCP); advanced competency in Python…for API, web, and data development; expert-level knowledge of databases, including Oracle, Postgres, and MySQL; deep expertise in AWS cloud services and architecture.

    Data Engineering Data Ingestion ETL ELT
  • Senior Software Engineer

    Key Responsibilities: translate requirements into reliable, scalable microservices using languages such as Go, Python, Java, or C#; develop data-driven services utilizing data warehousing, big data, analytics, and machine learning; participate in Agile…

    Go Python Java C#
  • Principal Market Analyst

    …sets to develop models and insights regarding market trends and collaborate with various teams on revenue forecasts. Required Qualifications: Bachelor's degree in engineering, economics, or a related field (advanced degree preferred); 7+ years of experience…in optimization, energy markets, quantitative analysis, and/or BESS analysis; proficiency in Python and SQL, with strong data analysis techniques; strong knowledge of energy markets and locational marginal price formation; demonstrated ability to manage…

    Back-cast Analyses Market Analysis Bidding Behaviors Data-driven Models
  • Principal Software Engineer

    Key Responsibilities: lead the Data Architecture domain and influence enterprise-wide data strategies; collaborate with cross-functional teams to develop and deliver data platforms and services; design and implement data models and pipelines optimized…for AI/ML use cases. Required Qualifications: 12+ years of experience in software development, data engineering, and systems development; 8+ years of experience in data architecture and data modeling; a degree in engineering, computer science, or a related…

    Data Architecture Data Modeling Data Pipelines Semantic Layer
  • Senior Data Engineer

    Key Responsibilities: develop and enhance reports and data interfaces using various programming languages and tools; convert SAS code to efficient Python code and assist with SAS migration and administration; design and develop ETL jobs and automate… Experience with ETL development and data warehousing; proficiency in Python, SAS, Teradata, Informatica, and SQL programming; familiarity with UNIX/Linux shell scripting and job automation tools.

    SAS Python Teradata SQL Power BI
  • Staff Data Engineer

    Qualifications: at least 7 years of data engineering experience; experience building and optimizing data pipelines using Trino, Spark, and dbt; experience managing data infrastructure in public clouds; fluency in SQL and excellent coding ability in Java, Python…, or Scala; knowledge of data modeling techniques suitable for modern data lakes.

    Data Engineering Data Lake Trino Spark