Airflow Remote Jobs

102 Results

+30d

Senior Data Engineer

Devoteam – Tunis, Tunisia, Remote
airflow sql scrum

Devoteam is hiring a Remote Senior Data Engineer

Job Description

Within the "Data Platform" division, the consultant will join a SCRUM team and focus on a specific functional scope.

Your role will be to contribute to data projects by bringing your expertise to the following tasks:

  • Design, develop, and maintain robust, scalable data pipelines on GCP, using tools such as BigQuery, Airflow, Looker, and DBT.
  • Collaborate with business teams to understand data requirements and design appropriate solutions.
  • Optimize the performance of SQL queries and ETL processes to ensure fast response times and scalability.
  • Implement data-quality processes to guarantee data integrity and consistency.
  • Work closely with engineering teams to integrate data pipelines into existing applications and services.
  • Stay up to date with new technologies and best practices in data processing and analytics.
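As a purely illustrative sketch (not part of this posting), the data-quality bullet above often amounts to automated checks run after each pipeline load. Here SQLite stands in for BigQuery, and the table and check names are invented:

```python
import sqlite3

# Toy orders table standing in for a BigQuery source (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "alice", 10.0), (2, "bob", None), (3, "alice", 5.0)],
)

def run_quality_checks(conn):
    """Return a dict of check name -> number of offending rows."""
    checks = {
        # Integrity: no NULL amounts should reach downstream models.
        "null_amount": "SELECT COUNT(*) FROM orders WHERE amount IS NULL",
        # Consistency: the primary key must be unique.
        "duplicate_id": """
            SELECT COALESCE(SUM(n - 1), 0)
            FROM (SELECT COUNT(*) AS n FROM orders GROUP BY id HAVING COUNT(*) > 1)
        """,
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

results = run_quality_checks(conn)
print(results)  # one NULL amount, no duplicate ids
```

In an Airflow deployment, a task like this would typically run downstream of the load step and fail the DAG run when any count is non-zero.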

Qualifications

  • Master's-level degree (Bac+5) from an engineering school or equivalent university program, with a specialization in computer science.
  • At least 3 years of experience in data engineering, including significant experience in a GCP-based cloud environment.
  • GCP (Google Cloud Platform) certification is a plus.
  • Excellent written and verbal communication (high-quality deliverables and reporting).

See more jobs at Devoteam

Apply for this job

+30d

Machine Learning Engineer (All Genders)

Dailymotion – Paris, France, Remote
airflow Design docker python

Dailymotion is hiring a Remote Machine Learning Engineer (All Genders)

Job Description

Joining the Dailymotion data team means taking part in the creation of our unique algorithms, designed to bring more diversity and nuance to online conversations.

Our Machine Learning team, established in 2016, has been actively involved in developing models across a diverse range of topics. Primarily, we focus on recommender systems, and extend our expertise to content classification, moderation, and search functionalities.

You will be joining a seasoned and diverse team of Senior Machine Learning Engineers, who possess the capability to independently conceptualize, deploy, A/B test, and monitor their models.

We collaborate closely with the Data Product Team, aligning our efforts to make impactful, data-driven decisions for our users.

Learn more about our ongoing projects: https://medium.com/dailymotion

As a Machine Learning Engineer, you will:

  • Design and deploy scalable recommender systems, handling billions of user interactions and hundreds of millions of videos.
  • Contribute to various projects spanning machine learning domains, encompassing content classification, moderation, and ad-tech.
  • Foster autonomy, taking ownership of your scope, and actively contribute ideas and solutions. Maintain and monitor your models in production.
  • Collaborate with cross-functional teams throughout the entire machine learning model development cycle:
    • Define success metrics in collaboration with stakeholders.
    • Engage in data collection and hypothesis selection with the support of the Data Analysts Team.
    • Conduct machine learning experiments, including feature engineering, model selection, offline validation, and A/B Testing.
    • Manage deployment, orchestration, and maintenance on cloud platforms with the Data Engineering Team.
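As a self-contained sketch of the offline-validation step mentioned above (the data and names are illustrative, not Dailymotion's actual pipeline), recommender evaluation often reduces to ranking metrics such as precision@k over held-out interactions:

```python
def precision_at_k(recommended, relevant, k):
    """Fraction of the top-k recommended items the user actually interacted with."""
    top_k = recommended[:k]
    hits = sum(1 for item in top_k if item in relevant)
    return hits / k

# Held-out interactions for one user (invented video ids).
recommended = ["v1", "v7", "v3", "v9", "v2"]
relevant = {"v3", "v2", "v8"}

print(precision_at_k(recommended, relevant, 3))  # 1 hit ("v3") in the top 3
```

In practice this is averaged over many users offline, and a model that wins offline is then confirmed with an A/B test before full rollout.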

Qualifications

  • Master's degree/PhD in Machine Learning, Computer Science, or a related quantitative field.
  • At least 1 year of professional experience working with machine learning models at scale (experience with recommender systems is a plus).
  • Proficient in machine learning concepts, with the ability to articulate theoretical concepts effectively.
  • Strong coding skills in Python & SQL.
  • Experience in building production ML systems; familiarity with technologies such as GitHub, Docker, Airflow, or equivalent services provided by GCP/AWS/Azure.
  • Experience with distributed frameworks is advantageous (Dataflow, Spark, etc.).
  • Strong business acumen and excellent communication skills in both English and French (fluent proficiency).
  • Demonstrated aptitude for autonomy and proactivity is highly valued.

See more jobs at Dailymotion

Apply for this job

+30d

Staff Site Reliability Engineer

Mozilla – Remote US
6 years of experience terraform airflow sql Design ansible azure java c++ openstack docker elasticsearch kubernetes jenkins python AWS backend Node.js

Mozilla is hiring a Remote Staff Site Reliability Engineer


Why Mozilla?

Mozilla Corporation is the non-profit-backed technology company that has shaped the internet for the better over the last 25 years. We make pioneering brands like Firefox, the privacy-minded web browser, and Pocket, a service for keeping up with the best content online. Now, with more than 225 million people around the world using our products each month, we’re shaping the next 25 years of technology. Our work focuses on diverse areas including AI, social media, security and more. And we’re doing this while never losing our focus on our core mission – to make the internet better for everyone.

The Mozilla Corporation is wholly owned by the non-profit 501(c) Mozilla Foundation. This means we aren’t beholden to any shareholders — only to our mission. Along with thousands of volunteer contributors and collaborators all over the world, Mozillians design, build and distribute open-source software that enables people to enjoy the internet on their terms.

About this team and role:

Mozilla’s Release SRE Team is looking for a Staff SRE to help us build and maintain infrastructure that supports Mozilla products. You will combine skills from DevOps/SRE, systems administration, and software development to influence product architecture and evolution by crafting reliable cloud-based infrastructure for internal and external services.

As a Staff SRE you will work closely with Mozilla’s engineering and product teams and participate in significant engineering projects across the company. You will collaborate with hardworking engineers across different levels of experience and backgrounds. Most of your work will involve improving existing systems, building new infrastructure, evaluating tools and eliminating toil.

What you’ll do:

  • Manage infrastructure in AWS and GCP
  • Write, maintain, and expand automation scripts, metrics and monitoring tooling, and orchestration recipes
  • Lead other SREs and software development teams to deliver products with an eye on reliability and automation
  • Demonstrate accountability in the delivery of work
  • Spot and raise potential issues to the team
  • Be on-call for production services and infrastructure
  • Be trusted to resolve unclear but urgent tasks

What you’ll bring:

  • Degree and 6 years of experience in backend software development, cloud operations, or DevOps/SRE
  • Experience programming in at least one of the following languages: Python, Java, C/C++, Go, Node.js or Rust. 
  • Involvement in running services in the cloud
  • Kubernetes administration and optimization
  • Proven understanding of database systems (SQL and/or non-relational databases)
  • Infrastructure As Code and Configuration as Code tooling (Puppet, Chef, Ansible, Salt, Terraform, Amazon Cloudformation or Google Cloud Deployment Manager)
  • Strong communication skills
  • Curiosity and interest in learning new things
  • Commitment to our values:
    • Welcoming differences
    • Being relationship-minded
    • Practicing responsible participation
    • Having grit

Bonus points for…

  • CI/CD orchestration (Jenkins, CircleCI, or TravisCI)
  • ETL, data modeling, cloud-based data storage, processing
  • GCP Data Services (Dataflow, BigQuery, Dataproc)
  • Workflow and data pipeline orchestration (Airflow, Oozie, Jenkins, etc)
  • Container orchestration technologies (Kubernetes, OpenStack, Docker swarm, etc)
  • Open source software involvement
  • Monitoring/logging with technologies like Splunk, Elasticsearch, Logstash/Fluentd, or Stackdriver, and time-series databases like InfluxDB

What you’ll get:

  • Generous performance-based bonus plans to all regular employees - we share in our success as one team
  • Rich medical, dental, and vision coverage
  • Generous retirement contributions with 100% immediate vesting (regardless of whether you contribute)
  • Quarterly all-company wellness days where everyone takes a pause together
  • Country specific holidays plus a day off for your birthday
  • One-time home office stipend
  • Annual professional development budget
  • Quarterly well-being stipend
  • Considerable paid parental leave
  • Employee referral bonus program
  • Other benefits (life/AD&D, disability, EAP, etc. - varies by country)

About Mozilla 

Mozilla exists to build the Internet as a public resource accessible to all because we believe that open and free is better than closed and controlled. When you work at Mozilla, you give yourself a chance to make a difference in the lives of Web users everywhere. And you give us a chance to make a difference in your life every single day. Join us to work on the Web as the platform and help create more opportunity and innovation for everyone online.

Commitment to diversity, equity, inclusion, and belonging

Mozilla understands that valuing diverse creative practices and forms of knowledge is crucial to, and enriches, the company’s core mission. We encourage applications from everyone, including members of all equity-seeking communities, such as (but certainly not limited to) women, racialized and Indigenous persons, persons with disabilities, and persons of all sexual orientations, gender identities, and expressions.

We will ensure that qualified individuals with disabilities are provided reasonable accommodations to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment, as appropriate. Please contact us at hiringaccommodation@mozilla.com to request accommodation.

We are an equal opportunity employer. We do not discriminate on the basis of race (including hairstyle and texture), religion (including religious grooming and dress practices), gender, gender identity, gender expression, color, national origin, pregnancy, ancestry, domestic partner status, disability, sexual orientation, age, genetic predisposition, medical condition, marital status, citizenship status, military or veteran status, or any other basis covered by applicable laws.  Mozilla will not tolerate discrimination or harassment based on any of these characteristics or any other unlawful behavior, conduct, or purpose.

Group: C

#LI-REMOTE

Req ID: R2515

Hiring Ranges:

US Tier 1 Locations
$163,000 – $239,000 USD
US Tier 2 Locations
$150,000 – $220,000 USD
US Tier 3 Locations
$138,000 – $203,000 USD

See more jobs at Mozilla

Apply for this job

+30d

Data Manager

remote-first tableau airflow sql python

Parsley Health is hiring a Remote Data Manager

About us:

Parsley Health is a digital health company with a mission to transform the health of everyone, everywhere with the world's best possible medicine. Today, Parsley Health is the nation's largest health care company helping people suffering from chronic conditions find relief with root cause resolution medicine. Our work is inspired by our members’ journeys and our actions are focused on impact and results.

The opportunity:

We’re hiring an experienced Manager of Data to drive the data strategy for Parsley Health by championing quality data across the organization and leading the data science, analytics, and data engineering functions.

This person should have knowledge of the healthcare space, specifically health outcomes and benchmarks, and will report to the Chief Technology Officer.

What you’ll do:

  • Champion our mission to live healthier through revolutionary primary care, with excitement for the future of healthcare and a personal belief in wellness.
  • Collaborate on strategic direction with the leadership team and executives to evolve our mid and long term roadmap
  • Be a hands-on manager who writes code and has experience across a variety of systems and architectures, analysis, and presentation.
  • Support identifying clinical outcomes and publishing papers with the clinical team and SVP of Clinical Operations.
  • Empower high quality product decisions through data analysis.
  • Develop machine learning models to better assist our members’ health care needs.
  • Foster a strong culture of data-driven decision making through training and mentorship within your team and across the company.
  • Implement and maintain a world-class data stack that empowers data consumers with reliable, accessible, compliant insights.
  • Consult with data consumers to improve their measurement strategies.
  • Manage a team of two and grow it into a multidisciplinary function within a few years.

What you’ll need:

  • Experience in building a data strategy for a small team or company. Potentially previously the first data hire at a company (not required). 
  • Proficient in statistical methods.
  • Loves to dive deep into problems and find solutions, identifying root causes and extrapolating a big-picture strategy or story.
  • Helps people with their careers while creating and improving structures that enable career growth
  • Sets up processes and governance around project management, data quality, prioritization, etc.
  • Well versed in SQL, at least one scripting language (R, Python, etc.), Excel, and BI platforms (Looker, Tableau, etc.).

Tech stack

  • Python
  • GCP
  • Airflow
  • SQL
  • Looker
  • Dataform (dbt)

Benefits and Compensation:

  • Equity Stake
  • 401(k) + Employer Matching program
  • Remote-first with the option to work from one of our centers in NYC or LA 
  • Complimentary Parsley Health Complete Care membership
  • Subsidized Medical, Dental, and Vision insurance plan options
  • Generous 4+ weeks of paid time off
  • Annual professional development stipend

Parsley Health is committed to providing an equitable, fair and transparent compensation program for all employees.

The starting salary for this role is between $165,750 - $195,000, depending on skills and experience. We take a geo-neutral approach to compensation within the US, meaning that we pay based on job function and level, not location.

Individual compensation decisions are based on a number of factors, including experience level, skillset, and balancing internal equity relative to peers at the company. We expect the majority of the candidates who are offered roles at our company to fall healthily throughout the range based on these factors. We recognize that the person we hire may be less experienced (or more senior) than this job description as posted. If that ends up being the case, the updated salary range will be communicated with candidates during the process.


At Parsley Health we believe in celebrating everything that makes us human and are proud to be an equal opportunity workplace. We embrace diversity and are committed to building a team that represents a variety of backgrounds, perspectives, and skills. We believe that the more inclusive we are, the better we can serve our members. 


Important note:

In light of the recent increase in hiring scams, if you're selected to move onto the next phase of our hiring process, a member of our Talent Acquisition team will reach out to you directly from an @parsleyhealth.com email address to guide you through our interview process.

Please note:

  • We will never communicate with you via Microsoft Teams
  • We will never ask for your bank account information at any point during the recruitment process, nor will we send you a check (electronic or physical) to purchase home office equipment

We look forward to connecting!

#LI-Remote

See more jobs at Parsley Health

Apply for this job

+30d

Senior AI Scientist (Taiwan)

GOGOX – Remote
airflow sql Design azure api java python AWS

GOGOX is hiring a Remote Senior AI Scientist (Taiwan)


See more jobs at GOGOX

Apply for this job

+30d

Manager, Software Engineering - Data Platform

Samsara – Canada, Remote
Master’s Degree terraform airflow kubernetes AWS

Samsara is hiring a Remote Manager, Software Engineering - Data Platform

Who we are

Samsara (NYSE: IOT) is the pioneer of the Connected Operations™ Cloud, which is a platform that enables organizations that depend on physical operations to harness Internet of Things (IoT) data to develop actionable insights and improve their operations. At Samsara, we are helping improve the safety, efficiency and sustainability of the physical operations that power our global economy. Representing more than 40% of global GDP, these industries are the infrastructure of our planet, including agriculture, construction, field services, transportation, and manufacturing — and we are excited to help digitally transform their operations at scale.

Working at Samsara means you’ll help define the future of physical operations and be on a team that’s shaping an exciting array of product solutions, including Video-Based Safety, Vehicle Telematics, Apps and Driver Workflows, Equipment Monitoring, and Site Visibility. As part of a recently public company, you’ll have the autonomy and support to make an impact as we build for the long term. 

Recent awards we’ve won include:

Glassdoor's Best Places to Work 2024

Best Places to Work by Built In 2024

Great Place To Work Certified™ 2023

Fast Company's Best Workplaces for Innovators 2023

Financial Times The Americas’ Fastest Growing Companies 2023

We see a profound opportunity for data to improve the safety, efficiency, and sustainability of operations, and hope you consider joining us on this exciting journey. 

Click here to learn more about Samsara's cultural philosophy.

About the role:

The Samsara Data Platform team owns and develops the analytic platform across Samsara. As a Manager II of Data Platform, you will build and lead teams that maintain our data lake and surrounding infrastructure. You will also be responsible for meeting new business needs, including expanding the platform as the company grows (both in size and geographic coverage), privacy and security needs, and customer-facing feature developments.

You should apply if:

  • You want to impact the industries that run our world: The software, firmware, and hardware you build will result in real-world impact—helping to keep the lights on, get food into grocery stores, and most importantly, ensure workers return home safely.
  • You want to build for scale: With over 2.3 million IoT devices deployed to our global customers, you will work on a range of new and mature technologies, driving scalable innovation for customers across the industries that power the world's physical operations.
  • You are a life-long learner: We have ambitious goals. Every Samsarian has a growth mindset as we work with a wide range of technologies, challenges, and customers that push us to learn on the go.
  • You believe customers are more than a number: Samsara engineers enjoy a rare closeness to the end user, and you will have the opportunity to participate in customer interviews, collaborate with customer success and product managers, and use metrics to ensure our work is translating into better customer outcomes.
  • You are a team player: Working on our Samsara Engineering teams requires a mix of independent effort and collaboration. Motivated by our mission, we’re all racing toward our connected operations vision, and we intend to win—together.

Click here to learn about what we value at Samsara.

In this role, you will: 

  • Lead a team of data-focused engineers to build and maintain a stable, scalable, and modern data platform capable of handling petabytes of data. 
  • Help drive long-term planning and establish scalable processes for execution
  • Actively contribute to building the data roadmap for Samsara
  • Stay connected to novel technological developments that suit Samsara’s needs.
  • Champion, role model, and embed Samsara’s cultural principles (Focus on Customer Success, Build for the Long Term, Adopt a Growth Mindset, Be Inclusive, Win as a Team) as we scale globally and across new offices
  • Hire, develop and lead an inclusive, engaged, and high-performing international team

Minimum requirements for the role:

  • BS, MS, or PhD in Computer Science or other related technical degree
  • 2+ years of technical people management experience
  • 5+ years of relevant technical experience with data infrastructure
  • Experience building and deploying large-scale data platform systems with feedback loops for continuous improvement
  • Comfortable leading infrastructure development in collaboration with cross functional teams, scientists, and researchers

An ideal candidate also has:

  • MS or PhD in Computer Science or other technical degree
  • Experience with state-of-art data platform technologies such as:
    • AWS (S3 and RDS, SQS, DMS, Dynamo, etc.)
    • Spark (a must); Flink, Trino/Presto (a plus)
    • Data lake file formats such as Delta, Hudi, or Iceberg
    • Python/Scala
    • Container based orchestration services such as Kubernetes, ECS, Fargate, etc.
    • Infrastructure as Code tools (e.g., Terraform)
    • Go is a plus
    • Data orchestration system experience is a plus (e.g., Airflow, Dagster)
  • Proven track record for innovation and delivering value to customers (both internal and external).
  • Demonstrated ability to build cross-functional consensus and drive cross-team collaboration

Samsara’s Compensation Philosophy: Samsara’s compensation program is designed to deliver Total Direct Compensation (based on role, level, and geography) that is at or above market. We do this through our base salary + bonus/variable + restricted stock unit awards (RSUs) for eligible roles. For eligible roles, a new hire RSU award may be awarded at the time of hire, and additional RSU refresh grants may be awarded annually.

We pay for performance, and top performers in eligible roles may receive above-market equity refresh awards which allow employees to achieve higher market positioning.

The range of annual base salary for full-time employees for this position is below. Please note that base pay offered may vary depending on factors including your city of residence, job-related knowledge, skills, and experience.
$142,800 – $184,800 CAD

At Samsara, we welcome everyone regardless of their background. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, gender, gender identity, sexual orientation, protected veteran status, disability, age, and other characteristics protected by law. We depend on the unique approaches of our team members to help us solve complex problems. We are committed to increasing diversity across our team and ensuring that Samsara is a place where people from all backgrounds can make an impact.

Benefits

Full time employees receive a competitive total compensation package along with employee-led remote and flexible working, health benefits, Samsara for Good charity fund, and much, much more. Take a look at our Benefits site to learn more.

Accommodations 

Samsara is an inclusive work environment, and we are committed to ensuring equal opportunity in employment for qualified persons with disabilities. Please email accessibleinterviewing@samsara.com or click here if you require any reasonable accommodations throughout the recruiting process.

Flexible Working 

At Samsara, we embrace a flexible working model that caters to the diverse needs of our teams. Our offices are open for those who prefer to work in-person and we also support remote work where it aligns with our operational requirements. For certain positions, being close to one of our offices or within a specific geographic area is important to facilitate collaboration, access to resources, or alignment with our service regions. In these cases, the job description will clearly indicate any working location requirements. Our goal is to ensure that all members of our team can contribute effectively, whether they are working on-site, in a hybrid model, or fully remotely. All offers of employment are contingent upon an individual’s ability to secure and maintain the legal right to work at the company and in the specified work location, if applicable.

Fraudulent Employment Offers

Samsara is aware of scams involving fake job interviews and offers. Please know we do not charge fees to applicants at any stage of the hiring process. Official communication about your application will only come from emails ending in ‘@samsara.com’ or ‘@us-greenhouse-mail.io’. For more information regarding fraudulent employment offers, please visit our blog post here.

Apply for this job

+30d

Senior Business Intelligence Engineer

Square – San Francisco, CA, Remote
tableau airflow sql Design java mysql python

Square is hiring a Remote Senior Business Intelligence Engineer

Job Description

The BI Team at Cash App enables our teams to make impactful business decisions. Our BI Engineers handle everything from data architecture and modeling to data pipeline tooling and dashboarding. As a Senior BI Engineer at Cash App, you will report to the BI Manager and work with Analysts, Data Scientists, Software Engineers and Product Managers to lay the foundation for analyzing our large, unique dataset. We are an extremely data-driven team - from understanding our customers, managing and operating our business, to informing product development. You will build, curate, document, and manage key datasets and ETLs to increase the impact of the entire team.

You will:

  • Create new data models and optimize existing ones for the most widely used Cash App events, entities, and processes
  • Standardize business and product metric definitions in curated and optimized datasets
  • Build pipelines out of our data warehouse
  • Teach (and encourage) others to self-serve while building tools that make it simpler and faster for them to do so
  • Promote data, analytics, and data model design best practices
  • Create dashboards that help our teams understand the performance of the business and help them make decisions
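As a hypothetical sketch of the "standardize metric definitions in curated datasets" idea above (the table, metric, and event names are invented, and SQLite stands in for Snowflake/BigQuery):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event TEXT, day TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("u1", "payment_sent", "2024-01-01"),
        ("u1", "payment_sent", "2024-01-01"),
        ("u2", "payment_sent", "2024-01-01"),
        ("u3", "app_open", "2024-01-01"),
    ],
)

# A curated view pins the metric definition in one place, so every
# dashboard and analyst query reuses the same "daily active payers" logic.
conn.execute("""
    CREATE VIEW daily_active_payers AS
    SELECT day, COUNT(DISTINCT user_id) AS active_payers
    FROM events
    WHERE event = 'payment_sent'
    GROUP BY day
""")

rows = conn.execute("SELECT day, active_payers FROM daily_active_payers").fetchall()
print(rows)  # [('2024-01-01', 2)]
```

The same pattern, expressed as warehouse views or scheduled ETL tables, is what keeps product and business metrics consistent across consumers.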

Qualifications

You have:

  • Background/knowledge in Computer Science, Applied Math, Engineering, Stats, Physics, or something comparable
  • 5+ years of industry experience building complex, scalable ETLs for a variety of different business and product use cases
  • An interest in advancing Cash App's vision of building products for economic empowerment - this should be something that legitimately excites you

Technologies we use and teach:

  • SQL (MySQL, Snowflake, BigQuery, etc.)
  • Airflow, Looker and Tableau
  • Python and Java

See more jobs at Square

Apply for this job

+30d

Middle Product Analyst at HolyWater

Genesis – Ukraine, Remote
tableau airflow sql B2C Firebase python AWS

Genesis is hiring a Remote Middle Product Analyst at HolyWater

See more jobs at Genesis

Apply for this job

+30d

Junior Analytics Engineer (HolyWater)

Genesis – Kyiv, UA, Remote
tableau terraform airflow sql python

Genesis is hiring a Remote Junior Analytics Engineer (HolyWater)

See more jobs at Genesis

Apply for this job

+30d

Sr. Site Reliability Engineer IV

Signify Health – Dallas, TX, Remote
terraform airflow Design mobile azure c++ kubernetes python AWS

Signify Health is hiring a Remote Sr. Site Reliability Engineer IV

How will this role have an impact?

Join Signify Health's vibrant Site Reliability Engineering team as a Site Reliability Engineer. We’re seeking passionate individuals from diverse technical backgrounds. Reporting to the Manager of Site Reliability Engineering, we offer a collaborative environment that values each team member's unique contribution and fosters an inclusive culture.

Your Role:

  • Develop strategies to improve the stability, scalability, and availability of our products.
  • Maintain and deploy observability solutions to optimize system performance.
  • Collaborate with cross-functional teams to enhance operational processes and service management.
  • Design, build, and maintain application stacks for product teams.
  • Create sustainable systems and services through automation.
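One small, generic illustration of the "sustainable systems and services through automation" bullet (a sketch, not Signify's actual tooling) is a retry-with-exponential-backoff helper for flaky dependencies:

```python
import time

def retry(fn, attempts=4, base_delay=0.01):
    """Call fn, retrying with exponential backoff; re-raise after the last attempt."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))  # 0.01s, 0.02s, 0.04s, ...

# Simulated flaky call that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = retry(flaky)
print(result)  # "ok" after two retries
```

In production, SREs typically add jitter and a retry budget so that many clients retrying at once do not create a thundering-herd problem.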

Skills We’re Seeking:

  • An eagerness to collaborate with and mentor others in the field of Site Reliability Engineering.
  • Strong familiarity with cloud environments (Azure, AWS, or GCP) and a desire to develop further expertise.
  • Advanced understanding of scripting languages, preferably with experience with Bash or Python, and programming languages, preferably with experience with Golang.
  • Advanced grasp of infrastructure as code, preferably with experience with Terraform.
  • Advanced understanding of Kubernetes and containerization technologies.
  • Advanced understanding of CI/CD principles and willingness to guide and enforce best practices.
  • Advanced understanding of Site Reliability and observability principles, preferably with experience with New Relic.
  • A proactive approach to identifying problems, performance bottlenecks, and areas for improvement.

The base salary hiring range for this position is $108,900 to $189,700. Compensation offered will be determined by factors such as location, level, job-related knowledge, skills, and experience. Certain roles may be eligible for incentive compensation, equity, and benefits.
In addition to your compensation, enjoy the rewards of an organization that puts our heart into caring for our colleagues and our communities.  Eligible employees may enroll in a full range of medical, dental, and vision benefits, 401(k) retirement savings plan, and an Employee Stock Purchase Plan.  We also offer education assistance, free development courses, paid time off programs, paid holidays, a CVS store discount, and discount programs with participating partners.  

About Us:

Signify Health is helping build the healthcare system we all want to experience by transforming the home into the healthcare hub. We coordinate care holistically across individuals’ clinical, social, and behavioral needs so they can enjoy more healthy days at home. By building strong connections to primary care providers and community resources, we’re able to close critical care and social gaps, as well as manage risk for individuals who need help the most. This leads to better outcomes and a better experience for everyone involved.

Our high-performance networks are powered by more than 9,000 mobile doctors and nurses covering every county in the U.S., 3,500 healthcare providers and facilities in value-based arrangements, and hundreds of community-based organizations. Signify’s intelligent technology and decision-support services enable these resources to radically simplify care coordination for more than 1.5 million individuals each year while helping payers and providers more effectively implement value-based care programs.

To learn more about how we’re driving outcomes and making healthcare work better, please visit us at www.signifyhealth.com

Diversity and Inclusion are core values at Signify Health, and fostering a workplace culture reflective of that is critical to our continued success as an organization.

We are committed to equal employment opportunities for employees and job applicants in compliance with applicable law and to an environment where employees are valued for their differences.

See more jobs at Signify Health

Apply for this job

+30d

Google Pillar | Data Project

Devoteam | Lisbon, Portugal, Remote
Bachelor degree, terraform, airflow, sql, java, docker, python

Devoteam is hiring a Remote Google Pillar | Data Project

Job Description

Devoteam G Cloud is our Google strategy and identity within the Devoteam group. We focus on developing end-to-end solutions within Google Cloud Platform and its technologies.

Devoteam G Cloud is looking for Cloud Data Engineers to join our team of Data Engineering specialists.

  • Delivery of Data projects more focused on the Engineering component;
  • Working with GCP Data Services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub and Dataplex;
  • Write efficient SQL queries;
  • Develop data processing pipelines using programming frameworks like Apache Beam, with CI/CD automation;
  • Automate data engineering tasks;
  • Building and managing data pipelines, with a deep understanding of workflow orchestration, task scheduling, and dependency management;
  • Data Integration and Streaming, including data ingestion from various sources (such as databases, APIs, or logs) into GCP.
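The orchestration and dependency-management bullet above boils down to executing tasks in topological order. A minimal stdlib sketch of that core idea follows; the task names are invented, and real orchestrators like Airflow or Cloud Composer add scheduling, retries, and distributed execution on top of it:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: task -> set of upstream dependencies,
# mirroring how an Airflow DAG wires tasks together.
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load_bigquery": {"transform"},
    "notify": {"load_bigquery"},
}

def run_pipeline(dag, runner):
    """Execute tasks in dependency order (the guarantee an orchestrator provides)."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        runner(task)
    return order

executed = []
order = run_pipeline(dag, executed.append)
print(order)  # upstream tasks always precede their dependents
```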

Qualifications

  • Bachelor degree in IT or similar;
  • More than 2 years of professional experience, with expertise in the delivery of Data Engineering projects;
  • GCP Data Services, BigQuery, Looker, Cloud Storage, Dataflow, Dataproc, Pub/Sub and Dataplex;
  • Knowledge of programming languages: Python, Java, or SQL;
  • Experience with tools like Apache Airflow, Google Cloud Composer, or Cloud Data Fusion;
  • Code-review mindset;
  • Experience with Terraform, GitHub, GitHub Actions, Bash and/or Docker will be valued;
  • Knowledge of streaming data processing using tools like Apache Kafka;
  • GCP Certifications: Professional Data Engineer or Professional Cloud Database Engineer and/or Associate Cloud Engineer (nice to have);
  • Proficiency in English (written and spoken).

See more jobs at Devoteam

Apply for this job

+30d

Google Pillar | Data Architect

Devoteam | Lisbon, Portugal, Remote
Bachelor degree, terraform, airflow, sql, java, docker, python

Devoteam is hiring a Remote Google Pillar | Data Architect

Job Description

Devoteam G Cloud is our Google strategy and identity within the Devoteam group. We focus on developing end-to-end solutions within Google Cloud Platform and its technologies.

Devoteam G Cloud is looking for Cloud Data Engineers to join our team of Data Engineering specialists.

  • Delivery of Data projects more focused on the Engineering component;
  • Working with GCP Data Services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub and Dataplex;
  • Write efficient SQL queries;
  • Develop data processing pipelines using programming frameworks like Apache Beam, with CI/CD automation;
  • Automate data engineering tasks;
  • Building and managing data pipelines, with a deep understanding of workflow orchestration, task scheduling, and dependency management;
  • Data Integration and Streaming, including data ingestion from various sources (such as databases, APIs, or logs) into GCP.

Qualifications

  • Bachelor degree in IT or similar;
  • More than 2 years of professional experience, with expertise in the delivery of Data Engineering projects;
  • GCP Data Services, BigQuery, Looker, Cloud Storage, Dataflow, Dataproc, Pub/Sub and Dataplex;
  • Knowledge of programming languages: Python, Java, or SQL;
  • Experience with tools like Apache Airflow, Google Cloud Composer, or Cloud Data Fusion;
  • Code-review mindset;
  • Experience with Terraform, GitHub, GitHub Actions, Bash and/or Docker will be valued;
  • Knowledge of streaming data processing using tools like Apache Kafka;
  • GCP Certifications: Professional Data Engineer or Professional Cloud Database Engineer and/or Associate Cloud Engineer (nice to have);
  • Proficiency in English (written and spoken).
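The streaming qualification above (Kafka and similar tools) centers on windowed aggregation over an unbounded event stream. Below is a minimal pure-Python sketch of tumbling-window counts; the event data and window size are invented, and frameworks like Kafka Streams or Dataflow handle the same pattern at scale with watermarks and late-data handling:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_s=60):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per key -- the basic aggregation pattern behind
    streaming tools such as Kafka Streams or Dataflow windowing."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_s) * window_s  # floor to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical click-stream: (epoch seconds, event type)
events = [(5, "click"), (42, "view"), (61, "click"), (119, "click"), (130, "view")]
print(tumbling_window_counts(events, window_s=60))
# {0: {'click': 1, 'view': 1}, 60: {'click': 2}, 120: {'view': 1}}
```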

See more jobs at Devoteam

Apply for this job

+30d

Senior Data Engineer

Instacart | Canada - Remote
airflow, sql, Design

Instacart is hiring a Remote Senior Data Engineer

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or their favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

Overview

 

At Instacart, our mission is to create a world where everyone has access to the food they love and more time to enjoy it together. Millions of customers every year use Instacart to buy their groceries online, and the Data Engineering team is building the critical data pipelines that underpin the myriad ways data is used across Instacart to support our customers and partners.

About the Role 

 

The Finance data engineering team plays a critical role in defining how financial data is modeled and standardized for uniform, reliable, timely and accurate reporting. This is a high impact, high visibility role owning critical data integration pipelines and models across all of Instacart’s products. This role is an exciting opportunity to join a key team shaping the post-IPO financial data vision and roadmap for the company.

 

About the Team 

 

Finance data engineering is part of the Infrastructure Engineering pillar, working closely with accounting, billing & revenue teams to support the monthly/quarterly book close, retailer invoicing and internal/external financial reporting. Our team collaborates closely with product teams to capture critical data needed for financial use cases.

 

About the Job 

  • You will be part of a team with a large amount of ownership and autonomy.
  • Large scope for company-level impact working on financial data.
  • You will work closely with engineers and both internal and external stakeholders, owning a large part of the process from problem understanding to shipping the solution.
  • You will ship high quality, scalable and robust solutions with a sense of urgency.
  • You will have the freedom to suggest and drive organization-wide initiatives.

 

About You

Minimum Qualifications

  • 6+ years of working experience in a Data/Software Engineering role, with a focus on building data pipelines.
  • Expertise in SQL and knowledge of Python.
  • Experience building high quality ETL/ELT pipelines.
  • Past experience with data immutability, auditability, slowly changing dimensions or similar concepts.
  • Experience building data pipelines for accounting/billing purposes.
  • Experience with cloud-based data technologies such as Snowflake, Databricks, Trino/Presto, or similar.
  • Adept at fluently communicating with many cross-functional stakeholders to drive requirements and design shared datasets.
  • A strong sense of ownership, and an ability to balance a sense of urgency with shipping high quality and pragmatic solutions.
  • Experience working with a large codebase on a cross functional team.
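One of the concepts named above, slowly changing dimensions, is worth a concrete sketch. Here is a minimal Type 2 implementation in plain Python; the record layout and dates are invented, and production versions of this usually live in SQL or dbt merge logic rather than application code:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DimRow:
    """One version of a dimension record (SCD Type 2 style)."""
    key: str
    attrs: dict
    valid_from: str
    valid_to: Optional[str] = None   # None = current version

def apply_change(history, key, new_attrs, as_of):
    """Close the current version and append a new one when attributes change."""
    current = next((r for r in history
                    if r.key == key and r.valid_to is None), None)
    if current is not None and current.attrs == new_attrs:
        return history                   # no change: keep the current row
    if current is not None:
        current.valid_to = as_of         # expire the old version
    history.append(DimRow(key, new_attrs, valid_from=as_of))
    return history

history = []
apply_change(history, "store_1", {"region": "EU"}, "2024-01-01")
apply_change(history, "store_1", {"region": "US"}, "2024-06-01")
current = [r for r in history if r.valid_to is None]
print(len(history), current[0].attrs)  # two versions kept; current region is US
```

The point of Type 2 is auditability: instead of overwriting the region, both versions survive with validity ranges, so historical reports join against the version that was current at the time.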

 

Preferred Qualifications

  • Bachelor’s degree in Computer Science, Computer Engineering, Electrical Engineering, or equivalent work experience.
  • Experience with Snowflake, dbt (data build tool), and Airflow.
  • Experience with data quality monitoring/observability, either using custom frameworks or tools like Great Expectations, Monte Carlo, etc.

 

#LI-Remote

See more jobs at Instacart

Apply for this job

+30d

Jr. Back-End/Data Engineer: Practica

Bachelor's degree, remote-first, airflow, sql, api, git, elasticsearch, python, AWS

The Lifetime Value Co. is hiring a Remote Jr. Back-End/Data Engineer: Practica

Jr. Back-End/Data Engineer: Practica - The Lifetime Value Co. - Career Page

See more jobs at The Lifetime Value Co.

Apply for this job

+30d

Jr. Back-End/Data Engineer: Internship

airflow, sql, api, git, elasticsearch, python, AWS

The Lifetime Value Co. is hiring a Remote Jr. Back-End/Data Engineer: Internship

Jr. Back-End/Data Engineer: Internship - The Lifetime Value Co. - Career Page

See more jobs at The Lifetime Value Co.

Apply for this job

+30d

Senior Data Engineer (Automated restaurant system)

Sigma Software | Kyiv, Ukraine, Remote
airflow, sql, oracle, Design, azure, AWS

Sigma Software is hiring a Remote Senior Data Engineer (Automated restaurant system)

Job Description

  • Design, develop, and maintain ETL pipelines using Azure Data Factory, ensuring efficient data flow and transformation
  • Utilize Databricks for data processing tasks, leveraging PySpark for advanced data manipulation and analysis
  • Develop and optimize ETL processes within Azure Synapse Analytics, focusing on performance and scalability
  • Apply expertise in MS SQL Server for data modeling, implementing stored procedures and analytical views to support business requirements
  • Create visually appealing and insightful reports using Microsoft Power BI, enabling stakeholders to derive actionable insights from the data
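The MS SQL Server bullet above, modeling a fact table and exposing analytical views, can be sketched with the stdlib `sqlite3` module standing in for SQL Server. The schema and data are invented for illustration; the pattern of a base table plus an aggregating view is the same:

```python
import sqlite3

# In-memory SQLite stands in for MS SQL Server here.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (
        order_id   INTEGER PRIMARY KEY,
        restaurant TEXT NOT NULL,
        total      REAL NOT NULL
    );
    INSERT INTO orders (restaurant, total) VALUES
        ('north', 12.50), ('north', 7.25), ('south', 20.00);

    -- Analytical view: revenue and order count per restaurant
    CREATE VIEW restaurant_revenue AS
    SELECT restaurant,
           COUNT(*)   AS orders_count,
           SUM(total) AS revenue
    FROM orders
    GROUP BY restaurant;
""")
rows = conn.execute(
    "SELECT restaurant, orders_count, revenue FROM restaurant_revenue "
    "ORDER BY restaurant"
).fetchall()
print(rows)  # [('north', 2, 19.75), ('south', 1, 20.0)]
```

Encapsulating the aggregation in a view keeps reporting queries simple and lets the underlying model change without breaking downstream Power BI reports.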

Qualifications

  • At least 5 years of professional experience in data engineering
  • Advanced experience with enterprise Data Warehouse modeling
  • Experience with complex ETL process implementation
  • Experience with Power BI Data Modeling and Reporting (DAX, Power Query, Dashboards Design)
  • Experience with SQL Development (both MS SQL and Oracle)
  • Solid experience with Azure Synapse/Data Factory Pipelines, Azure Notebooks/Databricks
  • Familiarity with Cloud stack technology (AWS, Airflow)
  • Deep understanding of SAP BW/HANA Modeling, SAP ECC/S4 Corporate Data Extraction
  • Experience with PySpark Data Processing
  • Experience with CI/CD
  • At least an Upper-Intermediate level of written and spoken English

See more jobs at Sigma Software

Apply for this job

+30d

Senior Data Engineer

Procore Technologies | Bangalore, India, Remote
Bachelor's degree, tableau, airflow, sql, AWS, javascript

Procore Technologies is hiring a Remote Senior Data Engineer

Job Description

We’re looking for a Data Engineer who wants to make an impact to join Procore’s data platform team. This is a great opportunity to work with an amazing team and help them achieve their mission to deliver actionable insights from data to the business. As a member of our Data team, you’ll help define the future of our next-generation global data infrastructure to help Procore connect everyone in construction on a worldwide platform.

The construction vertical is ripe for technological innovation. Construction impacts the lives of nearly everyone in the world, and yet it’s one of the least digitized industries, not to mention one of the most dangerous. Procore is leading the market with our SaaS construction platform. We build for real people with real experiences, empowering Groundbreakers to develop and transform the communities where we all live.

What you’ll do:

  • As a Data Engineer, you will work closely with Architects, Data Scientists, Product Managers and Analysts to understand the needs of the business and deliver actionable insights via data products
  • Use your technical knowledge to help drive scalable, performant data solutions by articulating how our solution meets the needs of our customers and partners
  • Partner closely with cross-functional platform teams, and stakeholders on data management, data governance, reliability, security, and quality to deliver a data platform that scales with Procore’s growth
  • Build and maintain batch and streaming data pipelines and CI/CD pipelines using cloud technologies
  • Build data visualizations using tools such as Tableau, Power BI or Looker to enable data-driven decisions
  • Partner with the internal consumers of data to define the requirements and success criteria to help them achieve their mission to deliver actionable insights from data to the business.
  • Maintain the user stories in a prioritized backlog and effectively divide overall project goals into prioritized deliverables for execution.
  • Participate in daily standups, team meetings, sprint planning, and demos/retrospectives while working cross-functionally with other teams to drive the innovation of our products
  • Develop user experience designs in the form of wireframes and mock-ups for stakeholder validation and execution by the team.

What we’re looking for:

  • Bachelor's degree in Computer Science or a similar technical field of study 
  • Strong expertise in Data Engineering with 3+ years of experience in building efficient and scalable data infrastructure, data pipelines, automated deployments, data lifecycle policies using modern data stack such as Snowflake, Airflow, Spark, dbt, AWS, Gitlab, Databricks
  • Develop and maintain tables and data models in SQL, abstracting multiple sources and historical data across varied schemas to a format suitable for reporting and analysis
  • Experience in extracting, processing and storing structured and unstructured data
  • Experience building integrations between enterprise systems to automate manual processes
  • Strong experience with AWS services including EC2, EKS, ECS, Lambda, S3, OpenSearch, Kafka, MWAA, CloudWatch
  • Experience building data visualizations in Tableau, JavaScript or PowerBI to improve operational efficiencies and foster data-driven decision making
  • Exceptional communication skills, including working with onshore and offshore stakeholders
  • Strong interpersonal skills with the ability to manage ambiguity and conflicts

Qualifications

See more jobs at Procore Technologies

Apply for this job

+30d

Data Scientist - Support

Square | San Francisco, CA, Remote
Bachelor degree, tableau, airflow, sql, Design, python

Square is hiring a Remote Data Scientist - Support

Job Description

The Cash App Support organization is growing and we are looking for a Data Scientist (DS) to join the team. The DS team at Cash App derives valuable insights from our unique datasets and turns those insights into actions that improve the experience for our customers every day. In this role, you’ll be embedded in our Support org and work closely with operations and other cross-functional partners to drive meaningful change in how our customers interact with the Support team and resolve issues with their accounts.

You will:

  • Partner directly with a Cash App customer support team, working closely with operations, engineers, and machine learning
  • Analyze large datasets using SQL and scripting languages to surface actionable insights and opportunities to the operations team and other key stakeholders
  • Approach problems from first principles, using a variety of statistical and mathematical modeling techniques to research and understand advocate and customer behavior
  • Design and analyze A/B experiments to evaluate the impact of changes we make to our operational processes and tools
  • Work with engineers to log new, useful data sources as we evolve processes, tooling, and features
  • Build, forecast, and report on metrics that drive strategy and facilitate decision making for key business initiatives
  • Write code to effectively process, cleanse, and combine data sources in unique and useful ways, often resulting in curated ETL datasets that are easily used by the broader team
  • Build and share data visualizations and self-serve dashboards for your partners
  • Effectively communicate your work with team leads and cross-functional stakeholders on a regular basis
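The A/B experiment bullet above typically reduces to a two-proportion z-test on conversion rates. Here is a stdlib-only sketch with invented numbers; in practice a library (e.g. statsmodels) or an internal experimentation platform would do this, with power analysis and multiple-testing corrections layered on:

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates,
    the standard analysis behind a simple A/B experiment."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 200/2000 control vs 260/2000 treatment conversions
z, p = two_proportion_ztest(200, 2000, 260, 2000)
print(round(z, 2), round(p, 4))
```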

Qualifications

You have:

  • An appreciation for the connection between your work and the experience it delivers to customers. Previous exposure to or interest in customer support problems would be great to have
  • A bachelor's degree in statistics, data science, or similar STEM field with 5+ years of experience in a relevant role OR
  • A graduate degree in statistics, data science, or similar STEM field with 2+ years of experience in a relevant role
  • Advanced proficiency with SQL and data visualization tools (e.g. Looker, Tableau, etc)
  • Experience with scripting and data analysis programming languages, such as Python or R
  • Experience with cohort and funnel analyses, and a deep understanding of statistical concepts such as selection bias, probability distributions, and conditional probabilities
  • Experience in a high-growth tech environment

Technologies we use and teach:

  • SQL, Snowflake, etc.
  • Python (Pandas, Numpy)
  • Looker, Mode, Tableau, Prefect, Airflow

See more jobs at Square

Apply for this job

+30d

Analyst 2, Data Analytics

Western Digital | Batu Kawan, Malaysia, Remote
Bachelor's degree, airflow, mariadb, sql, oracle, api, docker, css, kubernetes, jenkins, python, javascript, PHP

Western Digital is hiring a Remote Analyst 2, Data Analytics

Job Description

  • Excellent interpersonal communication and organizational skills to contribute as a leading member of global, distributed teams focused on delivering quality services and solutions
  • Able to distill complex technical challenges to actionable and explainable decisions
  • Work in DevOps teams by building consensus and mediating compromises when necessary
  • Demonstrate excellent engineering & automation skills in the context of application development using continuous integration (CI) and continuous deployment (CD)
  • Demonstrate the ability to rapidly learn new and emerging technologies, define engineering standards, and produce automation code
  • Operational abilities including early software release support and driving root cause analysis and remediation
  • Ability to work with and engage multiple functional groups

Qualifications

  • Bachelor's Degree in Computer Science, Software Engineering, Computer Engineering or a related field or equivalent work experience. 
  • 5+ years overall IT industry experience
  • 3+ years in an engineering role using service and hosting solutions such as factory dashboards
  • 3+ years functional knowledge of server-side languages: Python, PHP
  • 3+ years functional knowledge of client-side programming: JavaScript, HTML, CSS
  • Experience with relational SQL databases: MariaDB, MSSQL, Oracle
  • Experience with data pipeline and workflow management tools: Airflow
  • Solid understanding of containerization and orchestration tools: Docker, Kubernetes
  • Experience with version control systems: BitBucket
  • Experience with Dash framework (Python web framework) for building interactive web applications
  • Exposure to common web frameworks and REST APIs
  • Experience with continuous integration and deployment using Jenkins is a plus

See more jobs at Western Digital

Apply for this job

+30d

Senior Software Engineer, Data

JW Player | United States - Remote
agile, airflow, java, docker, elasticsearch, kubernetes, python, AWS, backend

JW Player is hiring a Remote Senior Software Engineer, Data

About JWP:

JWP is transforming the Digital Video Economy as a trusted partner for over 40,000 broadcasters, publishers, and video-driven brands through our cutting-edge video software and data insights platform. JWP empowers customers with unprecedented independence and control over their digital video content. Established in 2004 as an open-source video player, JWP has evolved into the premier force driving digital video for businesses worldwide. With a rich legacy of pioneering video technology, JWP customers currently generate 8 billion video impressions/month and 5 billion minutes of videos watched/month. At JWP, everyone shares a passion for revolutionizing the digital video landscape. If you are ready to be a part of a dynamic and collaborative team, then join us in shaping the future of video!

The Data Engineering Team: 

At JWP, our data team is a dynamic and innovative team, managing the data lifecycle, from ingestion to processing and analysis, touching every corner of our thriving business ecosystem. Engineers on the team play a pivotal role in shaping the company's direction by making key decisions about our infrastructure, technology stack, and implementation strategies. 

The Opportunity: 

We are looking to bring on a Senior Software Engineer to join our Data Engineering team. As an Engineer on the team, you will be diving into the forefront of cutting-edge big data tools and technology. In this role, you will have the opportunity to partner closely with various teams to tackle crucial challenges for one of the world's largest and rapidly expanding video companies. Join us and make an impact at the forefront of digital innovation.

As a Senior Data Engineer, you will:

  • Contribute to the development of distributed batch and real-time data infrastructure.
  • Mentor and work closely with junior engineers on the team. 
  • Perform code reviews with peers. 
  • Lead small to medium-sized projects, including documentation and ticket writing.
  • Collaborate closely with Product Managers, Analysts, and cross-functional teams to gather insights and drive innovation in data products.

Requirements for the role:

  • 5+ years of backend engineering experience and a passion for big data.
  • Expertise with Python or Java and SQL. 
  • Familiarity with Kafka
  • Experience with a range of datastores, from relational to key-value to document
  • Demonstrate humility, empathy, and a collaborative spirit that fuels team success. 

Bonus Points:

  • Data engineering experience, specifically with data modeling, warehousing and building ETL pipelines
  • Familiarity with AWS - in particular, EC2, S3, RDS, and EMR
  • Familiarity with Snowflake
  • Familiarity with Elasticsearch
  • Familiarity with data processing tools like Hadoop, Spark, Kafka, and Flink
  • Experience with Docker, Kubernetes, and application monitoring tools
  • Experience and/or training with agile methodologies
  • Familiarity with Airflow for task and dependency management

Perks of being at JWP, United States

Our goal is to take care of you and ensure you will be successful in your new role. Your success is our success! 

As a full time employee, you will qualify for:

  • Private Medical, Vision and Dental Coverage for you and your family
  • Unlimited Paid Time Off
  • Stock Options Purchase Program
  • Quarterly and Annual Team Events
  • Professional Career Development Program and Career Development Progression
  • New Employee Home Office Setup Stipend
  • Monthly Connectivity Stipend
  • Free and discounted perks through JW Player's benefit partners
  • Bi-Annual Hack Weeks for those who are interested in using their coding knowledge
  • Fireside chats with individuals throughout JW Player

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

See more jobs at JW Player

Apply for this job