airflow Remote Jobs

100 Results

Posted 2 days ago

Data Engineer

Calm – Remote, United States
Tags: remote-first, terraform, airflow, postgres, sql, api, python, AWS

Calm is hiring a Remote Data Engineer

About Calm

Calm is on a mission to support everyone on every step of their mental health journey. With the #1 app for sleep, meditation and relaxation as well as a growing library of digital, evidence-based mental health programs, Calm offers trusted support for individuals and organizations alike. Our flagship consumer app provides personalized content and activities – featuring a range of experts and beloved celebrity voices – to help users manage stress, improve sleep and live mindfully. Our workplace and healthcare solutions offer a consumer-friendly approach to clinical content and HIPAA-compliant resources in order to drive positive health and business outcomes. Named a TIME100 Most Influential Company, Calm supports more than 150 million people and 3,500 organizations across seven languages and 190 countries.

What We Do

As a data organization, we focus on making data a competitive advantage for Calm. We’re product-minded, team-oriented, and grounded in our mission of making the world a happier and healthier place. We work closely with teams across the company such as product, finance, marketing, data science, and more. As a team, we strive to always improve.

 

What You’ll Do

We’re looking for someone who is comfortable with ambiguity, assesses what needs to be done, and delivers with the right balance of velocity and technical debt. As a Data Engineer, you’ll leverage all sorts of data, from application event streams to product databases to third-party data, to help stakeholders create products and answer business questions. Our stack spans AWS and GCP, with technologies like Airflow, Redshift, BigQuery, Postgres, Spark, and dbt. Specifically, you will: 

  • Work with business stakeholders to understand their goals, challenges, and decisions
  • Assist with building solutions that standardize our data approach to common problems across the company
  • Incorporate observability and testing best practices into projects
  • Assist in the development of processes to ensure our data is trusted and well-documented
  • Effectively work with data analysts on refining the data model used for reporting and analytical purposes
  • Improve availability and consistency of data points crucial for analysis

Some past projects include:

  • Standing up a reporting system in BigQuery from scratch, including data replication, infrastructure setup, dbt model creation, and integration with reporting endpoints
  • Creating a user-level feature store and related API endpoints to support machine learning tasks such as content recommendation and persona creation
  • Remodeling a critical data pipeline to decrease our model count by 50% and reduce run time by 83%
  • Setting up scalable APIs to integrate our Data Warehouse with 3rd party applications for personalization that reaches tens of millions of customers
  • Revamping orchestration and execution to reduce critical data delivery times by 70%
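Airflow, named in the stack above, executes tasks in an order derived from a directed acyclic graph of dependencies. As a library-free illustration of that core scheduling idea (the task names here are invented for the example, not Calm's actual pipeline):

```python
from collections import deque

# Hypothetical task graph: each task maps to the tasks it depends on.
DEPENDENCIES = {
    "extract_events": [],
    "extract_orders": [],
    "build_staging": ["extract_events", "extract_orders"],
    "build_marts": ["build_staging"],
    "refresh_dashboards": ["build_marts"],
}

def execution_order(deps):
    """Return a valid run order for a DAG of tasks (Kahn's algorithm)."""
    # Count unmet dependencies for each task.
    remaining = {task: len(parents) for task, parents in deps.items()}
    # Invert the edges: parent -> tasks that wait on it.
    children = {task: [] for task in deps}
    for task, parents in deps.items():
        for parent in parents:
            children[parent].append(task)
    # Start with tasks that have no dependencies.
    ready = deque(sorted(t for t, n in remaining.items() if n == 0))
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for child in children[task]:
            remaining[child] -= 1
            if remaining[child] == 0:
                ready.append(child)
    if len(order) != len(deps):
        raise ValueError("cycle detected in task graph")
    return order

if __name__ == "__main__":
    print(execution_order(DEPENDENCIES))
```

Airflow adds scheduling, retries, and observability on top of this ordering, but the dependency resolution is the same idea.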

Who You Are

  • Proficiency with SQL and an object-oriented language
  • Experience with RDBMS, data warehouses, and event systems
  • Experience in building data pipelines that scale
  • Ability to translate non-technical business requirements into technical solutions, and translate technical solutions to business outcomes
  • Strong communication skills
  • Pragmatism: balancing scrappiness and rigor

 

Nice to Haves

  • Python programming experience
  • Experience with data lakes
  • Experience building across clouds
  • Some experience in Infrastructure as Code tools like Terraform
  • Knowledge of different data modeling paradigms, e.g. relational, data vault, and medallion

Minimum Requirements

  • This role typically requires 5+ years of related experience

 

The anticipated salary range for this position is $159,300 – $223,000. The base salary range represents the low and high end of Calm’s salary range for this position. Not all candidates will be eligible for the upper end of the salary range. Exact salary will ultimately depend on multiple factors, which may include the successful candidate's geographic location, skills, experience and other qualifications. This role is also eligible for equity + comprehensive benefits + 401k + flexible time off.

Please note that Calm may leverage artificial intelligence technology in the application review process.

Calm is committed to providing reasonable accommodations for qualified individuals with disabilities, including disabled veterans. Please contact Calm’s Recruiting team if you need a reasonable accommodation, assistance completing any forms, or to otherwise participate in the application process. You can reach the Recruiting team at recruitingaccommodations@calm.com 

We believe that mental health is health, and every person should be considered in the discussion. That’s why we’re proud to be an equal opportunity workplace, committed to providing equal employment opportunities to all applicants and employees regardless of race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, medical condition, genetic information, military or veteran status, gender identity or expression, sexual orientation, or any other characteristic protected by applicable federal, state or local law.

Calm is deeply committed to diversity, equity and inclusion. We strive to create a mindful and respectful environment where everyone can bring their authentic self to work, and experience a culture that is free of harassment, racism, and discrimination.


Calm participates in e-verify. E-verify provides the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S.

#LI-Remote

See more jobs at Calm

Apply for this job

Posted 2 days ago

Senior Data Engineer

Calm – Remote, United States
Tags: remote-first, terraform, airflow, postgres, sql, api, python, AWS

Calm is hiring a Remote Senior Data Engineer

About Calm

Calm is on a mission to support everyone on every step of their mental health journey. With the #1 app for sleep, meditation and relaxation as well as a growing library of digital, evidence-based mental health programs, Calm offers trusted support for individuals and organizations alike. Our flagship consumer app provides personalized content and activities – featuring a range of experts and beloved celebrity voices – to help users manage stress, improve sleep and live mindfully. Our workplace and healthcare solutions offer a consumer-friendly approach to clinical content and HIPAA-compliant resources in order to drive positive health and business outcomes. Named a TIME100 Most Influential Company, Calm supports more than 150 million people and 3,500 organizations across seven languages and 190 countries.

What We Do

As a data organization, we focus on making data a competitive advantage for Calm. We’re product-minded, team-oriented, and grounded in our mission of making the world a happier and healthier place. We work closely with teams across the company such as product, finance, marketing, data science, and more. As a team, we strive to always improve.

What You’ll Do

We’re looking for someone who is comfortable with ambiguity, assesses what needs to be done, and delivers with the right balance of velocity and technical debt. As a Senior Data Engineer, you’ll leverage all sorts of data, from application event streams to product databases to third-party data, to help stakeholders create products and answer business questions. Our stack spans AWS and GCP, with technologies like Airflow, Redshift, BigQuery, Postgres, Spark, and dbt. Specifically, you will: 

  • Work with business stakeholders to understand their goals, challenges, and decisions
  • Identify opportunities and build solutions that standardize our data approach to common problems across the company
  • Evangelize the use of data-driven decision making across the organization
  • Build processes to ensure our data is trusted and well-documented
  • Partner with data analysts on refining the data model used for reporting and analytical purposes
  • Collaborate with engineering on improving availability and consistency of data points crucial for analysis and represent data team in architectural discussions
  • Develop, mentor and train data engineers

Some past projects include:

  • Standing up a reporting system in BigQuery from scratch, including data replication, infrastructure setup, dbt model creation, and integration with reporting endpoints
  • Creating a user-level feature store and related API endpoints to support machine learning tasks such as content recommendation and persona creation
  • Remodeling a critical data pipeline to decrease our model count by 50% and reduce run time by 83%
  • Setting up scalable APIs to integrate our Data Warehouse with 3rd party applications for personalization that reaches tens of millions of customers
  • Revamping orchestration and execution to reduce critical data delivery times by 70%

Who You Are

  • Proficiency with SQL and an object-oriented language
  • Experience with RDBMS, data warehouses, and event systems
  • Experience in building data pipelines that scale
  • Knowledge of different data modeling paradigms, e.g. relational, data vault, and medallion
  • Ability to translate non-technical business requirements into technical solutions, and translate technical solutions to business outcomes
  • Strong communication skills
  • Pragmatism: balancing scrappiness and rigor

Nice to Haves

  • Python programming experience
  • Experience with data lakes
  • Experience building across clouds
  • Some experience in Infrastructure as Code tools like Terraform

Minimum Requirements

  • This role typically requires 8+ years of related experience

The anticipated salary range for this position is $185,500 – $259,700. The base salary range represents the low and high end of Calm’s salary range for this position. Not all candidates will be eligible for the upper end of the salary range. Exact salary will ultimately depend on multiple factors, which may include the successful candidate's geographic location, skills, experience and other qualifications. This role is also eligible for equity + comprehensive benefits + 401k + flexible time off.

Please note that Calm may leverage artificial intelligence technology in the application review process.

Calm is committed to providing reasonable accommodations for qualified individuals with disabilities, including disabled veterans. Please contact Calm’s Recruiting team if you need a reasonable accommodation, assistance completing any forms, or to otherwise participate in the application process. You can reach the Recruiting team at recruitingaccommodations@calm.com 

We believe that mental health is health, and every person should be considered in the discussion. That’s why we’re proud to be an equal opportunity workplace, committed to providing equal employment opportunities to all applicants and employees regardless of race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, medical condition, genetic information, military or veteran status, gender identity or expression, sexual orientation, or any other characteristic protected by applicable federal, state or local law.

Calm is deeply committed to diversity, equity and inclusion. We strive to create a mindful and respectful environment where everyone can bring their authentic self to work, and experience a culture that is free of harassment, racism, and discrimination.


Calm participates in e-verify. E-verify provides the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S.

#LI-Remote

See more jobs at Calm

Apply for this job

Posted 3 days ago

Data Engineer with TS/SCI Clearance

Maania Consultancy Services – Partial Remote / Washington, DC
Tags: agile, jira, airflow, sql, docker, elasticsearch, postgresql, kubernetes, AWS, javascript

Maania Consultancy Services is hiring a Remote Data Engineer with TS/SCI Clearance


See more jobs at Maania Consultancy Services

Apply for this job

Posted 5 days ago

Senior Data Engineer

Nile Bits – Cairo, Egypt (Remote)
Tags: agile, airflow, sql, Design, docker, linux, python, AWS

Nile Bits is hiring a Remote Senior Data Engineer

Job Description

  • Designing and implementing core functionality within our data pipeline in order to support key business processes
  • Shaping the technical direction of the data engineering team
  • Supporting our Data Warehousing approach and strategy
  • Maintaining our data infrastructure so that our jobs run reliably and at scale
  • Taking responsibility for all parts of the data ecosystem, including data governance, monitoring and alerting, data validation, and documentation
  • Mentoring and upskilling other members of the team

Qualifications

  • Experience building data pipelines and/or ETL processes
  • Experience working in a Data Engineering role
  • Confident writing performant and readable code in Python, building upon the rich Python ecosystem wherever it makes sense to do so.
  • Good software engineering knowledge & skills: OO programming, design patterns, SOLID design principles and clean code
  • Confident writing SQL and good understanding of database design.
  • Experience working with web APIs.
  • Experience leading projects from a technical perspective
  • Knowledge of Docker, shell scripting, working with Linux
  • Experience with a cloud data warehouse
  • Experience in managing deployments and implementing observability and fault tolerance in cloud-based infrastructure (e.g. CI/CD, Infrastructure as Code, container-based infrastructure, auto-scaling, monitoring and alerting)
  • Pro-active with a self-starter mindset; able to identify elegant solutions to difficult problems and able to suggest new and creative approaches.
  • Analytical, problem-solving and an effective communicator; leveraging technology and subject matter expertise in the business to accelerate our roadmap.
  • Able to lead technical discussions, shape the direction of the team, identify opportunities for innovation and improvement
  • Able to lead and deliver projects, ensuring stakeholders are kept up-to-date through regular communication
  • Willing to support the rest of the team when necessary, sharing knowledge and best practices, documenting design decisions, etc.
  • Willing to step outside your comfort zone to broaden your skills and learn new technologies.
  • Experience working with open source orchestration frameworks like Airflow or data analytics tools such as dbt
  • Experience with AWS services or those of another cloud provider
  • Experience with Snowflake
  • Good understanding of Agile

See more jobs at Nile Bits

Apply for this job

Posted 7 days ago

Senior Site Reliability Engineer

Adwerx – Durham, NC (Remote)
Tags: terraform, airflow, RabbitMQ, Design, qa, ruby, docker, mysql, kubernetes, Node.js

Adwerx is hiring a Remote Senior Site Reliability Engineer

Durham, NC or Remote

Adwerx is on the lookout for a Site Reliability Engineer to join our small and talented infrastructure team and help us design, build, and automate the performant, resilient, and highly available systems that our teams and customers rely on. In this role, you’ll help us run a handful of mature (and in some cases brand-new) services in the cloud and keep them resilient, performant, and highly available during the rapid adoption of our products. The infrastructure you build has a large impact on an organization focused on software development best practices and standards.

The starting title for this experienced role will be based on tenure/experience/work history.

Our culture

Adwerx is a place where you can thrive in our highly collaborative teams and where everyone is encouraged to contribute ideas across all levels of the organization.

Our engineering charter is centered around humility, respect and trust. We abide by the mantra “if it’s not in version control, it doesn’t exist”, strive to write documentation our peers will love, and always try to leave things better than we found them. We employ testing and continuous delivery for all our services and empower our developers to iterate and deploy as often as they need.

Infrastructure engineers share an on-call schedule, but our systems are stable and fire drills are rare. We host lunch and learns, conduct blameless post-mortems and regularly recognize our peers with shout outs and a fun badge program to recognize leaders in specific technical disciplines.

How we work

We apply the Agile/Scrum methodology to run day-to-day projects at Adwerx, and our product development process is heavily inspired by “Shape Up”. In addition, we:

  • Utilize a mature CI/CD process and deploy to production many times a day.
  • Have production-like QA environments with a culture of writing automated tests.
  • Define department SLOs and Engineering KPIs to better understand how we work.
  • Relentlessly strive for excellence with not only the products we build but also the health of our codebase and our developer ecosystem.

Technologies we work with

  • Our primary application is built with Ruby on Rails. You’ll also encounter or work with Node.js, Go, and Python.
  • Our production systems run primarily in Google Cloud Platform though we also have a small footprint in Amazon Web Services
  • Besides our primary application, some services you will support include our VPN/Tailscale, CI/CD pipelines, Google Kubernetes Engine Clusters, MySQL databases, Airflow, RabbitMQ, and Redshift
  • Some tools we use include Terraform, Kubernetes, Datadog, Helm, Nginx, docker, NewRelic, and CircleCI

In this mission-critical role, you will:

  • Design, build, and maintain the core infrastructure for Adwerx
  • Create, maintain, and/or iterate on various workloads in Google Kubernetes Engine
  • Contribute to the Ruby on Rails monolith to upgrade dependencies, integrate with infrastructure features, or optimize performance
  • Maintain reliable network paths and connections between all external and internal services (DNS, VPN, VPC peering)
  • Participate and run point in handling production incidents
  • Participate in solution design for new features, products, systems, and tooling
  • Find new ways to use existing systems to improve scalability and performance for our platform
  • Interact with the larger organization to ensure the uptime and reliability of our infrastructure
  • Iterate on security standards and review code for secure coding practices
  • Partner with engineering teams closely to educate and consult
  • Continually monitor application/system performance and costs (SLOs), generate actionable insights and either implement or advocate for them
  • Participate in on-call rotations, along with every member of the engineering team
  • Work closely with engineering teams to conduct root cause analyses for production incidents and make plans to remediate or prevent recurrences
  • Collaboratively plot the course and document Adwerx infrastructure
  • Build a great customer experience for people using your infrastructure

What You’ll Get:

  • Competitive salary and potential for equity.
  • Comprehensive medical, dental, and vision plan options (100% of basic plan premiums paid by company)
  • 401(k) plan with a company match of up to 4%
  • A collaborative work environment where you’ll learn about and influence every aspect of the business
  • The opportunity to work with and learn from talented leaders, developers, marketers, and designers, plus advancement opportunities
  • The ability to help define the foundational technology that will power the growth of our business
  • Flexible work scheduling

See more jobs at Adwerx

Apply for this job

Posted 8 days ago

Software Engineer, Data

JW Player – United States (Remote)
Tags: agile, airflow, java, docker, elasticsearch, kubernetes, python, AWS, backend

JW Player is hiring a Remote Software Engineer, Data

About JWP:

JWP is transforming the Digital Video Economy as a trusted partner for over 40,000 broadcasters, publishers, and video-driven brands through our cutting-edge video software and data insights platform. JWP empowers customers with unprecedented independence and control over their digital video content. Established in 2004 as an open-source video player, JWP has evolved into the premier force driving digital video for businesses worldwide. With a rich legacy of pioneering video technology, JWP customers currently generate 8 billion video impressions/month and 5 billion minutes of videos watched/month. At JWP, everyone shares a passion for revolutionizing the digital video landscape. If you are ready to be a part of a dynamic and collaborative team then join us in shaping the future of video! 

The Data Engineering Team: 

At JWP, our data team is a dynamic and innovative team, managing the data lifecycle, from ingestion to processing and analysis, touching every corner of our thriving business ecosystem. Engineers on the team play a pivotal role in shaping the company's direction by making key decisions about our infrastructure, technology stack, and implementation strategies. 

The Opportunity: 

We are looking to bring on a Software Engineer to join our Data Engineering team. As an Engineer on the team, you will be diving into the forefront of cutting-edge big data tools and technology. In this role, you will have the opportunity to partner closely with various teams to tackle crucial challenges for one of the world's largest and rapidly expanding video companies. Join us and make an impact at the forefront of digital innovation.

As a Data Engineer, you will:

  • Contribute to the development of distributed batch and real-time data infrastructure.
  • Mentor and work closely with junior engineers on the team. 
  • Perform code reviews with peers. 
  • Lead small to medium-sized projects, including documentation and ticket writing. 
  • Collaborate closely with Product Managers, Analysts, and cross-functional teams to gather insights and drive innovation in data products. 

Requirements for the role:

  • 3+ years of backend engineering experience and a passion for big data.
  • Expertise with Python or Java and SQL. 
  • Familiarity with Kafka
  • Experience with a range of datastores, from relational to key-value to document
  • Demonstrate humility, empathy, and a collaborative spirit that fuels team success. 

Bonus Points:

  • Data engineering experience, specifically with data modeling, warehousing and building ETL pipelines
  • Familiarity with AWS - in particular, EC2, S3, RDS, and EMR
  • Familiarity with Snowflake
  • Familiarity with Elasticsearch
  • Familiarity with data processing tools like Hadoop, Spark, Kafka, and Flink
  • Experience with Docker, Kubernetes, and application monitoring tools
  • Experience and/or training with agile methodologies
  • Familiarity with Airflow for task and dependency management
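Kafka, listed above, is at heart a partitioned append-only log that consumers read by offset. A minimal, library-free sketch of that consumption model (the class and event names are hypothetical, not JWP's code):

```python
class Log:
    """A toy append-only log, standing in for one Kafka partition."""

    def __init__(self):
        self.messages = []

    def append(self, message):
        self.messages.append(message)
        return len(self.messages) - 1  # the message's offset

class Consumer:
    """Reads the log from a committed offset, like a Kafka consumer group."""

    def __init__(self, log):
        self.log = log
        self.committed = 0  # next offset to read

    def poll(self, max_messages=10):
        batch = self.log.messages[self.committed : self.committed + max_messages]
        self.committed += len(batch)  # commit after a successful read
        return batch

if __name__ == "__main__":
    log = Log()
    for event in ("play", "pause", "seek"):
        log.append(event)
    consumer = Consumer(log)
    print(consumer.poll(2))  # first batch
    print(consumer.poll(2))  # resumes from the committed offset
```

Real Kafka adds partitioning, replication, and durable offset commits, but the offset-based read-and-commit loop is the core consumer pattern.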

Perks of being at JWP, United States

Our goal is to take care of you and ensure you will be successful in your new role. Your success is our success! 

As a full time employee, you will qualify for:

  • Private Medical, Vision and Dental Coverage for you and your family
  • Unlimited Paid Time Off
  • Stock Options Purchase Program
  • Quarterly and Annual Team Events
  • Professional Career Development Program and Career Development Progression
  • New Employee Home Office Setup Stipend
  • Monthly Connectivity Stipend
  • Free and discounted perks through JWP's benefit partners
  • Bi-Annual Hack Weeks for those who are interested in using their coding knowledge
  • Fireside chats with individuals throughout JWP

*Benefits are subject to location and can change at the discretion of the Company. 


We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

See more jobs at JW Player

Apply for this job

Posted 9 days ago

Technical Product Manager II - Data Systems

Torc Robotics – Remote, US
Tags: Bachelor's degree, nosql, airflow, c++, python, AWS

Torc Robotics is hiring a Remote Technical Product Manager II - Data Systems

About the Company

At Torc, we have always believed that autonomous vehicle technology will transform how we travel, move freight, and do business.

A leader in autonomous driving since 2007, Torc has spent over a decade commercializing our solutions with experienced partners. Now a part of the Daimler family, we are focused solely on developing software for automated trucks to transform how the world moves freight.

Join us and catapult your career with the company that helped pioneer autonomous technology, and the first AV software company with the vision to partner directly with a truck manufacturer.

Meet the Team:

This team is looking for someone with the combined technical and management experience to create the next generation of on-road autonomous driving technology, meeting customer needs as effectively and efficiently as possible. The role works closely with cross-functional teams to define actionable work plans to achieve product milestones, integrates those plans with organizational leaders, obtains feedback from stakeholders, and maintains a backlog of work that reflects end-user requirements and company objectives. It owns the product development plan across relevant engineering teams, represents the plan to executive management, and serves as a subject matter expert.

What you'll do:

  • Accountable for ensuring consistent delivery of critical projects that meet the company's objectives, coordinating efforts and resources, identifying, and removing organizational barriers to execution as the internal representative of the customer to the engineering organization.
  • Responsible for analyzing and communicating product initiatives to development teams, leadership, and other stakeholders. Break down the product initiatives into department-level quarterly plans.
  • Analyze top-level product requirements to develop associated department-level plans for components that advance the virtual driver product towards commercial launch.
  • Create strong working relationships and maintain effective communication across all product development teams, informing others of new features and release timelines, and soliciting frequent feedback from stakeholders for action.
  • Build trust and strong working relationships with the technical subject matter experts and principal engineers that steward the technical roadmap.
  • Consult with relevant technical experts and synthesize their input with your own technical and industry knowledge to create and define a balanced product development backlog which drives teams towards the most cost-effective path to commercial launch.
  • Cohesively and succinctly present quarterly departmental plans based on cross-functional dependencies for evaluation and approval by engineering leadership at the executive level.
  • Represent product development plans, showcase recent development efforts in regular reviews, and synthesize changes in urgency to resolve product issues within the applicable department.
  • Partner with engineering leadership to define timelines/milestones, identify/mitigate risks, recommend resource allocation, track progress, resolve dependencies, and resolve bottlenecks.
  • Set clear, actionable development team workloads that deliver epic acceptance criteria that cleanly support initiative-level acceptance criteria, delegate tasks and set deadlines.
  • Monitor epic status and report metrics for management recognition. Clarify metrics and measurable goals to identify work, clarify urgency and importance, follow-up and verify solutions/results, measure success, review for continuous improvement and effective time management.
  • Provide candid and constructive feedback to development team members that maximizes team goals and individual employee potential. Report any issues of concern to managers.
  • Create a motivating and inspiring development team environment with an open communication culture. Act proactively to ensure smooth development team operations and effective collaboration.
  • Drive a culture of continuous improvement. Identify process gaps and/or new areas of focus for continuously improving engineering planning, communication, and execution.

What you’ll need to Succeed:

  • Demonstrates competences and technical proficiencies typically acquired through:
    • BS plus 4+ years of experience OR MS plus 3+ years of experience
  • Considered highly skilled and proficient in discipline; conducts complex, important work under minimal supervision and with wide latitude for independent judgment. 
  • We are looking for a TPM with the experience in one of the following fields: 
    • Data Systems
      • Experience with different database architectures, including but not limited to relational and NoSQL databases, data warehousing and clustered, distributed data stores.
      • Experience with Python libraries for applied data science (Pandas, Plotly, Matplotlib, Dask, TensorFlow).
      • Experience with AWS serverless architectures (Lambda, Batch, ECS Fargate, Glue, Athena).
      • Experience with workflow patterns using directed acyclic graphs (Apache Airflow, AWS Step Functions).
      • Familiarity with machine learning operations and the data and artifacts that they produce.

Bonus Points!

  • Experience with different mapping formats, strategies, and developing pipelines to deploy maps to different environments.

Perks of Being a Full-time Torc’r

Torc cares about our team members and we strive to provide benefits and resources to support their health, work/life balance, and future. Our culture is collaborative, energetic, and team focused. Torc offers:  

  • A competitive compensation package that includes a bonus component and stock options
  • 100% paid medical, dental, and vision premiums for full-time employees  
  • 401K plan with a 6% employer match
  • Flexibility in schedule and generous paid vacation (available immediately after start date)
  • Company-wide holiday office closures
  • AD+D and Life Insurance 
Hiring Range for Job Opening
US Pay Range
$153,200 – $183,800 USD

At Torc, we’re committed to building a diverse and inclusive workplace. We celebrate the uniqueness of our Torc’rs and do not discriminate based on race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, veteran status, or disabilities.

Even if you don’t meet 100% of the qualifications listed for this opportunity, we encourage you to apply. We’re always looking for those that are hungry, humble, and people smart and your unique experience may be a great fit for this role or others.

See more jobs at Torc Robotics

Apply for this job

Posted 11 days ago

Junior Analytics Engineer at HolyWater

Genesis – Kyiv, UA (Remote)
Tags: tableau, terraform, airflow, sql, python

Genesis is hiring a Remote Junior Analytics Engineer at HolyWater

WE SUPPORT UKRAINE

Holy Water condemns russia's war against Ukraine and helps the country. At the start of the full-scale war, we launched the sale of an NFT collection about the events in Ukraine to raise one million dollars for the needs of the Ukrainian army, and we also joined the corporate charitable foundation Genesis for Ukraine. The foundation's team purchases the necessary gear, equipment, and medicine for employees and their relatives defending the country on the front line, and in addition we regularly donate to the Armed Forces of Ukraine.

MEET YOUR FUTURE TEAM!

You will work at Holy Water, a ContentTech startup that creates and publishes books, audiobooks, interactive stories, and video series. We build a synergy between the efficiency of AI and the creativity of writers, helping them inspire tens of millions of users around the world with their content.

HolyWater was founded in 2020 within the Genesis ecosystem. Since then, the team has grown from 6 to 90 specialists, and our apps have repeatedly become leaders in their categories in the US, Australia, Canada, and Europe.

Through our platform, we give any talented writer the chance to reach the multimillion audience of our apps' users and inspire them with their stories. More than 10 million users around the world already use our products.

НАШІ ДОСЯГНЕННЯ ЗА 2023:

1. Наш додаток з інтерактивними історіями 3 місяці ставав топ 1 за завантаженнями у світі у своїй ніші.
2. Наш додаток з книжками, Passion, в грудні став топ 1 в своїй ніші в США та Європі.
3. Ми запустили платформу з відео серіалами на основі наших книжок та зробили перший успішний пілотний серіал.
4. Кількість нових завантажень та виручка зросли майже в 2 рази в порівнянні з 2022.

Основна цінність HolyWater
- це люди, які працюють з нами. Саме тому ми прикладаємо всі зусилля, щоб створити такі умови, де кожен співробітник зможе реалізувати свій потенціал наповну та досягнути найамбітніших цілей.

КУЛЬТУРА КОМПАНІЇ

У своїй роботі команда спирається на шість ключових цінностей: постійне зростання, внутрішня мотивація, завзятість і гнучкість, усвідомленість, свобода та відповідальність, орієнтація на результат.

Зараз ми зосереджені на масштабуванні команди та пошуку людей, які допоможуть вивести наш застосунки на нові висоти. Якщо ви смілива, працьовита, допитлива, самосвідома людина, яка не боїться робити помилки та вчитися на них, давай поспілкуємось!

Зараз ми шукаємо Junior Analytics Engineer, котрий стане частиною команди Data Engineering і буде залучений в побудову дата-платформи та керування даними.

Оскільки компанія сповідує Data Informed підхід до прийняття рішень, якість даних та антикрихкість дата-платформи є важливими пріорітетами в розбудові архітектури проекту і безпосередньо впливає на швидкість розвитку і якість прийняття продуктових рішень. В роботі використовуємо найсучасніші підходи та інструменти a.k.a Modern Data Stack.

YOUR RESPONSIBILITIES WILL INCLUDE:

  • Integrating third-party APIs (Airbyte, Python).
  • Conducting Exploratory Data Analysis.
  • Modeling data in dbt (SQL, Jinja).
  • Building data pipelines in Airflow (Python).
  • Working with various services in the Google Cloud Platform ecosystem.
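
The first two duties above, integrating a third-party API and preparing its output for modeling, can be sketched with a toy, stdlib-only transform. Everything here (the field names, the payload shape) is illustrative, not HolyWater's actual schema:

```python
# Toy version of the "third-party API -> warehouse" transform step:
# flatten nested JSON events, as an API might return them, into flat
# rows ready for loading. Field names are illustrative only.
import json


def flatten(record: dict, prefix: str = "") -> dict:
    """Recursively flatten nested dicts into dot-separated columns."""
    row = {}
    for key, value in record.items():
        col = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, prefix=col + "."))
        else:
            row[col] = value
    return row


payload = json.loads(
    '{"user": {"id": 42, "geo": {"country": "UA"}}, "event": "purchase"}'
)
print(flatten(payload))
# {'user.id': 42, 'user.geo.country': 'UA', 'event': 'purchase'}
```

In a real pipeline, a function like this would run inside an Airflow task after the Airbyte sync, producing rows a dbt model can then reshape.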

WHAT YOU NEED TO JOIN:

  • At least 6 months of commercial experience as a Data Analyst and a desire to grow into Data Engineering.
  • Professional skills in Python and SQL.
  • Experience with GCP or other cloud providers is a plus.
  • Responsibility and proactivity.
  • Attention to detail and the ability to make sense of unfamiliar data.

WHAT WE OFFER:

  • The opportunity to grow and constantly level up your skills: an online library, regular lectures by top-tier speakers, and compensation for conferences, trainings, and seminars.
  • A professional internal community for your career development (Analytics and Data Engineering).
  • Room to implement your own ideas and influence the products.
  • A flexible schedule and the option to work remotely from any safe place in the world, or to visit a comfortable office in Kyiv's Podil district.
  • 20 working days of paid vacation per year and an unlimited number of sick days.
  • Health insurance.
  • The option to consult a psychologist.
  • All the equipment you need for work.
  • We actively use modern tools and technologies such as BigQuery, Tableau, Airflow, Airbyte, Terraform, and dbt. This will give you the chance to work with cutting-edge tooling and sharpen your engineering skills.
  • A culture of open feedback.

SELECTION STAGES:

1. Initial screening. A recruiter asks a few questions (by phone or messenger) to get an impression of your experience and skills before the interview.
2. Test assignment. It confirms your expertise and shows which approaches, tools, and solutions you apply in your work. We do not limit your time and never use candidates' work without an appropriate agreement.
3. Interview with the manager. A comprehensive conversation about your professional competencies and the work of the team you are applying to.
4. Bar raising. For the final interview we invite one of the top managers of the Genesis ecosystem who will not work directly with the candidate. The bar raiser focuses on your soft skills and values, to understand how quickly you can grow together with the company.


Want to become part of a strong team? Send us your résumé.

See more jobs at Genesis

Apply for this job

11d

Senior Project Manager, CQV

CRBCary, NC, Remote
airflowDesign

CRB is hiring a Remote Senior Project Manager, CQV

Job Description

CRB is looking for an energetic, self-motivated individual for the role of CQV Sr. Project Manager. The Commissioning, Qualification & Validation (CQV) Sr. Project Manager will provide cGMP leadership and guidance for the integration and delivery of CQV services for our Life Sciences clients. This leadership and guidance will ensure that our Clients have a Right-the-First-Time (RFT) solution that has mitigated risks and met their compliance and timeline objectives. This position requires a high level of organization, communication, and leadership. The qualified candidate should display excellent interpersonal skills to form strong relationships with internal and external clients. They will represent CRB in a professional manner, assist in winning work, and understand how their strategy directly impacts our collective success.

Responsibilities

  • Develop and formalize an ETOP package that includes flexibility for different Client approaches
  • Supports cGMP compliance-based services with Trade Partners and Clients as defined by the Project Scope 
  • Write, Review and Approve CQV documents following established 21 CFR standards, both internally and externally. Document requests could include: Standard Operating Procedures, Impact Assessments, Risk Assessments, Specifications (URS/FRS/DDS), FATs, SATs, IOQ/PQs, Validation Protocols and Commissioning Test Plans
  • Support onsite and offsite activities, such as: FATs, SATs, Executions and System Walkdowns
  • Prepare and Review reports, both internally and externally, from Trade Partners for completed CQV, CV, CSV and automation activities 
  • Review & Approve User Requirements Specifications (URS), Functional Specifications (FS), Design Specifications (DS), Change Controls, and equipment and process Failure Mode and Effects Analysis (FMEA)
  • Provide investigational & troubleshooting support encountered during execution activities 
  • Provides technical training to Client staffing to enhance speed of startup activities 
  • Collaborate with Market Team Leaders to ensure RFT delivery
  • Ability to plan and execute Smoke / Airflow Visualization Studies for both Iso & Non-Iso spaces 
  • As a subject matter expert, represent these activities in discussions and communications with Clients and regulatory agencies
  • Responsible for the implementation and execution of periodic system reviews and requalification activities, as needed
  • Assess impact to validated status of new systems and changes to existing systems using a quality risk-based approach.
  • Support, train, mentor, and provide guidance to commissioning and validation specialists/engineers and project leaders in the delivery of C/Q/V services for assigned projects
  • Flexible and willing to travel as needed
  • Perform other duties as assigned

 

Qualifications

Qualifications

  • Bachelor’s Degree in Architecture, Engineering, or Construction Management, or similar degree preferred, or equivalent years of relevant industry experience.
  • 8 + years of Commissioning, Qualification, Validation and Compliance experience in the Life Sciences or Regulated Industries; Sound technical knowledge of both US and global regulatory requirements. Proficient in cGMP and SUPAC standards.
  • Demonstrated effective leadership and collaboration skills
  • Experience in collaborating and managing Commissioning, Qualification and Validation deliverables for one or Multiple Projects
  • Excellent organizational, interpersonal, presentation, and communication skills
  • Commitment to technical excellence, as well as creating world-class experiences for our clients and employees
  • Knowledge of US FDA (21 CFR 210, 211, 810) and EU EMEA regulations, ISPE Baseline Guide 5 Commissioning and Qualification, and the ISPE Guideline Science and Risk-based Approach for the Delivery of Facilities, Systems, and Equipment (2011)
  • Experience writing commissioning test plans, IOQ/PQ Protocols, and Validation Protocols
  • Experience using statistical, risk assessment, and process improvement tools.
  • Familiarity with validation tools and processes, including environmental mapping and use of Kaye Validator
  • Interpersonal and leadership skills necessary to communicate clearly and effectively manage qualification/validation activities with all levels of personnel from various disciplines across the organization.

See more jobs at CRB

Apply for this job

12d

Data Engineer Azure

4 years of experience2 years of experienceagileBachelor's degreetableaujirascalaairflowpostgressqloracleDesignmongodbpytestazuremysqljenkinspythonAWS

FuseMachines is hiring a Remote Data Engineer Azure

14d

Machine Learning Platform Architect (US)

SignifydUnited States (Remote); New York City, NY
Bachelor's degreeBachelor degree5 years of experience10 years of experiencescalaairflowsqlDesignjavapythonAWS

Signifyd is hiring a Remote Machine Learning Platform Architect (US)

Who Are You

We are seeking a highly skilled and experienced ML Platform Architect to join our dynamic and growing data platform team. As an ML Platform Architect, you will play a crucial role in strengthening and expanding the core of our data products. We want you to help us scale our business, drive data-driven decisions, and contribute to our overall data strategy. You will work alongside talented data platform engineers to envision how all the data elements from multiple sources should fit together, and then execute that plan. The ideal candidate must: 

  • Effectively communicate complex data problems by tailoring the message to the audience and presenting it clearly and concisely. 
  • Balance multiple perspectives, disagree, and commit when necessary to move key company decisions and critical priorities forward.
  • Have a profound comprehension of data quality, governance, and analytics.
  • Have the ability to work independently in a dynamic environment and proactively approach problem-solving.
  • Be committed to driving positive business outcomes through expert data handling and analysis.
  • Be an example for fellow engineers by showcasing customer empathy, creativity, curiosity, and tenacity.
  • Have strong analytical and problem-solving skills, with the ability to innovate and adapt to fast-paced environments.
  • Design and build clear, understandable, simple, clean, and scalable solutions.

What You'll Do

  • Modernize Signifyd’s Machine Learning (ML) Platform to scale for resiliency, performance, and operational excellence working closely with Engineering and Data Science teams across Signifyd’s R&D group.
  • Create and deliver a technology roadmap focused on advancing our data processing capabilities, which will support the evolution of our real-time data processing and analysis capabilities.
  • Work alongside ML Engineers, Data Scientists, and other Software Engineers to develop innovative big data processing solutions for scaling our core product for eCommerce fraud prevention.
  • Take full ownership of significant portions of our data processing products, including collaborating with stakeholders on machine learning models, designing large-scale data processing solutions, creating additional processing facets and mechanisms, and ensuring the support of low-latency, high-quality, high-scale decisioning for Signifyd’s flagship product.
  • Architect, deploy, and optimize Databricks solutions on AWS, developing scalable data processing solutions to streamline data operations and enhance data solution deployments.
  • Implement data processing solutions using Spark, Java, Python, Databricks, Tecton, and various AWS services (S3, Redshift, EMR, Athena, Glue).
  • Mentor and coach fellow engineers on the team, fostering an environment of growth and continuous improvement.
  • Identify and address gaps in team capabilities and processes to enhance team efficiency and success.

What You'll Need

  • Ideally, 10+ years of experience in data engineering, including at least 5 years as a data or machine learning architect or lead, having successfully navigated the challenges of working with large-scale data processing systems.
  • Deep understanding of data processing, comfortable working with multi-terabyte datasets, and skilled in high-scale data ingestion, transformation, and distributed processing, with strong Apache Spark or Databricks experience.
  • Experience in building low-latency, high-availability data stores for use in real-time or near-real-time data processing with programming languages such as Python, Scala, Java, or JavaScript/TypeScript, as well as data retrieval using SQL and NoSQL.
  • Hands-on expertise in data technologies with proficiency in technologies such as Spark, Airflow, Databricks, AWS services (SQS, Kinesis, etc.), and Kafka. Understand the trade-offs of various architectural approaches and recommend solutions suited to our needs.
  • Experience with the latest technologies and trends in Data, ML, and Cloud platforms.
  • Demonstrable ability to lead and mentor engineers, fostering their growth and development. 
  • You have successfully partnered with Product, Data Engineering, Data Science and Machine Learning teams on strategic data initiatives.
  • Commitment to quality, you take pride in delivering work that excels in data accuracy, performance and reliability, setting a high standard for the team and the organization.

#LI-Remote

Benefits in our US offices:

  • Discretionary Time Off Policy (Unlimited!)
  • 401K Match
  • Stock Options
  • Annual Performance Bonus or Commissions
  • Paid Parental Leave (12 weeks)
  • On-Demand Therapy for all employees & their dependents
  • Dedicated learning budget through Learnerbly
  • Health Insurance
  • Dental Insurance
  • Vision Insurance
  • Flexible Spending Account (FSA)
  • Short Term and Long Term Disability Insurance
  • Life Insurance
  • Company Social Events
  • Signifyd Swag

We want to provide an inclusive interview experience for all, including people with disabilities. We are happy to provide reasonable accommodations to candidates in need of individualized support during the hiring process.

Signifyd provides a base salary, bonus, equity and benefits to all its employees. Our posted job may span more than one career level, and offered level and salary will be determined by the applicant’s specific experience, knowledge, skills, and abilities, as well as internal equity and alignment with market data.

USA Base Salary Pay Range
$230,000–$250,000 USD

See more jobs at Signifyd

Apply for this job

14d

Strong Junior Product Analyst at HolyWater

GenesisUkraine Remote
tableauairflowsqlB2CFirebasepythonAWS

Genesis is hiring a Remote Strong Junior Product Analyst at HolyWater

WE SUPPORT UKRAINE

Holy Water condemns russia's war against Ukraine and helps the state. At the start of the full-scale war we launched the sale of an NFT collection about the events in Ukraine to raise 1 million dollars for the needs of the Ukrainian army, and we also joined the Genesis for Ukraine corporate charitable fund. The fund's team purchases essential gear, equipment, and medicine for employees and their relatives defending the country on the front line, and we donate to the Armed Forces of Ukraine on an ongoing basis.

MEET YOUR FUTURE TEAM!

You will be working at Holy Water, a ContentTech startup that creates and publishes books, audiobooks, interactive stories, and video series. We build a synergy between the efficiency of AI and the creativity of writers, helping them inspire tens of millions of users around the world with their content.

HolyWater was founded in 2020 within the Genesis ecosystem. Since then the team has grown from 6 to 90 specialists, and our apps have repeatedly become leaders in their categories in the US, Australia, Canada, and Europe.

Through our platform, we give any talented writer the opportunity to reach the multi-million audience of our apps and inspire them with their stories. More than 10 million users around the world already use our products.

OUR ACHIEVEMENTS IN 2023:

1. Our interactive-stories app was the global top 1 by downloads in its niche for 3 months.
2. Our book app, Passion, became top 1 in its niche in the US and Europe in December.
3. We launched a video-series platform based on our books and produced our first successful pilot series.
4. New downloads and revenue nearly doubled compared to 2022.

HolyWater's core value is the people who work with us. That is why we put every effort into creating conditions in which each employee can realize their potential to the fullest and achieve the most ambitious goals.

COMPANY CULTURE

In its work the team relies on six key values: constant growth, intrinsic motivation, grit and flexibility, self-awareness, freedom and responsibility, and focus on results.

The team is now looking for a Strong Junior Product Analyst to become a new member of the analytics team.

YOUR RESPONSIBILITIES WILL INCLUDE:

  • Generating growth hypotheses and launching A/B tests together with the product team.
  • Supporting analytical processes during A/B testing to optimize product decisions.
  • Finding growth points in product and marketing.
  • Collaborating with product managers, developers, and marketers to directly influence the product.
  • Automating report preparation for effective monitoring of metrics.

WHAT YOU NEED TO JOIN:

  • At least 1 year of experience as a Data Analyst / Scientist.
  • Experience with column-oriented storage (BigQuery, AWS Athena, etc.).
  • Professional-level SQL skills.
  • Experience building and visualizing data with BI tools (Tableau).
  • Experience with Amplitude, Firebase, and AppsFlyer.
  • Responsibility and proactivity.
  • Project-oriented and logical thinking.

NICE TO HAVE:

  • An understanding of Python fundamentals for analytics.
  • Experience with Google Cloud Platform.
  • Experience with B2C mobile applications.

WHAT WE OFFER:

  • You will be part of a close-knit team of professionals, where you can exchange knowledge and experience and receive support and advice from colleagues.
  • A flexible schedule and the option to work remotely from any safe place in the world.
  • The option to visit the office in Kyiv's Podil district. In the offices you don't need to worry about the routine: breakfasts, lunches, plenty of snacks and fruit, lounge areas, massages, and other perks await you.
  • 20 working days of paid vacation per year and an unlimited number of sick days.
  • Health insurance.
  • The option to consult a psychologist.
  • All the equipment you need for work.
  • We actively use modern tools and technologies such as BigQuery, Tableau, Airflow, Airbyte, and dbt. This will give you the chance to work with cutting-edge tooling and expand your analytics skills.
  • An online library, regular lectures by top-tier speakers, and compensation for conferences, trainings, and seminars.
  • A professional internal community for your career development.
  • A culture of open feedback.

SELECTION STAGES:

1. Initial screening. A recruiter asks a few questions (by phone or messenger) to get an impression of your experience and skills before the interview.
2. Test assignment. It confirms your expertise and shows which approaches, tools, and solutions you apply in your work. We do not limit your time and never use candidates' work without an appropriate agreement.
3. Interview with the manager. A comprehensive conversation about your professional competencies and the work of the team you are applying to.
4. Bar raising. For the final interview we invite one of the top managers of the Genesis ecosystem who will not work directly with the candidate. The bar raiser focuses on your soft skills and values, to understand how quickly you can grow together with the company.


If you are ready to take on the challenge and join our team, we look forward to your résumé!

See more jobs at Genesis

Apply for this job

14d

Data Scientist Specialist

ExperianBrasília, Brazil, Remote
nosqlairflowgitdockerlinuxAWS

Experian is hiring a Remote Data Scientist Specialist

Job Description

Area: DA-Regions
Subarea: Governance & Risk Management

We build solutions to improve our products through data analysis techniques, knowledge discovery, business intelligence, and machine learning models for fraud prevention in our products.

What will your main deliverables be?

• Creating and maintaining data pipelines for the Lakehouse;
• Data analysis and visualization for Business Intelligence;
• Creating and deploying Machine Learning models;
• Distributed processing using Big Data tools.

Qualifications

What we are looking for in you!

• Programming experience, mainly with Python;
• Knowledge of data structures, algorithms, object orientation, and design patterns;
• Experience with NoSQL databases;
• Experience with stream-processing tools;
• Experience with Airflow or another workflow tool;
• Experience with AWS (EMR, SageMaker, Athena, Glue, S3);
• Experience with Spark;
• Experience with TensorFlow;
• Experience with Linux, Git, and Docker;
• Intermediate English.

       

See more jobs at Experian

Apply for this job

16d

Senior Data Engineer

airflowpostgressqloracleDesigndockermysqlkubernetespythonAWS

ReCharge Payments is hiring a Remote Senior Data Engineer

Who we are

In a world where acquisition costs are skyrocketing, funding is scarce, and ecommerce merchants are forced to do more with less, the most innovative DTC brands understand that subscription strategy is business strategy.

Recharge is simplifying retention and growth for innovative ecommerce brands. As the #1 subscription platform, Recharge is dedicated to empowering brands to easily set up and manage subscriptions, create dynamic experiences at every customer touchpoint, and continuously evaluate business performance. Powering everything from no-code customer portals, personalized offers, and customizable bundles, Recharge helps merchants seamlessly manage, grow, and delight their subscribers while reducing operating costs and churn. Today, Recharge powers more than 20,000 merchants serving 90 million subscribers, including brands such as Blueland, Hello Bello, CrunchLabs, Verve Coffee Roasters, and Bobbie—Recharge doesn’t just help you sell products, we help build buyer routines that last.

Recharge is recognized on the Technology Fast 500, awarded by Deloitte (for the 3rd consecutive year), and is Great Place to Work Certified.

Overview

The centralized Data and Analytics team at Recharge delivers critical analytic capabilities and insights for Recharge’s business and customers.

As a Senior Data Engineer, you will build scalable data pipelines and infrastructure that power internal business analytics and customer-facing data products. Your work will empower data analysts to derive deeper strategic insights from our data, and will enable developers to build applications that surface data insights directly to our merchants.

What you’ll do

• Build data pipeline, ELT, and infrastructure solutions to power internal data analytics/science and external, customer-facing data products.

• Create automated monitoring, auditing, and alerting processes that ensure data quality and consistency.

• Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.

• Design, develop, implement, and optimize existing ETL processes that merge data from disparate sources for consumption by data analysts, business owners, and customers.

• Seek ways to continually improve the operations, monitoring, and performance of the data warehouse.

• Influence and communicate with all levels of stakeholders including analysts, developers, business users, and executives.

• Live by and champion our values: #day-one, #ownership, #empathy, #humility.
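
The monitoring-and-auditing bullet above can be sketched as a toy, stdlib-only data-quality check. The rule names and record fields are illustrative, not Recharge's actual tooling:

```python
# Toy data-quality monitor: validate rows against simple named rules
# and collect violation messages for alerting. Names are illustrative.
from typing import Callable

Rule = tuple[str, Callable[[dict], bool]]

RULES: list[Rule] = [
    ("amount_non_negative", lambda r: r["amount"] >= 0),
    ("currency_present", lambda r: bool(r.get("currency"))),
]


def audit(rows: list[dict], rules: list[Rule] = RULES) -> list[str]:
    """Return human-readable violation messages; an empty list means clean."""
    violations = []
    for i, row in enumerate(rows):
        for name, check in rules:
            if not check(row):
                violations.append(f"row {i}: failed {name}")
    return violations


orders = [
    {"amount": 19.99, "currency": "USD"},
    {"amount": -5.00, "currency": ""},
]
print(audit(orders))  # the second row fails both rules
```

In production, a check like this would run as a scheduled pipeline task, with non-empty results routed to an alerting channel rather than printed.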

What you’ll bring

• Typically, 5+ years of experience in a data-engineering role (Data Engineer, Data Platform Engineer, Analytics Engineer, etc.) with a track record of building scalable data pipeline, transformation, and platform solutions.

• 3+ years of hands-on experience designing and building data pipelines and models for ingesting, transforming, and delivering large amounts of data from multiple sources into a dimensional (star schema) data warehouse or data lake.

• Experience with a variety of data warehouse, data lake, and enterprise data management platforms (Snowflake {preferred}, Redshift, Databricks, MySQL, Postgres, Oracle, RDS, AWS, GCP)

• Experience building data pipelines, models, and infrastructure powering external, customer-facing (in addition to internal, business-facing) analytics applications.

• Solid grasp of data warehousing methodologies like Kimball and Inmon

• Experience working with a variety of ETL tools (Fivetran, dbt, Python, etc.)

• Experience with workflow orchestration engines such as Airflow and Cloud Composer

• Hands-on experience with data infrastructure tools like Kubernetes and Docker

• Expert proficiency in SQL

• Strong Python proficiency

• Experience with ML Operations is a plus.

Recharge | Instagram | Twitter | Facebook

Recharge Payments is an equal opportunity employer. In addition to EEO being the law, it is a policy that is fully consistent with our principles. All qualified applicants will receive consideration for employment without regard to status as a protected veteran or a qualified individual with a disability, or other protected status such as race, religion, color, national origin, sex, sexual orientation, gender identity, genetic information, pregnancy or age. Recharge Payments prohibits any form of workplace harassment.

Transparency in Coverage

This link leads to the Anthem Blue Cross machine-readable files that are made available in response to the federal Transparency in Coverage Rule and includes network negotiated rates for all items and services; allowed amounts for OON items, services and prescription drugs; and negotiated rates and historical prices for network prescription drugs (delayed). EIN 80-6245138. This link leads to the Kaiser machine-readable files.

#LI-Remote

See more jobs at ReCharge Payments

Apply for this job

17d

Python Engineer, Data Group

WoltStockholm, Sweden, Remote
airflowkubernetespython

Wolt is hiring a Remote Python Engineer, Data Group

Job Description

Data at Wolt

As the scale of Wolt has rapidly grown, we are introducing new users to our data platform every day and want this to become a coherent and streamlined experience for all users, whether they’re Analysts or Data Scientists working with our data, or teams bringing new data to the platform from their applications. We aim both to provide new platform capabilities across batch, streaming, orchestration, and data integration to serve our users' needs, and to build an intuitive interface for them to solve their use cases without having to learn the details of the underlying tools.

In the context of this role we are hiring an experienced Senior Software Engineer to provide technical leadership and individual contribution in one of the following workstreams:

Data Governance

Wolt’s Data Group has already developed initial foundational tooling in the areas of data management, security, auditing, data catalog, and quality monitoring, but through your technical contributions you will ensure our Data Governance tooling is state of the art. You’ll be improving the current Data Governance platform, making sure it can be further integrated with the rest of the Data Platform and Wolt Services in a scalable, secure, compliant way, without significant disruptions to the teams.

Data Experience

We want to ensure our Analysts, Data Scientists, and Engineers can discover, understand, and publish high-quality data at scale. We have recently released a new data platform tool which enables simple, yet powerful creation of workflows via a declarative interface. You will help us ensure our users succeed in their work with effective and polished user experiences by developing our internal user-facing tooling and curating our documentation to the highest standards. And what's best, you get to work closely with excited users to get continuous feedback about released features while supporting and onboarding them to new workflows.

Data Lakehouse

We recently started this workstream to manage data integration, organization, and maintenance of our new Iceberg-based data lakehouse architecture. Together, we build and maintain ingestion pipelines to efficiently gather data from diverse sources, ensuring seamless data flow. We create and manage workflows to transform raw data into structured formats, guaranteeing data quality and accessibility for analytics and machine learning purposes.

When you join, we’ll match you with one of these workstreams based on our needs and your skills, experience, and preferences.

How we work

Our teams have a lot of autonomy and ownership in how they work and solve their challenges. We value collaboration, learning from each other, and helping each other out to achieve the team’s goals. We create an environment of trust, in which everyone’s ideas are heard and where we challenge each other to find the best solutions. We have empathy towards our users and other teams. Even though we’re working in a mostly remote environment these days, we stay connected and don’t forget to have fun together building great software!

Our tech stack

Our primary programming language of choice is Python. We deploy our systems in Kubernetes and AWS. We use Datadog for observability (logging and metrics). We have built our data warehouse on top of Snowflake and orchestrate our batch processes with Airflow and Dagster. We are heavy users of Kafka and Kafka Connect. Our CI/CD pipelines rely on GitHub Actions and Argo Workflows.

            Qualifications

            The vast majority of our services, applications and data pipelines are written in Python, so several years of having shipped production quality software in high throughput environments written in Python is essential. You should be very comfortable with typing, dependency management, unit-, integration- and end-to-end tests. If you believe that software isn’t just a program running on a machine, but the solution to someone’s problem, you’re in the right place.

            Previous experience planning and executing complex projects that touch multiple teams and stakeholders across a whole organization is a big plus. Good communication and collaboration skills are essential, and you shouldn't shy away from problems, but be able to discuss them constructively with your team and the Wolt Product team at large.

            Familiarity with parts of our tech stack is definitely a plus, but we hire for attitude and the ability to learn over knowledge of any specific technology.

            The tools we are building inside the data platform ultimately serve our many stakeholders across the whole company, whether they are analysts, data scientists, or engineers in other teams that produce or consume data.

            We want all of our users to love the tools we're building, and that is why we want you to focus on building intuitive, user-friendly applications that enable everyone to use and work with data at Wolt.

            See more jobs at Wolt

            Apply for this job

            21d

            Architect, Machine Learning, Field CTO

            scalaairflowsqlDesignpython

            snowflakecomputing is hiring a Remote Architect, Machine Learning, Field CTO

            Build the future of data. Join the Snowflake team.

            We’re at the forefront of the data revolution, committed to building the world’s greatest data and applications platform. Our ‘get it done’ culture allows everyone at Snowflake to have an equal opportunity to innovate on new ideas, create work with a lasting impact, and excel in a culture of collaboration.

            Our Sales Engineering organization is seeking an Architect, Machine Learning to join our Enterprise Field CTO team: someone who can provide leadership in working with both technical and business executives on the design and architecture of the Snowflake Cloud Data Platform as a critical component of their enterprise data architecture and overall machine learning ecosystem.

            In this role you will work with sales teams, product management, and technology partners to leverage best practices and reference architectures that highlight Snowflake's Cloud Data Platform as a core enabling platform for emerging data science workloads throughout an organization.

            As an Architect focused on Machine Learning you must share our passion and vision in helping our customers and partners drive faster time to insight through Snowflake’s Cloud Data Platform, thrive in a dynamic environment, and have the flexibility and willingness to jump in and get things done. You are equally comfortable in both a business and technical context, interacting with executives and talking shop with technical audiences.

            IN THIS ROLE YOU WILL GET TO:

            • Apply your data science architecture expertise while presenting Snowflake technology and vision to executives and technical contributors at strategic prospects, customers, and partners
            • Work with our product management, partners, and sales teams in order to drive innovation in our Cloud Data Platform 
            • Partner with sales teams and channel partners to understand the needs of our customers, strategize on how to navigate and accelerate winning sales cycles, provide compelling value-based enterprise architecture deliverables and working sessions to ensure customers are set up for success operationally, and support strategic enterprise pilots / proof-of-concepts
            • Provide architectural guidance & hands-on support for strategic enterprise pilots / proof-of-concepts around Snowflake's AI & ML capabilities
            • Collaborate closely with our Product team to influence Snowflake's product roadmaps based on field team and customer feedback
            • Partner with Product Marketing teams to spread awareness and support pipeline building via customer roundtables, conferences, events, blogs, webinars, and whitepapers

            ON DAY ONE, WE WILL EXPECT YOU TO HAVE:

            • 3+ years of experience building and deploying machine learning solutions in the cloud 
            • Deep technical hands on expertise within Data Science tools and ecosystem 
            • 2+ years of working with cloud native ML tools 
            • Strong presentation skills to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos
            • Working knowledge of data engineering technologies and tools (dbt, Airflow, etc.)
            • Deep knowledge of data science and ML fundamentals
            • Working knowledge of deep learning concepts, techniques, and tools (PyTorch, TensorFlow, etc.)
            • 2+ years of experience building and deploying ML/data engineering applications and solutions on Spark
            • Advanced knowledge of Python and popular third-party packages (pandas, NumPy, TensorFlow, scikit-learn, PyTorch, etc.)
            • Introductory familiarity with LLM developer tools such as LangChain and LlamaIndex
            • Working knowledge of SQL, R, Scala, and/or other scripting languages
            • Industry focus a plus (Education, Federal, Financial Services, Healthcare & Life Sciences, Insurance, Adv. Media, Retail CPG, Technology, and Telecom)
            • Bachelor's degree required; a Master's degree in computer science, engineering, mathematics, or related fields, or equivalent experience, preferred

            Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. 

            How do you want to make your impact?

            Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.

            See more jobs at snowflakecomputing

            Apply for this job

            22d

            Senior Data Engineer (PySpark) - (GS)

            ITScoutLATAM, AR Remote
            Bachelor's degreeairflowsqlDesignmobilepythonAWS

            ITScout is hiring a Remote Senior Data Engineer (PySpark) - (GS)

            ⚠️Open position only for people residing in Costa Rica, México, Argentina, and Brazil.⚠️


            Our client builds smart technology solutions through the combination of artificial intelligence, mobile, and web development for companies in the United States, Canada & Latam. It's a technology company headquartered in Costa Rica, with operations throughout LATAM. Their core focus is building intelligent tech solutions to help their customers be more efficient in optimizing internal digital processes.

            About the job Senior Data Engineer (PySpark)

            Senior Data Engineer

            We are seeking a skilled Data Engineer with expertise in Python, PySpark, and Apache Airflow. This role requires proficiency in AWS services such as Redshift and Databricks, as well as strong SQL skills for data manipulation and querying.

            Responsibilities:

            • Design, develop, and maintain scalable data pipelines and workflows using Python and PySpark.
            • Implement and optimize ETL processes within Apache Airflow for data ingestion, transformation, and loading.
            • Utilize AWS services such as Redshift and Databricks for data storage, processing, and analysis.
            • Write efficient SQL queries and optimize database performance for large-scale datasets.
            • Implement version control and continuous integration using GitLab CI for maintaining codebase integrity.
            • Follow Test-Driven Development (TDD) practices to ensure code reliability and maintainability.


            Requirements:

            • Bachelor's degree in Computer Science, Engineering, or a related field.
            • 4 to 7 years of professional experience in data engineering or a related role.
            • Proficiency in Python and PySpark for building data pipelines and processing large datasets.
            • Hands-on experience with Apache Airflow for orchestrating complex workflows and scheduling tasks.
            • Strong knowledge of AWS services, including Redshift and Databricks, for data storage and processing.
            • Advanced SQL skills for data manipulation, querying, and optimization.
            • Experience with version control and CI systems such as Git and GitLab CI for managing codebase changes.
            • Familiarity with Test-Driven Development (TDD) practices and writing unit tests for data pipelines.
            • Excellent problem-solving skills and attention to detail.
            • Strong communication and collaboration skills to work effectively within a team environment.
            • Certification in AWS or related technologies is preferred but not required.

            English: B2+ proficiency required.


            See more jobs at ITScout

            Apply for this job

            22d

            Senior Data Engineer - (GS)

            ITScoutLATAM, AR Remote
            agiletableauairflowsqlDesignmobileapipythonAWS

            ITScout is hiring a Remote Senior Data Engineer - (GS)

            ⚠️Only available for #residents of #Latinamerica⚠️


            Our client builds smart technology solutions through the combination of artificial intelligence, mobile, and web development for companies in the United States, Canada & Latam. It's a technology company headquartered in Costa Rica, with operations throughout LATAM. Their core focus is building intelligent tech solutions to help their customers be more efficient in optimizing internal digital processes.

            About the job Senior Data Engineer

            How will you make an impact?

            • Build and maintain multiple data pipelines to ingest new data sources (API and file-based) and support products used by both external users and internal teams.
            • Optimize by building tools to evaluate and automatically monitor data quality, develop automated scheduling, testing, and distribution of feeds.
            • Work with our data science and product management teams to design, rapid prototype, and productize new data product ideas and capabilities.
            • Work with the data engineering team to design new data pipelines and optimize the existing data pipelines.
            • Conquer complex problems by finding simple, efficient approaches, with a focus on the reliability, scalability, quality, and cost of our platforms.
            • Build processes supporting data transformation, data structures metadata, and workload management.
            • Collaborate with the team to perform root cause analysis and audit internal and external data and processes to help answer specific business questions.
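The automated data-quality monitoring mentioned above can be sketched in a few lines of plain Python; the null-rate check, column handling, and threshold are illustrative assumptions, not the client's actual tooling:

```python
def null_rate(rows: list[dict], column: str) -> float:
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)


def check_feed(rows: list[dict], max_null_rate: float = 0.01) -> list[str]:
    """Return one alert message per column breaching the threshold."""
    columns = {c for r in rows for c in r}
    return [
        f"{c}: null rate {null_rate(rows, c):.0%} exceeds {max_null_rate:.0%}"
        for c in sorted(columns)
        if null_rate(rows, c) > max_null_rate
    ]


rows = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": None}]
print(check_feed(rows))  # flags the 50% null rate on "email"
```

A scheduler such as Airflow would run checks like this after each load and route the alert list to monitoring.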

            What will you bring to us?

            • 5+ years of professional dimensional data warehousing/data modeling and big data experience
            • Strong skills in writing complex, highly optimized SQL queries across large volumes of data
            • Experience working directly with data analytics to bridge business requirements with data engineering.
            • Experience with AWS infrastructure
            • Proven experience working with Snowflake and Airflow in a production environment
            • Strong SQL skills with experience in query optimization and performance tuning
            • Proficiency in Python programming language for data processing and automation tasks.
            • Experience with dbt (data build tool) for data transformation and modeling
            • Excellent troubleshooting and problem-solving skills
            • Ability to operate in an agile, entrepreneurial start-up environment, and prioritize
            • Excellent communication and teamwork, and a passion for learning
            • Curiosity and passion for data, visualization, and solving problems
            • Willingness to question the validity and accuracy of data and assumptions

            Preferred Qualifications:

            • Experience with Redshift, Snowflake, or other MPP databases is a plus.
            • Knowledge of ETL/ELT tools like Informatica, IBM DataStage, or SaaS ETL tools is a plus.
            • Experience with Tableau or other reporting tools is a plus.


            See more jobs at ITScout

            Apply for this job

            22d

            Sr. Technical Program Manager, Data Platform

            agiletableaujiraterraformairflowsketchsqlRabbitMQgitc++AWS

            hims & hers is hiring a Remote Sr. Technical Program Manager, Data Platform

            Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

            Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

            About the job:

            We're looking for an energetic and experienced Senior Technical Program Manager to join our Data Platform Engineering team. Our team is responsible for enabling the H&H business (Product, Analytics, Operations, Finance, Data Science, Machine Learning, Customer Experience, Engineering) by providing a platform with a rich set of data and tools to leverage. 

            As a senior TPM within the Data Platform team, you’ll work closely with our Product and Engineering teams to understand their roadmaps, architecture, and data. You’ll also work with our consumers of the data platform to better understand their data needs. You’ll take your passion for working with people and leveraging your technical skills to move quickly, efficiently, and decisively to connect the dots and help our team deliver.

            You Will:

            • Build strong cross-functional relationships to understand our product data and the needs/uses of data
            • Create requirements and technical specifications
            • Track and manage project risks, dependencies, status, and deliverable timelines
            • Help drive Data Platform strategies
            • Work with stakeholders to understand requirements and negotiate solutions and timelines
            • Communicate and make sure the right people have the correct information in time
            • Work within our Data Platform Engineering team to help build out ticketed work and provide details to translate requirements and benefits to that work
            • Ensure the highest business value is being delivered to our stakeholders
            • Establish mechanisms to optimize team effectiveness
            • Lead prioritization meetings and status meetings with stakeholders at a regular cadence
            • Collaborate with other technical program managers to highlight dependencies with different domains
            • Bring a bias for action, a sense of urgency, and attention to detail that make you someone your team can instinctively trust and rally behind
            • Drive continuous process improvements and best practices to create a robust, predictable, priority-driven culture
            • Collaborate with legal to ensure data privacy and compliance are followed and implemented
            • Organize and facilitate daily stand-up, sprint planning, sprint retrospectives, and backlog grooming sessions

            You Have:

            • 8+ years of experience as a data-oriented Technical Program Manager or Technical Product Manager, or in a lead capacity
            • Bachelor's degree in Computer Science, Engineering, or related field, or relevant years of work experience
            • Experience working with Data Platform Engineering teams to ship scalable data products and technical roadmaps
            • Previous experience building data platforms on the cloud using Databricks or Snowflake
            • Proficiency in Jira or other project management tools
            • Knowledge of modern data stacks like Airflow, Databricks, Google BigQuery, dbt, Fivetran
            • Ability to understand different data domains and technical requirements
            • Experience collaborating with different stakeholders such as Analytics, ML, Finance, Product, Marketing, Operations, and Customer Experience teams
            • Solid understanding of data pipelines
            • Knowledge of SQL to independently investigate and test datasets, perform data validation, sketch solutions, and create basic proofs of concept
            • Experience working with management to define and measure KPIs and other operating metrics
            • Extensive experience working on Data Security and Data Governance initiatives
            • A foundational understanding of Amazon Web Services (AWS) or Google Cloud Platform (GCP)
            • Understanding of SDLC and Agile frameworks
            • Strong project management skills with attention to detail and experience in managing multiple projects and meeting ongoing and overlapping deadlines 
            • Bias towards over-communication
            • Team player, collaborative, and positive attitude
            • Strong leadership and communication skills
            • Excellent writing, oral, and presentation skills
            • Ability to influence without authority
            • Passion for operational excellence, attention to detail, and a demonstrated ability to deliver results in a fast-paced, high-growth environment
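The SQL-based dataset investigation and validation listed above can be sketched with Python's built-in sqlite3; the table, columns, and invariants here are invented purely for illustration:

```python
import sqlite3

# In-memory database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id TEXT PRIMARY KEY, user_id TEXT, amount_cents INTEGER);
    INSERT INTO orders VALUES ('o1', 'u1', 500), ('o2', 'u1', -20), ('o3', NULL, 900);
""")

# A basic validation query: rows violating simple invariants.
bad = conn.execute("""
    SELECT order_id FROM orders
    WHERE user_id IS NULL OR amount_cents < 0
    ORDER BY order_id
""").fetchall()

print(bad)  # [('o2',), ('o3',)]
```

Against a real warehouse, the same query shape (minus the fixture setup) becomes a quick proof of concept for a data-quality check.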

            Nice To Have:

            • Experience working in healthcare 
            • Previous working experience at startups
            • A basic understanding of data streaming technologies (e.g., Kafka, RabbitMQ), Git, Atlan, and Terraform is a big plus 
            • Working experience with BI tools like Tableau and Looker

            Our Benefits (there are more but here are some highlights):

            • Competitive salary & equity compensation for full-time roles
            • Unlimited PTO, company holidays, and quarterly mental health days
            • Comprehensive health benefits including medical, dental & vision, and parental leave
            • Employee Stock Purchase Program (ESPP)
            • Employee discounts on hims & hers & Apostrophe online products
            • 401k benefits with employer matching contribution
            • Offsite team retreats

            #LI-Remote

            Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

            The actual amount will take into account a range of factors that are considered in making compensation decisions including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

            Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors. We don’t ever want the pay range to act as a deterrent from you applying!

            An estimate of the current salary range for US-based employees is
            $150,000 to $185,000 USD

            We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

            Hims is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, orientation, national origin, age, disability, genetics or any other basis forbidden under federal, state, or local law. Hims considers all qualified applicants in accordance with the San Francisco Fair Chance Ordinance.

            Hims & Hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at accommodations@forhims.com. Please do not send resumes to this email address.

            For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

            See more jobs at hims & hers

            Apply for this job

            23d

            Security Governance Partner - Machine Learning/Data Science, Cash App

            SquareSan Francisco, CA, Remote
            kotlintableauairflowsqlDesignjavakubernetespythonAWS

            Square is hiring a Remote Security Governance Partner - Machine Learning/Data Science, Cash App

            Job Description

            Cash Security is a security-enabling engineering organization focused on scaling security as a discipline through innovation. As a security team, Security Governance develops, implements, and promotes frameworks and standards aimed at securing our customers' data, giving privacy and security considerations a voice across the organization, and simplifying Cash's security-related regulatory and compliance obligations.

            Governance Partners act as a bridge between the Cash App functional area they support and leadership across security, engineering, and compliance teams to drive security enablement. The Security Governance Partner for Cash App Machine Learning and Data Science (ML/DS) works directly with the teams that build Cash App's machine learning pipelines and infrastructure, and with Cash data scientists, to identify and communicate constraints and evaluate potential solutions, while partnering closely with ML/DS Security Engineering to communicate requirements and shape the security posture of the ML/DS organization.

            You will:

            • Act as a security domain expert in partnership with Compliance, Legal, and Engineering
            • Collaborate deeply across roles and functions, with Security Engineering, Machine Learning Engineering and Modeling, and Data Science/Business Intelligence
            • Identify, prioritize and balance security efforts with other objectives
            • Help Cash automate governance and compliance functions and develop reusable tools for common tasks using scripting languages like Python or data tools like Prefect
            • Participate in technical design discussions, evaluate security properties of systems and services, drive risk decisions, and influence technical architecture
            • Understand challenges and roadblocks to achieving the desired security posture, and push requirements to Security Engineering to drive long-term change
            • Develop, implement and promote security standards and frameworks
            • Interpret and communicate security and compliance constraints to key stakeholders
            • Monitor applicable changes to security and privacy related laws, regulations, and industry standards, with an eye towards Generative AI and other forward-looking technologies

            Qualifications

            You have:

            • 3+ years of experience leading projects or programs in a security environment
            • 5+ years working in a security-focused role in a technology-heavy industry
            • Proficiency with at least one programming language (e.g. Python, Kotlin, Java)
            • Conceptual understanding of machine learning and data science tools and processes
            • Solid technical background with cloud computing architectures and security patterns
            • Ability to drive alignment and change in a matrixed environment with minimal supervision
            • Boundless curiosity, persistence and a grounded approach to getting things done
            • Process orientation and an efficiency mindset to keep the organization unblocked and accountable to security
            • Working knowledge of one or more relevant compliance regulations such as SEC/FINRA, CCPA, CPRA, GDPR, PCI DSS, SOX

            Tech stack we use and teach:

            • Java and Kotlin 
            • Python
            • AWS, GCP, and Kubernetes
            • SQL, Snowflake
            • Apache Spark
            • DynamoDB, Kafka, Apache Beam and Google DataFlow
            • Tableau, Airflow, Looker, Mode, Prefect
            • Tecton
            • Jupyter notebooks 

            See more jobs at Square

            Apply for this job