airflow Remote Jobs

102 Results

+30d

Principal Software Engineer

Procore Technologies | US - Remote, Austin, TX
agile, Master’s Degree, scala, nosql, airflow, Design, scrum, java, docker, postgresql, kubernetes, jenkins, python

Procore Technologies is hiring a Remote Principal Software Engineer

Job Description

Procore’s Business Systems Technology group is looking for a Principal Software Engineer to elevate our business systems technology landscape, enhance scalability, drive operational excellence, and enable efficient growth for the business.

 

As a Principal Software Engineer, you’ll use your expert-level technical skills to craft innovative solutions while influencing and mentoring other technical leaders. You’ll collaborate with cross-functional teams and play a pivotal role in designing, developing, and optimizing business systems, platforms, services, integrations, and transactional data across diverse domains including finance, accounting, e-commerce, billing, payments, expenses, tax, and talent. To be successful in this role, you’re passionate about domain-driven design, systems optimization, event-based integrations, and configurable cloud services, and you bring a strong bias for action and outcomes. If you’re an inspirational technology leader comfortable translating vague problems into pragmatic solutions that open up the boundaries of technical possibilities, we’d love to hear from you!

 

This role is based out of our Austin, Texas office, reports to the VP of Technology, DTS Business Systems, and offers flexibility to work remotely as schedule permits.

 

What you’ll do:

  • Lead the design, development, and implementation of scalable software and data solutions to meet business needs.
  • Optimize performance and scalability of existing systems to support business growth.
  • Architect and implement robust integrations between diverse systems and services.
  • Collaborate with cross-functional teams to define technical strategies, roadmaps, and drive outcome delivery.
  • Contribute to setting standards and development principles across multiple teams and the larger organization.
  • Champion best practices for software development, code reviews, and quality assurance processes.
  • Generate technical documentation and presentations to communicate architectural and design options, and educate development teams and business users.
  • Mentor and guide junior engineers to foster their growth and development.
  • Roughly 40-60% hands-on coding.

 

What we’re looking for:

  • Bachelor’s or Master’s degree in Computer Science or a related field.
  • 10+ years of experience designing and implementing complex systems and business application integrations with SaaS applications (including enterprise integration patterns, middleware frameworks, and SOA web services).
  • 10+ years of demonstrated success in software development and building cloud-based, highly available, and scalable online services or streaming systems.
  • Deep understanding of microservices architecture and containerization technologies (e.g., Docker, Kubernetes, Mesos).
  • Expertise with diverse database technologies such as RDBMS (e.g., PostgreSQL), graph, NoSQL (document, columnar, key-value), and Snowflake.
  • Strength in the majority of commonly used data technologies and languages, such as Python, Java, Go or Scala, Kafka, Spark, Flink, Airflow, Splunk, Datadog, Jenkins, or similar.
  • Skilled in software development lifecycle processes, with experience in scrum, agile, and iterative approaches.
  • Excellent communication skills: demonstrated ability to explain complex technical issues to both technical and non-technical audiences.
  • Knowledge of accounting, billing, and payment processing concepts, plus experience with finance (ERP), billing applications, and payment processors, preferred.

See more jobs at Procore Technologies

Apply for this job

+30d

Staff Data Scientist - Marketing

Square | San Francisco, CA, Remote
Bachelor degree, tableau, airflow, sql, Design, python

Square is hiring a Remote Staff Data Scientist - Marketing

Job Description

The Data Science team at Cash App derives valuable insights from our extremely unique datasets and turns those insights into actions that improve the experience for our customers every day. As a Marketing Data Scientist, you will play a critical role in accelerating Cash App’s growth by creating and improving how we measure the impact of all our marketing efforts.

In this role, you’ll be embedded in our Marketing organization and work closely with marketers, product management, and other cross-functional partners to make effective spend decisions across marketing channels, understand the impact of incentives, and explore new opportunities to enable Cash App to become the top provider of primary banking services to our customers.

You will:

  • Build models to optimize our marketing efforts to ensure our spend has the best possible ROI
  • Design and analyze A/B experiments to evaluate the impact of marketing campaigns we launch
  • Analyze large datasets using SQL and scripting languages to surface actionable insights and opportunities to the Marketing product team and other key stakeholders
  • Partner directly with the Cash App Marketing org to influence their roadmap and define success metrics to understand the impact to the business
  • Approach problems from first principles, using a variety of statistical and mathematical modeling techniques to research and understand customer behavior & segments
  • Build, forecast, and report on metrics that drive strategy and facilitate decision making for key business initiatives
  • Write code to effectively process, cleanse, and combine data sources in unique and useful ways, often resulting in curated ETL datasets that are easily used by the broader team
  • Build and share data visualizations and self-serve dashboards for your partners
  • Effectively communicate your work with team leads and cross-functional stakeholders on a regular basis
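The experiment-analysis work in the bullets above (designing and reading out A/B tests) often comes down to a two-proportion z-test. A minimal, stdlib-only sketch, assuming hypothetical conversion counts; in practice a library such as statsmodels or an internal experimentation platform would be used:

```python
import math

def ab_ztest(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert better than variant A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided p-value
    return z, p_value

# Illustrative numbers: 1,000 users per arm; control converts 200, treatment 260
z, p = ab_ztest(200, 1000, 260, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up counts the lift is clearly significant; a real analysis would also pre-register the metric and check sample-size requirements before reading the result.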

Qualifications

You have:

  • A bachelor’s degree in statistics, data science, or a similar STEM field with 7+ years of experience in a relevant role OR
  • A graduate degree in statistics, data science, or similar STEM field with 5+ years of experience in a relevant role
  • Advanced proficiency with SQL and data visualization tools (e.g. Tableau, Looker, etc)
  • Experience with scripting and data analysis programming languages, such as Python or R
  • Worked extensively with Causal Inference techniques and off platform data
  • A knack for turning ambiguous problems into clear deliverables and actionable insights 
  • Gone deep with cohort and funnel analyses, and a deep understanding of statistical concepts such as selection bias, probability distributions, and conditional probabilities

Technologies we use and teach:

  • SQL, Snowflake, etc.
  • Python (Pandas, Numpy)
  • Tableau, Airflow, Looker, Mode, Prefect

See more jobs at Square

Apply for this job

Genesis is hiring a Remote User Acquisition Specialist (Paid Social) at HolyWater

See more jobs at Genesis

Apply for this job

+30d

Data Integration Engineer (Req #1713)

Clover Health | Remote - USA
Master’s Degree, tableau, airflow, postgres, sql, Design, c++, python

Clover Health is hiring a Remote Data Integration Engineer (Req #1713)

Location: 3401 Mallory Lane, Suite 210, Franklin, TN 37067; telecommuting permissible from any location in the U.S.


Salary Range: $132,974 /yr - $161,250 /yr


Job Description: Create and manage ETL packages, triggers, stored procedures, views, SQL transactions. Develop new secure data feeds with external parties as well as internal applications including the data warehouse and business intelligence applications. Perform analysis and QA. Diagnose ETL and database related issues, perform root cause analysis, and recommend corrective actions to management. Work with a small project team to support the design, development, implementation, monitoring, and maintenance of new ETL programs. Telecommuting is permissible from any location in the US.

Requirements: Bachelor’s degree or foreign degree equivalent in Computer Science, Information Systems or related field and five (5) years of progressive, post-baccalaureate experience in IT development or in the job offered or related role. Alternatively, employer will accept a Master’s degree or foreign equivalent in Computer Science, Information Systems or a related field and two (2) years of experience in IT development or in the job offered or a related role. Any suitable combination of education, experience, or training is acceptable.

Skills: Experience and/or education must include: 

1. Python & Postgres;
2. Snowflake, DBT, Airflow, BigQuery, Data Governance;
3. Analytics, data science through SQL Optimization;
4. Database Design Modeling; and
5. BI and collaboration tools such as Tableau, Mode, and Looker.
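The Python-and-Postgres ETL work this role describes, extracting raw feed rows, cleansing them, and loading them into warehouse tables, can be sketched in a few lines. Here sqlite3 stands in for Postgres, and the feed, table, and column names are invented for illustration:

```python
import sqlite3

# Toy ETL step: extract raw feed rows, cleanse them, load them into a
# warehouse table. sqlite3 stands in for Postgres; names are made up.
raw_feed = [
    {"member_id": "A1", "claim_amount": " 120.50 "},
    {"member_id": "A2", "claim_amount": "85"},
    {"member_id": None, "claim_amount": "40"},   # bad row: missing member id
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (member_id TEXT NOT NULL, claim_amount REAL)")

loaded, rejected = 0, 0
for row in raw_feed:
    if not row["member_id"]:                     # cleanse: reject invalid rows
        rejected += 1
        continue
    conn.execute(
        "INSERT INTO claims VALUES (?, ?)",
        (row["member_id"], float(row["claim_amount"].strip())),
    )
    loaded += 1

total = conn.execute("SELECT SUM(claim_amount) FROM claims").fetchone()[0]
print(loaded, rejected, total)   # 2 loaded, 1 rejected, 205.5 total
```

A production version would use a real Postgres driver (e.g., psycopg2), log the rejected rows for root-cause analysis, and wrap the load in a transaction.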

 

#LI-DNI

See more jobs at Clover Health

Apply for this job

+30d

Data Engineer - Senior 0010ALIS - 151

Global InfoTek, Inc. | Huntsville, AL Remote
agile, jira, airflow, sql, Design, docker, elasticsearch, postgresql, kubernetes, AWS, javascript

Global InfoTek, Inc. is hiring a Remote Data Engineer - Senior 0010ALIS - 151

Clearance Level: TS/SCI

US Citizenship: Required

Job Classification: Full-time

Location: District of Columbia

Experience: 5-7 years

Education: Master’s degree or equivalent experience in a related field.

As a Data Engineer, you will be required to interpret business needs and select appropriate technologies and have experience in implementing data governance of shared and/or master sets of data. You will work with key business stakeholders, IT experts, and subject-matter experts to plan and deliver optimal data solutions. You will create, maintain, and optimize data pipelines as workloads move from development to production for specific use cases to ensure seamless data flow for the use case. You will perform technical and non-technical analyses on project issues and help to ensure technical implementations follow quality assurance metrics. You will analyze data and systems architecture, create designs, and implement information systems solutions.

Responsibilities:

  • Define and communicate a clear product vision for our client’s software products, aligning with user needs and business objectives.
  • Create and manage product roadmaps that reflect both innovation and growth strategies.
  • Partner with a government product owner and a product team of 7-8 FTEs.
  • Develop and design data pipelines to support an end-to-end solution.
  • Develop and maintain artifacts (e.g. schemas, data dictionaries, and transforms related to ETL processes).
  • Integrate data pipelines with AWS cloud services to extract meaningful insights.
  • Manage production data within multiple datasets ensuring fault tolerance and redundancy.
  • Design and develop robust and functional dataflows to support raw data and expected data.
  • Provide Tier 3 technical support for deployed applications and dataflows.
  • Collaborate with the rest of data engineering team to design and launch new features.
  • Coordinate and document dataflows, capabilities, etc.
  • Occasionally (as needed) support off-hours deployments, such as evenings or weekends.

Qualifications:

  • Understanding of cloud architectures and enabling tools and technologies, such as AWS Cloud (GovCloud/C2S).
  • Familiarity with Amazon Web Services (AWS) managed services.
  • Working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar.
  • Proficient experience utilizing JavaScript, Elasticsearch, JSON, SQL, and XML.
  • Working knowledge of datastores: MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, and Redis.
  • Familiarity with Linux/Unix server environments.
  • Experience with Agile development methodology.
  • Publishing and/or presenting design reports.
  • Coordinating with other team members to reach project milestones and deadlines.
  • Working knowledge of collaboration tools such as Jira and Confluence.

Preferred Qualifications:

  • Familiarity and experience with the Intelligence Community (IC) and the intel cycle.
  • Familiarity and experience with the Department of Homeland Security (DHS).
  • Direct experience with DHS and Intelligence Community (IC) components' data architectures and environments (IC-GovCloud experience preferred).
  • Experience with cloud message APIs and usage of push notifications.
  • Keen interest in learning and using the latest software tools, methods, and technologies to solve real-world problem sets vital to national security.
  • Working knowledge of public keys and digital certificates.
  • Experience with DevOps environments.
  • Expertise in various COTS, GOTS, and open-source tools which support development of data integration and visualization applications.
  • Specialization in Object Oriented Programming languages, scripting, and databases.

Global InfoTek, Inc. is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or based on disability.

About Global InfoTek, Inc. Global InfoTek Inc. has an award-winning track record of designing, developing, and deploying best-of-breed technologies that address the nation's pressing cyber and advanced technology needs. GITI has rapidly merged pioneering technologies, operational effectiveness, and best business practices for over two decades.

See more jobs at Global InfoTek, Inc.

Apply for this job

+30d

Senior Analytics Engineer

Remote | Remote - Europasia
airflow, sql, Design, jenkins, python

Remote is hiring a Remote Senior Analytics Engineer

About Remote

Remote is solving global remote organizations’ biggest challenge: employing anyone anywhere compliantly. We make it possible for businesses big and small to employ a global team by handling global payroll, benefits, taxes, and compliance. Check out remote.com/how-it-works to learn more or if you’re interested in adding to the mission, scroll down to apply now.

Please take a look at remote.com/handbook to learn more about our culture and what it is like to work here. Not only do we encourage folks from all ethnic groups, genders, sexualities, ages, and abilities to apply, but we prioritize a sense of belonging. You can check out independent reviews by other candidates on Glassdoor or look up the results of our candidate surveys to see how others feel about working and interviewing here.

All of our positions are fully remote. You do not have to relocate to join us!

What this job can offer you

This is an exciting time to join the growing Data Team at Remote, which today consists of over 15 Data Engineers, Analytics Engineers and Data Analysts spread across 10+ countries. Throughout the team we're focused on driving business value through impactful decision making. We're in a transformative period where we're laying the foundations for scalable company growth across our data platform, which truly serves every part of the Remote business. This team would be a great fit for anyone who loves working collaboratively on challenging data problems, and making an impact with their work. We're using a variety of modern data tooling on the AWS platform, such as Snowflake and dbt, with SQL and python being extensively employed.

What you bring

  • 3+ years of experience in analytics engineering; high-growth tech company experience is a plus
  • Strong experience using data transformation frameworks (e.g. dbt) and data warehouses (e.g. Redshift), and strong proficiency in SQL
  • Strong knowledge of data modelling techniques (Kimball, Data Vault, etc.)
  • Solid experience with data visualization tools (e.g. Metabase)
  • Strong affinity towards well-crafted software: testing, knowledge of best practices, and experience with CI/CD (e.g. Gitlab, Github, Jenkins)
  • A self-starter mentality and the ability to thrive in an unstructured and fast-paced environment
  • Proven collaboration and communication skills
  • Experience dealing with ambiguity, working together with stakeholders to take abstract concepts and turn them into data models that can answer a variety of questions
  • Writes and speaks fluent English
  • Experience working remotely is not required, but is considered a plus

Key Responsibilities

  • dbt Modelling:
    • Design, develop, and maintain dbt (data build tool) models for data transformation and analysis, providing clean and reliable data that enables end users to get accurate and consistent answers by self-serving on BI tools.
    • Collaborate with Data Analysts and Business Stakeholders to understand their reporting and analysis needs and translate them into dbt models.
    • Own our internal dbt conventions and best practices, keeping our code-base clean and efficient (including code reviews for peers).
  • Data Analytics & Monitoring:
    • Ensure data quality and consistency by implementing data testing, validation and cleansing techniques.
    • Implement monitoring solutions to track the health and performance of the data present in our warehouse.
    • Train business users on how to use data visualisation tools.
  • Drive our Culture of Documentation:
    • Create and maintain data documentation & definitions, including data dictionaries and process flows.
    • Collaborate with cross-functional teams, including Data Analysts and Business stakeholders, to understand their data requirements and deliver effective data solutions.
    • Share knowledge and provide guidance to peers, creating an environment that empowers collective growth.

Practicals

  • You'll report to: Engineering Manager - Data
  • Team: Data Engineering
  • Location: Anywhere in the World, but we will prioritise candidates from APAC countries to ensure diversity
  • Start date: As soon as possible

Compensation Philosophy 

Remote's Total Rewards philosophy is to ensure fair, unbiased compensation and fair equity pay along with competitive benefits in all locations in which we operate. We do not agree to or encourage cheap-labor practices and therefore we ensure to pay above in-location rates. We hope to inspire other companies to support global talent-hiring and bring local wealth to developing countries.

At first glance our salary bands seem quite wide - here is some context. At Remote we have international operations and a globally distributed workforce. We use geo ranges to consider geographic pay differentials as part of our global compensation strategy to remain competitive in various markets while hiring globally.

The base salary range for this full-time position is between $42,750 USD to $96,200 USD. Our salary ranges are determined by role, level and location, and our job titles may span more than one career level. The actual base pay for the successful candidate in this role is dependent upon many factors such as location, transferable or job-related skills, work experience, relevant training, business needs, and market demands. The base salary range may be subject to change.

Application process

  1. (async) Profile review
  2. Interview with recruiter
  3. Interview with future manager
  4. (async) Small challenge
  5. (async) Challenge Review
  6. Interview with team members (no managers present)
  7. Prior employment verification check(s)
  8. (async) Offer

#LI-DP

Benefits

Our full benefits & perks are explained in our handbook at remote.com/r/benefits. As a global company, each country works differently, but some benefits/perks are for all Remoters:
  • work from anywhere
  • unlimited personal time off (minimum 4 weeks)
  • quarterly company-wide day off for self care
  • flexible working hours (we are async)
  • 16 weeks paid parental leave
  • mental health support services
  • stock options
  • learning budget
  • home office budget & IT equipment
  • budget for local in-person social events or co-working spaces

How you’ll plan your day (and life)

We work async at Remote which means you can plan your schedule around your life (and not around meetings). Read more at remote.com/async.

You will be empowered to take ownership and be proactive. When in doubt you will default to action instead of waiting. Your life-work balance is important and you will be encouraged to put yourself and your family first, and fit work around your needs.

If that sounds like something you want, apply now!

How to apply

  1. Please fill out the form below and upload your CV in PDF format.
  2. We kindly ask you to submit your application and CV in English, as this is the standardised language we use here at Remote.
  3. If you don’t have an up-to-date CV but you are still interested in talking to us, please feel free to add a copy of your LinkedIn profile instead.

We will ask you to voluntarily tell us your pronouns at the interview stage, and you will have the option to answer our anonymous demographic questionnaire when you apply below. As an equal employment opportunity employer, it’s important to us that our workforce reflects people of all backgrounds, identities, and experiences, and this data will help us to stay accountable. We thank you for providing this data, if you choose to.

See more jobs at Remote

Apply for this job

+30d

Senior Data Engineer (Taiwan)

GOGOX | Remote
airflow, sql, azure, api, java, python, AWS

GOGOX is hiring a Remote Senior Data Engineer (Taiwan)


See more jobs at GOGOX

Apply for this job

+30d

Staff Data Scientist - Controls/Access

Square | San Francisco, CA, Remote
Bachelor degree, tableau, airflow, sql, Design, python

Square is hiring a Remote Staff Data Scientist - Controls/Access

Job Description

The Data Science team at Cash App derives valuable insights from our extremely unique datasets and turns those insights into actions that improve the experience for our customers every day. In this role, you’ll be embedded in our Health organization and work closely with product management as well as other cross-functional partners to drive meaningful change that helps protect our customers and their money. Because our Health DS team plays such a critical role in building and maintaining trust with our users, an appreciation for the connection between your work and the experience it delivers to customers is absolutely critical for this position.

As a Data Scientist, you will:

  • Partner directly with the Cash App Health org, working closely with operations, engineers, legal and compliance, and machine learning teams
  • Analyze large datasets using SQL and scripting languages to surface actionable insights and opportunities to the product team and other key stakeholders
  • Approach problems from first principles, using a variety of statistical and mathematical modeling techniques to research and understand customer behavior
  • Design and analyze A/B experiments to evaluate the impact of changes we make to the product
  • Work with engineers to log new, useful data sources as we build new product features
  • Build, forecast, and report on metrics that drive strategy and facilitate decision making for key business initiatives
  • Write code to effectively process, cleanse, and combine data sources in unique and useful ways, often resulting in curated ETL datasets that are easily used by the broader team
  • Build and share data visualizations and self-serve dashboards for your partners
  • Effectively communicate your work with team leads and cross-functional stakeholders on a regular basis

Qualifications

You have:

  • Previous exposure to or interest in areas like anomaly detection or regulatory data science
  • A bachelor’s degree in statistics, data science, or a similar STEM field with 4+ years of experience in a relevant role OR
  • A graduate degree in statistics, data science, or a similar STEM field with 2+ years of experience in a relevant role
  • Advanced proficiency with SQL and data visualization tools (e.g. Tableau, Looker, etc)
  • Experience with scripting and data analysis programming languages, such as Python or R
  • Gone deep with cohort and funnel analyses, and a deep understanding of statistical concepts such as selection bias, probability distributions, and conditional probabilities

Technologies we use and teach:

  • SQL, Snowflake, etc.
  • Python (Pandas, Numpy)
  • Tableau, Airflow, Looker, Mode, Prefect

See more jobs at Square

Apply for this job

+30d

Senior Data Engineer

EquipmentShare | Remote; Chicago; Denver; Kansas City; Columbia MO
agile, airflow, sql, Design, c++, postgresql, python, AWS

EquipmentShare is hiring a Remote Senior Data Engineer

EquipmentShare is Hiring a Senior Data Engineer.

Your role in our team

At EquipmentShare, we believe it’s more than just a job. We invest in our people and encourage you to choose the best path for your career. It’s truly about you, your future, and where you want to go.

We are looking for a Senior Data Engineer to help us continue to build the next evolution of our data platform in a scalable, performant, and customer-centric architecture.

Our main tech stack includes Snowflake, Apache Airflow, AWS cloud infrastructure (e.g., Kinesis, Kubernetes/EKS, Lambda, Aurora RDS PostgreSQL), Python, and TypeScript.

What you'll be doing

We are typically organized into agile cross-functional teams composed of Engineering, Product, and Design, which allows us to develop deep expertise and rapidly deliver high-value features and functionality to our next-generation T3 Platform.

You’ll be part of a close-knit team of data engineers developing and maintaining a data platform built with automation and self-service in mind to support analytics and machine learning data products for the next generation of our T3 Fleet, which enables end-users to track, monitor, and manage the health of their connected vehicles and deployed assets.

We'll be there to support you as you become familiar with our teams, product domains, tech stack and processes — generally how we all work together.

Primary responsibilities for a Senior Data Engineer

  • Collaborate with Product Managers, Designers, Engineers, Data Scientists and Data Analysts to take ideas from concept to production at scale.
  • Design, build and maintain our data platform to enable automation and self-service for data scientists, machine learning engineers and analysts.
  • Design, build and maintain data product framework to support EquipmentShare application data science and analytics features.
  • Design, build and maintain CI/CD pipelines and automated data and machine learning deployment processes.
  • Develop data monitoring and alerting capabilities.
  • Document architecture, processes and procedures for knowledge sharing and cross-team collaboration.
  • Mentor peers to help them build their skills.

Why We’re a Better Place to Work

We can promise that every day will be a little different with new ideas, challenges and rewards.

We’ve been growing as a team and we are not finished just yet; there is plenty of opportunity to shape how we deliver together.

Our mission is to enable the construction industry with tools that unlock substantial increases to productivity. Together with our team and customers, we are building the future of construction.

T3 is the only cloud-based operating system that brings together construction workflows & data from constantly moving elements in one place.

  • Competitive base salary and market leading equity package.
  • Unlimited PTO.
  • Remote first.
  • True work/life balance.
  • Medical, Dental, Vision and Life Insurance coverage.
  • 401(k) + match.
  • Opportunities for career and professional development with conferences, events, seminars and continued education.
  • On-site fitness center at the Home Office in Columbia, Missouri, complete with weightlifting machines, cardio equipment, group fitness space, racquetball courts, a climbing wall, and much more!
  • Volunteering and local charity support that help you nurture and grow the communities you call home through our Giving Back initiative.
  • Stocked breakroom and full kitchen with breakfast and lunch provided daily by our chef and kitchen crew.

About You

You're a hands-on developer who enjoys solving complex problems and building impactful solutions.  Most importantly, you care about making a difference.

  • Take the initiative to own outcomes from start to finish — knowing what needs to be accomplished within your domain and how we work together to deliver the best solution.
  • You are passionate about developing your craft — you understand what it takes to build quality, robust and scalable solutions.
  • You’ll see the learning opportunity when things don’t quite go to plan — not only for you but for how we continuously improve as a team.
  • You take a hypothesis-driven approach — knowing how to source, create and leverage data to inform decision making, using data to drive how we improve, to shape how we evaluate and make platform recommendations.

So, what is important to us?

Above all, you’ll get stuff done. More importantly, you’ll collaborate to do the right things in the right way to achieve the right outcomes.

  • 7+ years of relevant data platform development experience building production-grade solutions.
  • Proficient with SQL and a high-order object-oriented language (e.g., Python).
  • Experience with designing and building distributed data architecture.
  • Experience building and managing production-grade data pipelines using tools such as Airflow, dbt, DataHub, and MLflow.
  • Experience building and managing production-grade data platforms using distributed systems such as Kafka, Spark, Flink and/or others.
  • Familiarity with event data streaming at scale.
  • Proven track record learning new technologies and applying that learning quickly.
  • Experience building observability and monitoring into data products. 
  • Motivated to identify opportunities for automation to reduce manual toil.

EquipmentShare is committed to a diverse and inclusive workplace. EquipmentShare is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.

 

#LI-Remote

 

See more jobs at EquipmentShare

Apply for this job

+30d

Google Pillar | Mid/Senior Data Engineer

Devoteam | Lisbon, Portugal, Remote
Bachelor degree, airflow, sql, java, python

Devoteam is hiring a Remote Google Pillar | Mid/Senior Data Engineer

Job Description

Devoteam G Cloud is looking for Google Cloud Data Engineers to join our Google Cloud Platform specialists.

  • Delivery of Data projects more focused on the Engineering component;
  • Working with GCP Data Services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub and Dataplex;
  • Write efficient SQL queries;
  • Develop data processing pipelines using programming frameworks like Apache Beam;
  • Automate data engineering tasks;
  • Building and managing data pipelines, with a deep understanding of workflow orchestration, task scheduling, and dependency management;
  • Data Integration and Streaming, including data ingestion from various sources (such as databases, APIs, or logs) into GCP.

Qualifications

  • Bachelor degree in IT or similar;
  • More than 3 years of professional experience, with expertise in the delivery of Data Engineering projects;
  • GCP Data Services: BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub and Dataplex;
  • Knowledge of programming languages: Python, Java, or SQL;
  • Experience with tools like Apache Airflow, Google Cloud Composer, or Cloud Data Fusion;
  • Knowledge of streaming data processing using tools like Apache Kafka;
  • GCP Certifications: Professional Data Engineer or Professional Cloud Database Engineer and/or Associate Cloud Engineer (nice to have);
  • Proficiency in English (written and spoken).

See more jobs at Devoteam

Apply for this job

+30d

Data Engineer 3

agile, scala, airflow, sql, oracle, Design, azure, git, c++, mysql, python

Blueprint Technologies is hiring a Remote Data Engineer 3

Who is Blueprint?

We are a technology solutions firm headquartered in Bellevue, Washington, with a strong presence across the United States. Unified by a shared passion for solving complicated problems, our people are our greatest asset. We use technology as a tool to bridge the gap between strategy and execution, powered by the knowledge, skills, and the expertise of our teams, who all have unique perspectives and years of experience across multiple industries. We’re bold, smart, agile, and fun.

What does Blueprint do?

Blueprint helps organizations unlock value from existing assets by leveraging cutting-edge technology to create additional revenue streams and new lines of business. We connect strategy, business solutions, products, and services to transform and grow companies.

Why Blueprint?

At Blueprint, we believe in the power of possibility and are passionate about bringing it to life. Whether you join our bustling product division, our multifaceted services team or you want to grow your career in human resources, your ability to make an impact is amplified when you join one of our teams. You’ll focus on solving unique business problems while gaining hands-on experience with the world’s best technology. We believe in unique perspectives and build teams of people with diverse skillsets and backgrounds. At Blueprint, you’ll have the opportunity to work with multiple clients and teams, such as data science and product development, all while learning, growing, and developing new solutions. We guarantee you won’t find a better place to work and thrive than at Blueprint.

What will I be doing?

Job Summary:

The Data Engineer’s responsibilities include designing, developing, and deploying data integration (ETL and/or ELT) solutions using agreed-upon design patterns and technologies; working with a wide variety of data sources, including JSON, CSV, Oracle, SQL Server, Azure Synapse, Azure Analysis Services, Azure SQL DB, Data Lake, and PolyBase; and handling streaming data sets.

 

Supervisory Responsibilities:

  • None

 

Duties/Responsibilities:

  • Create workflows, templates, and design patterns
  • Communicate with stakeholders to obtain accurate business requirements
  • Create and perform unit tests for solutions
  • Convert existing SSIS packages into Azure Data Factory pipelines
  • Perform other related duties as assigned

 

Required Skills/Abilities:

  • Familiarity with SQL fundamentals
  • Basic understanding of Python, R, or Scala
  • Awareness of distributed/parallel computing with Python, Synapse, or Snowflake
  • Familiarity with modeling tools such as ERWin, DBeaver, Lucid, or Visio
  • Awareness of Git for version control of code repositories
  • Awareness of RDBMS development tools: SQL Enterprise Manager, Visual Studio, Azure Data Studio
  • Awareness of open-source database platforms such as MySQL
  • Awareness of Big Data frameworks such as PySpark, Hadoop, etc.
  • Familiarity with Modern Data Estate patterns: Source to Raw to Stage to Curated
  • Familiarity with Databricks concepts: batch, streaming, Auto Loader, etc.
  • Familiarity with cloud diagnostics, logging, and performance monitoring/tuning
  • Familiarity with data shoveling tools: ADF, Fivetran, Airflow, etc.

  • Familiarity with database I/O: writing/reading structured and unstructured DBs
  • Debugging, documentation, testing, and optimization skills
  • Ability to convert business needs into technical requirements
  • Experience explaining data engineering concepts to business stakeholders
  • Experience communicating and collaborating effectively with a remote team
  • Experience communicating effectively with interdisciplinary teams of various technical skill levels
  • Ability to work effectively and deliver value in ambiguous settings
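
The "Source to Raw to Stage to Curated" pattern named in the skills above can be sketched with plain Python over dicts. The layer functions, field names, and business rule here are hypothetical illustrations of the layering idea, not Blueprint's implementation; in practice each layer would be a Databricks or ADF job writing to its own storage zone:

```python
import json

def to_raw(source_rows):
    """Raw layer: land source data as-is, tagged with provenance."""
    return [{"payload": json.dumps(r), "source": "orders_api"} for r in source_rows]

def to_stage(raw_rows):
    """Stage layer: parse payloads and standardize types and field names."""
    rows = [json.loads(r["payload"]) for r in raw_rows]
    return [{"order_id": int(r["id"]), "amount": float(r["amt"])} for r in rows]

def to_curated(stage_rows):
    """Curated layer: apply business rules (here: drop non-positive amounts)."""
    return [r for r in stage_rows if r["amount"] > 0]
```

Feeding `[{"id": "1", "amt": "9.99"}, {"id": "2", "amt": "-1"}]` through all three layers leaves only the first order in the curated set, with typed columns ready for consumption.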

Education and Experience:

  • Bachelor’s degree in Computer Science, Industrial Engineering, Business Analytics, or equivalent.
  • 3+ years of broad-based IT experience with technical knowledge of data Integration (ETL, ELT) technologies and approaches, Data Warehousing, Data Lake Methodologies.
  • 3+ years’ experience with SQL Server. Expert level TSQL knowledge required.
  • 3+ years’ experience designing and implementing scalable ETL processes including data movement (SSIS, replication, etc.) and quality tools.
  • 2+ years’ experience building cloud hosted data systems. Azure preferred.
  • 2+ years’ experience with SQL Server Analysis Services (SSAS)
  • 2+ years’ experience with SQL Server Integration Services (SSIS)

 

Physical Requirements:

  • The employee is frequently required to sit at a workstation for extended and long periods of time. The employee will occasionally walk; will frequently use hands to finger, handle, grasp or feel; and reach with hands, wrists, or arms in repetitive motions.
  • The employee will frequently use fingers for manipulation of computers (laptop and desktops) and telephone equipment including continuous 10-key, handwriting, use of mouse (or alternative input device), use of keyboard (or alternative input device), or sporadic 10-Key, telephone, or telephonic headsets. This position will also frequently use other office productivity tools such as the printer/scanner.
  • Role requires the ability to lift, carry, push, pull and/or move up to 10lbs on a frequent basis and may require twisting actions of the upper body (i.e., Picking up, carrying a laptop – and twist to work on an L shape desk).
  • Specific vision abilities required by this job include close vision, distance vision, peripheral vision, depth perception, and ability to adjust focus. This position requires frequent use of a computer monitor and visual acuity to perform email responses, prepare and analyze data; transcribe; extensive reading and online communication.
  • Role requires being able to hear and use verbal communication for interactions with internal clients and dependent on role with external clients via conference calls.

 

Cognitive Ability Requirements: The employee must have the ability to:

  • Works with others (co-workers, professionals, public, customers, clients)
  • Works professionally in alignment with the organization’s code of conduct
  • Interact face to face with others (co-workers, superiors)
  • Constant verbal and email communication with others (co-workers, supervisors, vendors, client, customers etc.) to exchange information
  • Ability to take constructive feedback and show courtesy to co-workers, professionals, public, customers, clients
  • Make quick, accurate decisions without supervision
  • Evaluate or make decisions based on experience or knowledge
  • Divide attention between issues requiring multi-tasking
  • Use judgment on routine matters
  • Distinguish situations requiring judgment and adaptation of procedures from one task to another
  • Adapt to tightly scheduled and hurried pace of work activities
  • Meet frequent project deadlines
  • Organize own work
  • Ask questions or request assistance when needed
  • Follow instructions received both orally and in writing

Work Environment:

  • The work environment is usually a traditional office, indoor setting with no exposure to outside elements
  • This position requires no travel    
  • The employee will frequently be required to work closely with others and occasionally work alone
  • This position may require a work schedule across weekends and holidays
  • This position is subject to blackout dates which may include holidays where PTO is not approved
  • May work remotely based on adherence to the organizations work from home policy
  • Reasonable accommodations may be made to enable individuals with disabilities to perform the job

Salary Range

Pay ranges vary based on multiple factors including, without limitation, skill sets, education, responsibilities, experience, and geographical market. The pay range for this position reflects geographic based ranges for Washington state: $88,300 - $115,300 USD/annually. The salary/wage and job title for this opening will be based on the selected candidate’s qualifications and experience and may be outside this range.

Equal Opportunity Employer

Blueprint Technologies, LLC is an equal employment opportunity employer. Qualified applicants are considered without regard to race, color, age, disability, sex, gender identity or expression, orientation, veteran/military status, religion, national origin, ancestry, marital, or familial status, genetic information, citizenship, or any other status protected by law.

If you need assistance or a reasonable accommodation to complete the application process, please reach out to: recruiting@bpcs.com

Blueprint believes in the importance of a healthy and happy team, which is why our comprehensive benefits package includes:

  • Medical, dental, and vision coverage
  • Flexible Spending Account
  • 401k program
  • Competitive PTO offerings
  • Parental Leave
  • Opportunities for professional growth and development

 

Location: Remote - USA

See more jobs at Blueprint Technologies

Apply for this job

+30d

Data Engineering Development Manager

agile, scala, airflow, sql, Design, azure, git, c++, python, AWS

Blueprint Technologies is hiring a Remote Data Engineering Development Manager

Who is Blueprint?

We are a technology solutions firm headquartered in Bellevue, Washington, with a strong presence across the United States. Unified by a shared passion for solving complicated problems, our people are our greatest asset. We use technology as a tool to bridge the gap between strategy and execution, powered by the knowledge, skills, and the expertise of our teams, who all have unique perspectives and years of experience across multiple industries. We’re bold, smart, agile, and fun.

What does Blueprint do?

Blueprint helps organizations unlock value from existing assets by leveraging cutting-edge technology to create additional revenue streams and new lines of business. We connect strategy, business solutions, products, and services to transform and grow companies.

Why Blueprint?

At Blueprint, we believe in the power of possibility and are passionate about bringing it to life. Whether you join our bustling product division, our multifaceted services team or you want to grow your career in human resources, your ability to make an impact is amplified when you join one of our teams. You’ll focus on solving unique business problems while gaining hands-on experience with the world’s best technology. We believe in unique perspectives and build teams of people with diverse skillsets and backgrounds. At Blueprint, you’ll have the opportunity to work with multiple clients and teams, such as data science and product development, all while learning, growing, and developing new solutions. We guarantee you won’t find a better place to work and thrive than at Blueprint.

We are looking for a Data Engineering Development Manager to join us as we build cutting-edge technology solutions!  This is your opportunity to be part of a team that is committed to delivering best in class service to our customers.

 In this role, you will lead and mentor a remote team of highly skilled data engineers, overseeing their development plans and performance reviews. Beyond managerial responsibilities, this role demands a hands-on approach, with an expected 50% involvement in coding for Data Engineer projects. This hands-on engagement serves as a model for your team, showcasing your commitment to technical excellence. Join us on this exciting journey where your leadership and technical acumen will play a vital role in shaping the success of our technology solutions.

Responsibilities:

Supervisory Responsibilities

  • Interview, hire, and train new staff.
  • Oversee the training and development of the Data Engineer team.
  • Provide constructive and timely performance evaluations.
  • Handle discipline and termination of employees in accordance with company policy.

Duties/Responsibilities

  • Make high-level architecture decisions and execute them, with the ability to explain and defend those decisions to all stakeholders, both internal and external.
  • Plan and execute successful complex technical projects in an Agile process.
  • Model, query, optimize, and analyze large, business-critical datasets.
  • Host design and code reviews.
  • Collaborate on project plans, deliverables, and timeline estimates for Data Engineer projects.
  • Identify resourcing requirements for Data Engineer projects.
  • Participate in all stages of implementation, from early brainstorming to design, coding, and bug fixing.
  • Evaluate and identify use cases for new technologies.
  • Drive vision and alignment internally and across external stakeholders.
  • Comfortably speak to patterns and best practices for Data Engineering teams.

Qualifications:

Technical Skills Foundation

  • Proficient in Python, R, or Scala with a fundamental understanding of their use.
  • Skilled with Cloud technologies such as Azure, AWS, GCP, Snowflake.
  • Skilled with Big Data frameworks such as PySpark, Hadoop, etc.
  • Skilled with Open-source database platforms, particularly MySQL.
  • Skilled with Git for version control of code repositories.
  • Proficient in modeling tools such as ERWin, DBeaver, Lucid, SQLDBM, or Visio.
  • Skilled with RDBMS Development tools: SQL Enterprise Manager, Visual Studio, Azure Data Studio.

Data Processing and Management

  • Skilled with Modern Data Estate patterns: Medallion architecture.
  • Skilled with Databricks concepts: batch, streaming, autoloader, etc.
  • Skilled with Cloud diagnostics, logging, and performance monitoring/tuning.
  • Skilled with understanding data shoveling tools: ADF, Fivetran, Airflow, etc.
  • Skilled with Data Governance concepts and tools.
  • Skilled with data rule and business rule application (schema checks vs. Great Expectations).
  • Skilled with CI/CD.
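
The distinction the list above draws between data rules (schema) and business rules (Great Expectations-style expectations) can be sketched in a few lines of plain Python. The column names, types, and the positive-amount rule are hypothetical examples, not a real ruleset:

```python
def check_schema(row, schema):
    """Data rule: every expected column is present with the right type."""
    return all(isinstance(row.get(col), typ) for col, typ in schema.items())

def check_business(row):
    """Business rule (hypothetical example): amounts must be positive."""
    return row["amount"] > 0

def validate(rows, schema):
    """Split rows into passing and failing, expectation-suite style."""
    passed, failed = [], []
    for row in rows:
        ok = check_schema(row, schema) and check_business(row)
        (passed if ok else failed).append(row)
    return passed, failed
```

Libraries like Great Expectations generalize this pattern: expectations are declared once, evaluated per batch, and the failing rows (plus summary stats) feed monitoring and alerting.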

Advanced Data Engineering

  • Expert in Data wrangling skills with csv, tsv, parquet, and json files.
  • Expert in Database I/O skills -- writing/reading structured and unstructured DBs.
  • Expert in Debugging, documentation, testing, and optimization skills.
  • Expert in explaining DE concepts to business stakeholders.
  • Skilled with providing hands-on-code support for blocked/struggling team members.

Databricks Expertise

  • Proficiency in configuring and fine-tuning Databricks settings for optimal performance and resource utilization.
  • Experience in managing the Unity Catalog within Databricks, ensuring efficient organization and retrieval of metadata.
  • Competency in implementing robust access control measures within Databricks to safeguard data and maintain compliance.
  • Expertise in scheduling and monitoring jobs within Databricks, ensuring timely and accurate execution.
  • Proficiency in configuring security settings within Databricks to protect sensitive data and maintain a secure environment.

Leadership and Collaboration

  • Strong people skills, ability to manage multiple tasks and projects, and operate within ambiguity.
  • Skilled in working effectively and delivering value in ambiguous settings.
  • Skilled in communicating and collaborating effectively with a remote team.
  • Skilled in communicating effectively with interdisciplinary teams of various technical skill levels.
  • Expert in communicating effectively with leadership and executives.
  • Expert in defining incremental deliverables to deliver value quickly and iterate.
  • Expert in prioritizing new projects/features in accordance with LOE and potential value.
  • Skilled in establishing short and long-term vision/goals for the team.
  • Skilled in establishing policies and principles for the team.
  • Expert in converting business needs into technical requirements.
  • Skilled in mentoring other Data Engineers.
  • Skilled with interviewing and selecting new team members according to the needs of the team.
  • Skilled in working with internal groups such as Marketing and Sales on collaborative strategies.
  • Skilled in contributing to presales conversations with prospective clients.

Preferred Qualifications:

  • Experience with Azure required; AWS strongly preferred.

Salary Range

Pay ranges vary based on multiple factors including, without limitation, skill sets, education, responsibilities, experience, and geographical market. The pay range for this position reflects geographic based ranges for Washington state: $164,900 to $207,200 USD/annually. The salary/wage and job title for this opening will be based on the selected candidate’s qualifications and experience and may be outside this range.

Equal Opportunity Employer

Blueprint Technologies, LLC is an equal employment opportunity employer. Qualified applicants are considered without regard to race, color, age, disability, sex, gender identity or expression, orientation, veteran/military status, religion, national origin, ancestry, marital, or familial status, genetic information, citizenship, or any other status protected by law.

If you need assistance or a reasonable accommodation to complete the application process, please reach out to: recruiting@bpcs.com

Blueprint believes in the importance of a healthy and happy team, which is why our comprehensive benefits package includes:

  • Medical, dental, and vision coverage
  • Flexible Spending Account
  • 401k program
  • Competitive PTO offerings
  • Parental Leave
  • Opportunities for professional growth and development

Location: Remote

See more jobs at Blueprint Technologies

Apply for this job

+30d

Data Engineer - Senior 0010 ALITSS - 151

Global InfoTek, Inc. | Huntsville, AL, Remote
agile, jira, airflow, sql, Design, docker, elasticsearch, postgresql, kubernetes, AWS, javascript

Global InfoTek, Inc. is hiring a Remote Data Engineer - Senior 0010 ALITSS - 151

Clearance Level: TS/SCI

US Citizenship: Required

Job Classification: Full-time

Location: District of Columbia

Experience: 5-7 years

Education: Masters or equivalent experience in a related field.

As a Data Engineer, you will be required to interpret business needs and select appropriate technologies and have experience in implementing data governance of shared and/or master sets of data. You will work with key business stakeholders, IT experts, and subject-matter experts to plan and deliver optimal data solutions. You will create, maintain, and optimize data pipelines as workloads move from development to production for specific use cases to ensure seamless data flow for the use case. You will perform technical and non-technical analyses on project issues and help to ensure technical implementations follow quality assurance metrics. You will analyze data and systems architecture, create designs, and implement information systems solutions.

Responsibilities:

  • Define and communicate a clear product vision for our client’s software products, aligning with user needs and business objectives.
  • Create and manage product roadmaps that reflect both innovation and growth strategies.
  • Partner with a government product owner and a product team of 7-8 FTEs.
  • Develop and design data pipelines to support an end-to-end solution.
  • Develop and maintain artifacts (e.g. schemas, data dictionaries, and transforms related to ETL processes).
  • Integrate data pipelines with AWS cloud services to extract meaningful insights.
  • Manage production data within multiple datasets ensuring fault tolerance and redundancy.
  • Design and develop robust and functional dataflows to support raw data and expected data.
  • Provide Tier 3 technical support for deployed applications and dataflows.
  • Collaborate with the rest of data engineering team to design and launch new features.
  • Coordinate and document dataflows, capabilities, etc.
  • Occasionally (as needed) support off-hours deployments, such as evenings or weekends.

Qualifications:

  • Understanding of cloud architectures and enabling tools and technologies, such as AWS Cloud (GovCloud/C2S).
  • Familiarity with Amazon Web Services (AWS) managed services.
  • Working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar.
  • Proficient experience utilizing JavaScript, Elasticsearch, JSON, SQL, and XML.
  • Working knowledge of datastores: MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, and Redis.
  • Familiarity with Linux/Unix server environments.
  • Experience with Agile development methodology.
  • Experience publishing and/or presenting design reports.
  • Experience coordinating with other team members to reach project milestones and deadlines.
  • Working knowledge of collaboration tools such as Jira and Confluence.

Preferred Qualifications:

  • Familiarity and experience with the Intelligence Community (IC), and the intel cycle.
  • Familiarity and experience with the Department of Homeland Security (DHS).
  • Direct experience with DHS and Intelligence Community (IC) components' data architectures and environments (IC-GovCloud experience preferred).
  • Experience with cloud message APIs and usage of push notifications.
  • Keen interest in learning and using the latest software tools, methods, and technologies to solve real world problem sets vital to national security.
  • Working knowledge with public keys and digital certificates.
  • Experience with DevOps environments.
  • Expertise in various COTS, GOTS, and open-source tools which support development of data integration and visualization applications.
  • Specialization in Object Oriented Programming languages, scripting, and databases.

Global InfoTek, Inc. is an equal-opportunity employer. All qualified applicants will receive consideration for employment regardless of race, color, religion, sex, sexual orientation, gender identity, or national origin.

About Global InfoTek, Inc. Reston, VA-based Global InfoTek Inc. is a woman-owned small business with an award-winning track record of designing, developing, and deploying best-of-breed technologies that address the nation’s pressing cyber and advanced technology needs. For more than two decades, GITI has merged pioneering technologies, operational effectiveness, and best business practices to rapidly.

See more jobs at Global InfoTek, Inc.

Apply for this job

+30d

Sr. Data Engineer - AWS & Databricks

agile, 5 years of experience, scala, airflow, Design, c++, python, AWS

Blueprint Technologies is hiring a Remote Sr. Data Engineer - AWS & Databricks

Who is Blueprint?

We are a technology solutions firm headquartered in Bellevue, Washington, with a strong presence across the United States. Unified by a shared passion for solving complicated problems, our people are our greatest asset. We use technology as a tool to bridge the gap between strategy and execution, powered by the knowledge, skills, and the expertise of our teams, who all have unique perspectives and years of experience across multiple industries. We’re bold, smart, agile, and fun.

What does Blueprint do?

Blueprint helps organizations unlock value from existing assets by leveraging cutting-edge technology to create additional revenue streams and new lines of business. We connect strategy, business solutions, products, and services to transform and grow companies.

Why Blueprint?

At Blueprint, we believe in the power of possibility and are passionate about bringing it to life. Whether you join our bustling product division, our multifaceted services team or you want to grow your career in human resources, your ability to make an impact is amplified when you join one of our teams. You’ll focus on solving unique business problems while gaining hands-on experience with the world’s best technology. We believe in unique perspectives and build teams of people with diverse skillsets and backgrounds. At Blueprint, you’ll have the opportunity to work with multiple clients and teams, such as data science and product development, all while learning, growing, and developing new solutions. We guarantee you won’t find a better place to work and thrive than at Blueprint.

We are looking for a Sr. Data Engineer – AWS & Databricks to join us as we build cutting-edge technology solutions!  This is your opportunity to be part of a team that is committed to delivering best in class service to our customers.

In this role, you will play a crucial role in designing, developing, and maintaining robust data infrastructure solutions, ensuring the efficient and reliable flow of data across our organization. If you are passionate about data engineering, have a strong background in AWS and Databricks, and thrive in a collaborative and innovative environment, we want to hear from you.

Responsibilities:

  • Design, implement, and maintain scalable data architectures that support our client’s data processing and analysis needs.
  • Collaborate with cross-functional teams to understand data requirements and translate them into efficient and effective data pipeline solutions.
  • Develop, optimize, and maintain ETL (Extract, Transform, Load) processes to ensure the timely and accurate movement of data across systems.
  • Implement best practices for data pipeline orchestration and automation using tools like Apache Airflow.
  • Leverage AWS services, such as S3, Redshift, Glue, EMR, and Lambda, to build and optimize data solutions.
  • Utilize Databricks for big data processing, analytics, and machine learning workflows.
  • Implement data quality checks and ensure the integrity and accuracy of data throughout the entire data lifecycle.
  • Establish and enforce data governance policies and procedures.
  • Optimize data processing and query performance for large-scale datasets within AWS and Databricks environments.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and provide the necessary infrastructure.
  • Document data engineering processes, architecture, and configurations.
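
"Timely and accurate movement of data across systems", as the responsibilities above put it, usually means making extracts resilient to transient failures. A minimal sketch of retry with exponential backoff in plain Python (the policy defaults are hypothetical; `fetch` stands in for any S3 read, API call, or JDBC query, and tools like Airflow expose equivalent per-task retry settings):

```python
import time

def extract_with_retry(fetch, max_attempts=4, base_delay=0.1):
    """Retry a flaky extract with exponential backoff before giving up.

    fetch: any zero-argument callable (e.g. an S3 read or API call).
    Delays grow as base_delay * 2**(attempt - 1) between attempts.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except Exception:
            if attempt == max_attempts:
                raise  # exhausted retries: surface the real error
            time.sleep(base_delay * 2 ** (attempt - 1))
```

Wrapping each extract step this way keeps a pipeline fault-tolerant against momentary network or service hiccups without masking persistent failures.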

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 5 years of experience in data engineering roles, with a focus on AWS and Databricks.
  • Proven expertise in AWS services (S3, Redshift, Glue, EMR, Lambda) and Databricks.
  • Strong programming skills in languages such as Python, Scala, or Java.
  • Experience with data modeling, schema design, and database optimization.
  • Proficiency in using data pipeline orchestration tools (e.g., Apache Airflow).
  • Familiarity with version control systems and collaboration tools.
  • Ability to troubleshoot complex data issues and implement effective solutions.
  • Strong communication and interpersonal skills.
  • Ability to work collaboratively in a team-oriented environment.
  • Proactive in staying updated with industry trends and emerging technologies in data engineering.

Salary Range

Pay ranges vary based on multiple factors including, without limitation, skill sets, education, responsibilities, experience, and geographical market. The pay range for this position reflects geographic based ranges for Washington state: $146,400 to $175,100 USD/annually. The salary/wage and job title for this opening will be based on the selected candidate’s qualifications and experience and may be outside this range.

Equal Opportunity Employer

Blueprint Technologies, LLC is an equal employment opportunity employer. Qualified applicants are considered without regard to race, color, age, disability, sex, gender identity or expression, orientation, veteran/military status, religion, national origin, ancestry, marital, or familial status, genetic information, citizenship, or any other status protected by law.

If you need assistance or a reasonable accommodation to complete the application process, please reach out to: recruiting@bpcs.com

Blueprint believes in the importance of a healthy and happy team, which is why our comprehensive benefits package includes:

  • Medical, dental, and vision coverage
  • Flexible Spending Account
  • 401k program
  • Competitive PTO offerings
  • Parental Leave
  • Opportunities for professional growth and development

Location: Remote

See more jobs at Blueprint Technologies

Apply for this job

+30d

Machine Learning Engineer - Recommender Systems (All Genders)

Dailymotion | Paris, France, Remote
airflow, Design, docker, python

Dailymotion is hiring a Remote Machine Learning Engineer - Recommender Systems (All Genders)

Job Description

Joining the Dailymotion data team means taking part in the creation of our unique algorithms, designed to bring more diversity and nuance to online conversations.

Our Machine Learning team, established in 2016, has been actively involved in developing models across a diverse range of topics. Primarily, we focus on recommender systems, and extend our expertise to content classification, moderation, and search functionalities.

You will be joining a seasoned and diverse team of Senior Machine Learning Engineers, who possess the capability to independently conceptualize, deploy, A/B test, and monitor their models.

We collaborate closely with the Data Product Team, aligning our efforts to make impactful, data-driven decisions for our users.

Learn more about our ongoing projects: https://medium.com/dailymotion

As a Machine Learning Engineer, you will:

  • Design and deploy scalable recommender systems, handling billions of user interactions and hundreds of millions of videos.
  • Contribute to various projects spanning machine learning domains, encompassing content classification, moderation, and ad-tech.
  • Foster autonomy, taking ownership of your scope, and actively contribute ideas and solutions. Maintain and monitor your models in production.
  • Collaborate with cross-functional teams throughout the entire machine learning model development cycle:
    • Define success metrics in collaboration with stakeholders.
    • Engage in data collection and hypothesis selection with the support of the Data Analysts Team.
    • Conduct machine learning experiments, including feature engineering, model selection, offline validation, and A/B Testing.
    • Manage deployment, orchestration, and maintenance on cloud platforms with the Data Engineering Team.
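
The collaborative-filtering core of a recommender system like the one described above can be sketched in plain Python: item-item cosine similarity over user interaction histories. The user and video IDs are hypothetical, and production systems at this scale use learned embeddings and approximate nearest-neighbor search rather than exact pairwise similarity:

```python
from collections import defaultdict
from math import sqrt

def item_similarity(interactions):
    """Item-item cosine similarity from (user, item) interaction pairs."""
    users_by_item = defaultdict(set)
    for user, item in interactions:
        users_by_item[item].add(user)
    sims = {}
    items = list(users_by_item)
    for i, a in enumerate(items):
        for b in items[i + 1:]:
            overlap = len(users_by_item[a] & users_by_item[b])
            if overlap:
                sim = overlap / sqrt(len(users_by_item[a]) * len(users_by_item[b]))
                sims[(a, b)] = sims[(b, a)] = sim
    return sims

def recommend(user, interactions, sims, k=3):
    """Score unseen items by similarity to the user's watch history."""
    seen = {item for u, item in interactions if u == user}
    scores = defaultdict(float)
    for item in seen:
        for (a, b), sim in sims.items():
            if a == item and b not in seen:
                scores[b] += sim
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

With a handful of watch events, `recommend` surfaces the videos most co-watched with a user's history; the offline-validation and A/B-testing steps in the list above then measure whether those rankings actually improve the chosen success metric.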

Qualifications

  • Master's degree/PhD in Machine Learning, Computer Science, or a related quantitative field.
  • At least 1 year of professional experience working with machine learning models at scale (experience with recommender systems is a plus).
  • Proficient in machine learning concepts, with the ability to articulate theoretical concepts effectively.
  • Strong coding skills in Python & SQL.
  • Experience in building production ML systems; familiarity with technologies such as GitHub, Docker, Airflow, or equivalent services provided by GCP/AWS/Azure.
  • Experience with distributed frameworks is advantageous (Dataflow, Spark, etc.).
  • Strong business acumen and excellent communication skills in both English and French (fluent proficiency).
  • Demonstrated aptitude for autonomy and proactivity is highly valued.

See more jobs at Dailymotion

Apply for this job

+30d

Junior Creative Marketing Specialist at HolyWater

Genesis | Kyiv, UA, Remote
tableau, airflow

Genesis is hiring a Remote Junior Creative Marketing Specialist at HolyWater

See more jobs at Genesis

Apply for this job

+30d

Sr. Data Engineer

InMarket | Remote - United States
agile, scala, airflow, Design, mobile, scrum, c++, python, AWS

InMarket is hiring a Remote Sr. Data Engineer

Job Title: Senior Data Engineer

Location: Remote - US Only

About InMarket

Since 2010, InMarket has been the leader in 360-degree consumer intelligence and real-time activation for thousands of today’s top brands. Through InMarket's data-driven marketing platform, brands can build targeted audiences, activate media in real time, and measure success in driving return on ad spend. InMarket's proprietary Moments offering outperforms traditional mobile advertising by 6x.* Our LCI attribution platform, which won the MarTech Breakthrough Award for Best Advertising Measurement Platform, was validated by Forrester to drive an average of $40 ROAS for our clients. 

*Source: Wordstream US Google Display Benchmarks for Mobile Media

 

About the Role

Join one of the fastest growing teams at InMarket as a Senior Data Engineer to help us transform the advertising industry using the latest innovations in technologies such as Scala, Python, and Spark, best practices like complete test coverage, and the best offerings of the AWS and GCP ecosystems.

In this role you will have the opportunity to work with the Data Engineering team to ensure all of InMarket’s products get the data they need, and provide powerful metrics and insights for business leaders. You would also have the opportunity to learn, further develop, and become an expert in Spark, BigQuery, and other big data platforms. Haven’t used some of these before? We believe continuous learning is a cornerstone value of any talented engineer, and what better way to learn than by building high-quality products?

 

Your Daily Impact as a Senior Data Engineer:

  • Work closely with other Data Engineering team members
  • Participate in daily scrum
  • Work closely with product and engineering leads to scope and complete projects
  • Work on data pipeline features across AWS and GCP to ensure products get the data they need in a fault-tolerant and testable manner
  • Maintain clear metrics and test coverage of our pipelines
  • Take ownership of features and lead all life cycle stages for them including requirement analysis, design, development, testing and deployment
  • Work closely with Cloud and other engineering teams on a rotation basis to handle reported bugs/issues with the platform
  • Perform code reviews for your peers
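A fault-tolerant, testable pipeline step of the kind described above can be illustrated with a pure-Python sketch (a stand-in for what would really run on Spark or BigQuery; the record schema is made up):

```python
def dedupe_latest(events):
    """Keep only the latest event per user_id.

    Fault-tolerant in the narrow sense that malformed records are
    skipped rather than crashing the whole batch.
    """
    latest = {}
    for event in events:
        try:
            uid, ts = event["user_id"], event["ts"]
        except (TypeError, KeyError):
            continue  # malformed record: skip instead of failing the run
        if uid not in latest or ts > latest[uid]["ts"]:
            latest[uid] = event
    return sorted(latest.values(), key=lambda e: e["user_id"])

events = [
    {"user_id": "a", "ts": 1},
    {"user_id": "a", "ts": 3},
    None,  # simulated corrupt row
    {"user_id": "b", "ts": 2},
]
result = dedupe_latest(events)
print(result)  # [{'user_id': 'a', 'ts': 3}, {'user_id': 'b', 'ts': 2}]
```

Because the logic is a pure function, it can be unit-tested directly, which is what keeps pipeline test coverage cheap to maintain.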

Your Expertise & Experience:

  • Bachelor’s Degree in Computer Science
  • 6+ years experience in software development
  • 3+ years experience in Spark, Scala, Python, Airflow
  • Experience testing data pipelines
  • Experience working with databases, distributed systems, concurrency
  • Experience working with Cloud IaaS (e.g. AWS, GCP)
  • Experience with Google Cloud BigQuery

Nice to Have:

  • Experience with ML
  • Experience with BigQuery
  • Experience with Scala

Benefits Summary

  • Competitive salary, stock options, flexible vacation
  • Medical, dental and Flexible Spending Account (FSA)
  • Company Matched 401(k)
  • Unlimited PTO (Within reason)
  • Talented co-workers and management
  • Agile Development Program (For continued learning/professional development)
  • Paid Paternity & Maternity Leave

 

inMarket is an Equal Opportunity Employer (EOE). Qualified applicants are considered for employment without regard to age, race, color, religion, sex, national origin, sexual orientation, disability, or veteran status.

For candidates in California, Colorado, and New York City, the Targeted Base Salary Range for this role is $135,000 to $168,480. 

Actual salaries will vary depending on factors including but not limited to work experience, specialized skills and training, performance in role, business needs, and job requirements. Base salary is subject to change and may be modified in the future. Base salary is just one component of InMarket’s total rewards package that also may include bonus, equity, and benefits.  Ask your recruiter for more information!

Privacy Notice for California Job Applicants: https://inmarket.com/ca-notice-for-job-applicants/

#LI-Remo

See more jobs at InMarket

Apply for this job

+30d

Sr. Software Engineer I, Infrastructure

Khan AcademyMountain View, CA / Remote friendly (US + Canada Only)
remote-firstairflowgraphqlc++typescriptredux

Khan Academy is hiring a Remote Sr. Software Engineer I, Infrastructure

ABOUT KHAN ACADEMY

Khan Academy is a nonprofit with the mission to deliver a free, world-class education to anyone, anywhere. Our proven learning platform offers free, high-quality supplemental learning content and practice that cover Pre-K - 12th grade and early college core academic subjects, focusing on math and science. We have over 155 million registered learners globally and are committed to improving learning outcomes for students worldwide, focusing on learners in historically under-resourced communities.

OUR COMMUNITY 

Our students, teachers, and parents come from all walks of life, and so do we. Our team includes people from academia, traditional/non-traditional education, big tech companies, and tiny startups. We hire great people from diverse backgrounds and experiences because it makes our company stronger. We value diversity, equity, inclusion, and belonging as necessary to achieve our mission and impact the communities we serve. We know that transforming education starts in-house with learning about ourselves and our colleagues. We strive to be world-class in investing in our people and commit to developing you as a professional.

THE ROLE

Currently we are focused on providing equitable solutions to historically under-resourced communities of learners and teachers, guided by our Engineering Principles. On the infrastructure team, you might work on projects such as:

  • Research, define, and implement metrics to measure our distributed system's cost, reliability and performance in order to keep our engineering teams focused on the most crucial work that will support millions of learners and teachers.
  • Scale the infrastructure powering our LLM tutor Khanmigo to support millions of learners and teachers.
  • Work across engineering teams and the broader organization to identify, define and promote engineering best practices around topics like incident management, observability, and error handling.
  • Explore and implement chaos engineering principles into our testing processes.
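A chaos-engineering experiment like the one mentioned above boils down to injecting faults and checking that the caller's retry or fallback path holds up. A toy sketch, with every name hypothetical:

```python
import random

def flaky(fn, failure_rate=0.2, seed=None):
    """Chaos-style wrapper: randomly raises to exercise retry/fallback paths."""
    rng = random.Random(seed)
    def wrapped(*args, **kwargs):
        if rng.random() < failure_rate:
            raise ConnectionError("injected fault")
        return fn(*args, **kwargs)
    return wrapped

def fetch():
    """Stand-in for a real service call."""
    return "ok"

def fetch_with_retry(fn, attempts=5):
    """The resilience behaviour under test: retry, then degrade gracefully."""
    for _ in range(attempts):
        try:
            return fn()
        except ConnectionError:
            continue
    return "fallback"

result = fetch_with_retry(flaky(fetch, failure_rate=0.5, seed=1))
print(result)
```

Seeding the fault injector makes each experiment reproducible, so a regression in the retry path shows up deterministically in tests.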

We strive to build using technology that is best suited to solving problems for our learners. Currently, we build with GCP, Go, GraphQL, TypeScript, React & React Native, Redux and we adopt new technologies when they’ll help us better achieve our goals. At Khan, one of our values is “Cultivate Learning Mindsets”, so for us, it’s important that we’re working with all of our engineers to help match the right opportunity to the right individual, in order to ensure every engineer is operating at their “learning edge”.

You can read about our latest work on our Engineering Blog.

WHAT YOU BRING

  • 5+ years of experience building and supporting highly scalable web services handling large volumes of requests per second, and working with distributed, eventually-consistent databases.
  • Strong technical project management skills, as our Senior Software Engineers often lead projects.
  • Strong communication, thoughtfulness, and desire to give and receive regular feedback.
  • Experience building and maintaining complex software. You’ll join us in writing clean, maintainable software that solves hard problems. You’ll write testable, quality code. You’ll push the team and the mission forward with your contributions.
  • Empathy for learners around the world. You love learning and are excited about helping others learn to love learning. You’re motivated to learn new things and share what you learn with the world.
  • Proven cross-cultural competency skills demonstrating self-awareness, awareness of others, and the ability to adopt inclusive perspectives, attitudes, and behaviors to drive inclusion and belonging throughout the organization.
  • Motivated by the Khan Academy mission “to provide a free world-class education for anyone, anywhere."

Note: We welcome candidates with experience in any and all technologies. We don’t require experience in any particular language or tool. Our commitment to on-boarding and mentorship means you won’t be left in the dark as you learn new technologies.

PERKS AND BENEFITS

We may be a non-profit, but we reward our talented team extremely well! We offer:

  • Competitive salaries
  • Ample paid time off as needed – Your well-being is a priority.
  • Remote-first culture that caters to your time zone, with open flexibility as needed
  • Generous parental leave
  • An exceptional team that trusts you and gives you the freedom to do your best
  • The chance to put your talents towards a deeply meaningful mission and the opportunity to work on high-impact products that are already defining the future of education
  • Opportunities to connect through affinity, ally, and social groups
  • And we offer all those other typical benefits as well: 401(k) + 4% matching & comprehensive insurance, including medical, dental, vision, and life

At Khan Academy we are committed to fair and equitable compensation practices, the well-being of our employees, and our Khan community. This belief is why we have built out a robust Total Rewards package that includes competitive base salaries, and extensive benefits and perks to support physical, mental, and financial well-being.

The target salary range for this position is $137,781 - $172,339 USD / $172,226 - $215,424 CAD. The pay range for this position is a general guideline only. The salary offered will depend on internal pay equity and the candidate’s relevant skills, experience, qualifications, and job market data. Exceptional performers in this role who make an outsized contribution can make well in excess of this range.  Additional incentives are provided as part of the complete total rewards package in addition to comprehensive medical and other benefits.

MORE ABOUT US

OUR COMPANY VALUES

Live & breathe learners

We deeply understand and empathize with our users. We leverage user insights, research, and experience to build content, products, services, and experiences that our users trust and love. Our success is defined by the success of our learners and educators.

Take a stand

As a company, we have conviction in our aspirational point of view of how education will evolve. The work we do is in service to moving towards that point of view. However, we also listen, learn and flex in the face of new data, and commit to evolving this point of view as the industry and our users evolve.

Embrace diverse perspectives

We are a diverse community. We seek out and embrace a diversity of voices, perspectives and life experiences leading to stronger, more inclusive teams and better outcomes. As individuals, we are committed to bringing up tough topics and leaning into different points of view with curiosity. We actively listen, learn and collaborate to gain a shared understanding. When a decision is made, we commit to moving forward as a united team.

Work responsibly and sustainably

We understand that achieving our audacious mission is a marathon, so we set realistic timelines and we focus on delivery that also links to the bigger picture. As a non-profit, we are supported by the generosity of donors as well as strategic partners, and understand our responsibility to our finite resources. We spend every dollar as though it were our own. We are responsible for the impact we have on the world and to each other. We ensure our team and company stay healthy and financially sustainable.

Bring out the joy

We are committed to making learning a joyful process. This informs what we build for our users and the culture we co-create with our teammates, partners and donors.

Cultivate learning mindset

We believe in the power of growth for learners and for ourselves. We constantly learn and teach to improve our offerings, ourselves, and our organization. We learn from our mistakes and aren’t afraid to fail. We don't let past failures or successes stop us from taking future bold action and achieving our goals.

Deliver wow

We insist on high standards and deliver delightful, effective end-to-end experiences that our users can rely on. We choose to focus on fewer things — each of which aligns to our ambitious vision — so we can deliver high-quality experiences that accelerate positive measurable learning with our strategic partners.

We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, gender, gender identity or expression, national origin, sexual orientation, age, citizenship, marital status, disability, or Veteran status. We value diversity, equity, and inclusion, and we encourage candidates from historically underrepresented groups to apply.

See more jobs at Khan Academy

Apply for this job

+30d

Data Engineer

scalaairflowsqljavapython

Chattermill is hiring a Remote Data Engineer

Data Engineer

UK/EU (Hybrid or Remote)

£Competitive

 

⭐️ Our Perks ⭐️ 

❤️ Monthly Health & Wellness budget, increasing with length of service

Annual Learning and Development budget

Flexible working in a choice first environment - we trust the way you want to work

WFH Equipment (let us know what you need and we’ll get it for you!)

25 Holiday Days + your local bank holidays, plus an extra day for every year of service

Your birthday off

Paid sick leave

Enhanced Family Leave (UK Only)

⚕️ Optional healthcare plan

The ability to share in the company’s success through options

Perks including discounts on cinema tickets, utilities and more

Annual Chattermill summits plus regular socials throughout the year

If you’re in London, a dog friendly office with great classes, events, and a rooftop terrace

 

The Role of Data Engineer

As our Data Engineer, you'll play a crucial part in integrating 3rd party customer data into our cutting-edge platform. You'll get hands-on experience with Python, Airbyte, DBT, Dagster, BigQuery, and Kafka, making you a key player in our data orchestration processes.

What you'll be doing as Data Engineer:

  • Building integrations to import 3rd party customer data using Python and various data orchestration tools (Airbyte, DBT, Dagster, etc.).
  • Developing and maintaining data transformations and mappings, ensuring data integrity and efficiency.
  • Using Python to write and optimise data integration and transformation pipelines.
  • Working with BigQuery and Kafka to manage large-scale data.
  • Supporting and diagnosing issues within data pipelines to ensure seamless data flow.
  • Collaborating closely with team members on various data engineering tasks.
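A transformation-and-mapping step like the one described above often reduces to a small, testable normalisation function. A sketch with an invented schema (the field names and the 0-5 to 0-10 rescaling are illustrative assumptions, not Chattermill's actual model):

```python
def map_feedback(raw):
    """Map a raw third-party feedback record onto an internal schema,
    normalising field names and the score scale (hypothetical fields)."""
    return {
        "source": raw.get("provider", "unknown"),
        "text": (raw.get("comment") or "").strip(),
        "score_0_10": round(float(raw["rating"]) * 2, 1),  # 0-5 scale to 0-10
    }

raw = {"provider": "nps_tool", "comment": " Great app ", "rating": "4.5"}
mapped = map_feedback(raw)
print(mapped)  # {'source': 'nps_tool', 'text': 'Great app', 'score_0_10': 9.0}
```

In practice a function like this would sit inside a Dagster asset or a DBT model, but keeping the mapping itself pure makes it easy to test in isolation.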

What you’ll need:

  • Data Expertise: A solid foundation in working with data is highly desirable, whether through a relevant degree, university-level projects, completion of a data-focused course, or equivalent work experience in a professional tech environment, such as a role as a data engineer, data analyst, or similar.
  • Coding Skills: Experience in Python and SQL coding is highly advantageous. Your ability to navigate and manipulate data through these languages will be an important part in this role.
  • Data Cleaning and Process Mapping: Demonstrated capability in cleaning and refining data, coupled with the ability to understand and map out processes.

It'd be a bonus if you have:

  • Knowledge or experience in other programming languages like Java, Scala, Golang, Rust, or Ruby.
  • Knowledge or experience in data engineering tools like Airbyte, DBT, Dagster, BigQuery, Kafka, and similar technologies is a significant bonus.
  • Familiarity with similar alternative technologies such as Airflow, Snowflake, Databricks, Redshift, Athena, Apache Spark, Fivetran, Prefect, Pandas, Polars, Parquet, DuckDB is also acceptable.

 

Chattermill - Who we are:

Co-founded by Mikhail Dubov and Dmitry Isupov in 2015 while at Entrepreneur First, Chattermill was born out of their frustration that it took weeks, sometimes months, for customer research to yield any quality insights. Often, these would be out of date by the time they reached decision-makers. And it was also financially out of reach for most companies.

When they started what eventually became Chattermill, they had a hunch that they could use the newly available tech of deep learning to help companies find insights amidst messy data. Their vision was to take what agencies and cutting-edge brands were doing by hand and automate it.

Today, our Unified Customer Intelligence platform is used by the world’s best-loved customer-centric companies including Uber, HelloFresh, Wise, and more, all of whom can now see, and act on their customer reality.

Our Mission, Vision, and Purpose 

  • Our mission is to empower teams to see their customers’ reality 
  • Our vision is to Analyse Over a Billion Pieces of Customer Feedback by 2027

 

Our Hiring Process

  1. Let’s introduce ourselves – you’ll have an introductory call with our Talent team - we’d love to learn more about you, your ambitions, and what you’re looking for in your next step. 
  2. Get to know your would-be team – you’ll have a call with your would-be manager, Dean Cornish (Engineering Manager - Data Platform), to learn more about the role and show off your experience 
  3. Show us what you are made of – you’ll complete a short task, which you’ll then run through on a call with the team 
  4. How our values and your career goals align – you’ll have a call with our cofounder to learn more about life at Chattermill and ensure we’re the right place for your next stage of growth

 

Our Values

We are obsessed with experience – We take our mission to rid the world of bad Customer Experience seriously, and we practice what we preach.

We believe in the power of trust – Whether it's with each other, our customers, partners, or other stakeholders, we always communicate with openness and trust.

We act as responsible owners – Whether it's about the company, a team, a project, or a task, having the freedom to make decisions in our area of responsibility is a crucial driver for us.

We share a passion for growth & progress – On every level, we’re motivated by taking on new challenges – even if they seem out of reach. We recognise that we are learning machines and we always seek to action feedback and improve collectively.

We set our ambitions high but stay humble – We've come together to build a product and a category that’s never been seen before. While we're an ambitious bunch with lofty goals, we don't approach this goal recklessly.

We believe the right team is the key to success – At Chattermill we’ve learned that all our important achievements have been the result of the right people collaborating together – that’s why we need you to apply today!

 

Diversity & Inclusion

We want to enable exceptional experiences for everyone, and to achieve this we need everyone’s voice in our team.  We are on a mission to bring more diversity into the business in 2023 and to give everyone (from all backgrounds and abilities) a chance to join us, even if they may not fit all of the requirements set out in this job spec.

We realise that some may be hesitant to apply for a role when they don’t meet 100% of the listed requirements – we believe in potential and will happily consider all applications based on the skills and experience you have, we’d love to be part of your growth and we encourage you to apply!

We believe in removing unconscious biases from our recruitment process wherever possible.  As part of this effort, we ask that you do not include your photograph or personal details with your application.

 

 

 

Key words: Data Engineer, Data Ops, Data Analyst, Coding, Python, SQL, Data Expert, Data Cleaning, Process Mapping, Data Engineering, Airbyte, DBT, Dagster, BigQuery, Kafka, Graduate, Data Ops Executive

See more jobs at Chattermill

Apply for this job

+30d

Senior ML Engineer

SecurityScorecardRemote - Canada
Bachelor's degreeterraformairflowpostgresDesignansibleazuregitc++dockerkubernetesjenkinspythonAWS

SecurityScorecard is hiring a Remote Senior ML Engineer

About SecurityScorecard:

SecurityScorecard is the global leader in cybersecurity ratings, with over 12 million companies continuously rated, operating in 64 countries. Founded in 2013 by security and risk experts Dr. Alex Yampolskiy and Sam Kassoumeh and funded by world-class investors, SecurityScorecard’s patented rating technology is used by over 25,000 organizations for self-monitoring, third-party risk management, board reporting, and cyber insurance underwriting; making all organizations more resilient by allowing them to easily find and fix cybersecurity risks across their digital footprint. 

Headquartered in New York City, our culture has been recognized by Inc Magazine as a "Best Workplace,” by Crain’s NY as a "Best Places to Work in NYC," and as one of the 10 hottest SaaS startups in New York for two years in a row. Most recently, SecurityScorecard was named to Fast Company’s annual list of the World’s Most Innovative Companies for 2023 and to the Achievers 50 Most Engaged Workplaces in 2023 award recognizing “forward-thinking employers for their unwavering commitment to employee engagement.” SecurityScorecard is proud to be funded by world-class investors including Silver Lake Waterman, Moody’s, Sequoia Capital, GV and Riverwood Capital.

About the Role:

We are seeking an experienced Senior ML Engineer to join our Data Science team. In this role, you will work with a cross-functional team of ML engineers, data engineers, and data science researchers, collaborating with other experts to design, build, deploy, and operate production pipelines and microservice systems with a focus on ML best practices. You will build and manage infrastructure including feature stores, data mesh, and our AI platform, creating automation for training, delivery, and updating of our machine learning models. If you're a problem solver, effective communicator, and enthusiastic about driving advancements in AI and ML in the security space, we want you on our team.

What You'll Do:

  • Own and lead the creation, operation and maintenance of critical infrastructure projects and automation for the data science team to empower data science research and ML model delivery.
  • Train and mentor team members in applying best practices in operations and security.
  • Provide code reviews and feedback on Github pull requests.
  • Identify opportunities for technical and process improvement and implementation.
  • Knowledge and application of best practices such as immutable containers, Infrastructure as Code, stateless applications, and software observability.
  • Tune large-scale distributed system performance to achieve SLA metrics such as stability, uptime, scalability, and low latency while keeping costs under control.
  • Continuous improvement of CI/CD processes to automate builds and deployments.
  • Collaborate with scientists and engineers to understand KPIs and configure observability, monitoring, and alerting to support operations.
  • Setup Terraform / Kubernetes and associated tooling to support data pipelines, feature stores, data mesh and delivery of machine learning models.
  • Diagnose and correct networking issues, or communicate problems clearly enough that centralized IT teams can resolve them.
  • Decompose system layer abstractions to investigate and determine root cause issues and resolve complex distributed system performance problems. 
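Tuning toward SLA metrics starts with measuring them; for latency that usually means percentiles rather than averages. A minimal nearest-rank percentile sketch (the sample values are made up):

```python
import math

def latency_percentile(samples_ms, pct):
    """Nearest-rank percentile, the simple method behind many SLA checks."""
    if not samples_ms:
        raise ValueError("no samples")
    ordered = sorted(samples_ms)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

# One slow outlier dominates the tail even though the median looks fine.
samples = [12, 15, 11, 250, 14, 13, 16, 12, 15, 13]
p50 = latency_percentile(samples, 50)
p95 = latency_percentile(samples, 95)
print(f"p50={p50}ms p95={p95}ms")  # p50=13ms p95=250ms
```

In production the same quantiles would come from a monitoring stack such as Prometheus or DataDog, but the underlying arithmetic is this simple.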

What We Need You To Have:

  • 4-5+ years experience in ML / DevOps in the cloud (AWS, GCP, or Azure).
  • Experience with Apache Spark and big data streaming infrastructure (data lakes, Snowflake, Databricks, S3).
  • Production environment experience with Amazon Web Services (AWS) or equivalent.
  • Experience supporting data stores such as RDBMS (Postgres), KVS (Cassandra / ScyllaDB) and queues / streaming (Kafka).
  • Skilled with Terraform, Git, Python, bash / shell scripting, and Docker containers.
  • Experienced with CI/CD processes (Jenkins, Ansible) and automated configuration tools (Terraform, Ansible, etc.).
  • Experience setting up container orchestration (AWS ECS, Kubernetes / K8s).
  • Skilled with dashboard creation and monitoring with tools such as Prometheus and DataDog.
  • Capable of planning out future infrastructure and projecting timelines.
  • Ability to work with our highly collaborative team.
  • Strong written and verbal communication skills.
  • Willingness to teach and mentor others.

Preferred Qualifications:

  • You have a bachelor’s degree or greater in computer science, STEM, or a related field.
  • You’ve implemented data mesh and feature stores.
  • Strong understanding of networking concepts, including OSI layers, firewalls, DNS, split-horizon DNS, VPN, routing, BGP, etc.
  • Skilled with tools such as Ray, Airflow, Argo, Kubeflow, MLFlow, and vector databases.

Benefits:

Specific to each country, we offer a competitive salary, stock options, Health benefits, and unlimited PTO, parental leave, tuition reimbursements, and much more!

SecurityScorecard is committed to Equal Employment Opportunity and embraces diversity. We believe that our team is strengthened through hiring and retaining employees with diverse backgrounds, skill sets, ideas, and perspectives. We make hiring decisions based on merit and do not discriminate based on race, color, religion, national origin, sex or gender (including pregnancy) gender identity or expression (including transgender status), sexual orientation, age, marital, veteran, disability status or any other protected category in accordance with applicable law. 

We also consider qualified applicants regardless of criminal histories, in accordance with applicable law. We are committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need assistance or accommodation due to a disability, please contact talentacquisitionoperations@securityscorecard.io.

Any information you submit to SecurityScorecard as part of your application will be processed in accordance with the Company’s privacy policy and applicable law. 

SecurityScorecard does not accept unsolicited resumes from employment agencies.  Please note that we do not provide immigration sponsorship for this position.

 

See more jobs at SecurityScorecard

Apply for this job