Data Engineer Remote Jobs

88 Results

8h

Principal Data Engineer

slice | Belfast or remote UK
scala, nosql, sql, design, java, python

slice is hiring a Remote Principal Data Engineer

UK Remote or Belfast

Serial tech entrepreneur Ilir Sela started Slice in 2010 with the belief that local pizzerias deserve all of the advantages of major franchises without compromising their independence. Starting with his family’s pizzerias, we now empower over 18,000 restaurants (that’s nearly triple Domino’s U.S. network!) with the technology, services, and collective power that owners need to better serve their digitally minded customers and build lasting businesses. We’re growing and adding more talent to help fulfil this valuable mission. That’s where you come in.

 

The Challenge to Solve

Provide Slice with up-to-date data to grow the business and empower independent pizzeria owners to make the best data-driven decisions, delivering insights that ensure future success.

 

The Role

You will be responsible for leading data modelling and dataset development across the team. You’ll be at the forefront of our data strategy, partnering closely with business and product teams to fuel data-driven decisions throughout the company. Your leadership will guide our data architecture expansion, ensuring smooth data delivery and maintaining top-notch data quality. Drawing on your expertise, you’ll steer our tech choices and keep us at the cutting edge of the field. You’ll get to code daily and share your insights into best practices with the rest of the team.

 

The Team

You’ll work with a team of skilled data engineers daily, providing your expertise to their reviews, as well as working on your own exciting projects with teams across the business. You’ll have a high degree of autonomy and the chance to impact many areas of the business. You will optimise data flow and collection for cross-functional teams and support software developers, business intelligence, and data scientists on data initiatives, using this work to support product launches and Marketing efforts to grow the business. This role reports to the Director of Data Engineering.

 

The Winning Recipe

We’re looking for creative, entrepreneurial engineers who are excited to build world-class products for small business owners. These are the core competencies this role calls for:

  • Strong track record of designing and implementing modern cloud data processing architectures using programming languages such as Java, Scala, or Python and technologies like Spark
  • Expert-level SQL skills
  • Extensive experience in data modelling and design, building out the right structures to deliver for various business and product domains
  • Strong analytical abilities and a history of using data to identify opportunities for improvement and where data can help drive the business towards its goals
  • Experience with message queuing, stream processing using frameworks such as Flink or KStreams, and highly scalable big data stores, as well as storage and query pattern design with NoSQL stores
  • Proven leadership skills, with a track record of successfully leading complex engineering projects and mentoring junior engineers, as well as working with cross-functional teams and external stakeholders in a dynamic environment
  • Familiarity with serverless technologies and the ability to design and implement scalable and cost-effective data processing architectures

 

The Extras

Working at Slice comes with a comprehensive set of benefits, but here are some of the unexpected highlights:

  • Access to medical, dental, and vision plans
  • Flexible working hours
  • Generous time off policies
  • Annual conference attendance and training/development budget
  • Market leading maternity and paternity schemes
  • Discounts for local pizzerias (of course)

 

The Hiring Process

Here’s what we expect the hiring process for this role to be, should all go well with your candidacy. The entire process is expected to take 1-3 weeks to complete, and you’d be expected to start on a specific date.

  1. 30 minute introductory meeting
  2. 30 minute hiring manager meeting
  3. 60 minute pairing interview
  4. 45 minute interview
  5. 30 minute CTO interview
  6. Offer!

Pizza brings people together. Slice is no different. We’re an Equal Opportunity Employer and embrace a diversity of backgrounds, cultures, and perspectives. We do not discriminate on the basis of race, colour, gender, sexual orientation, gender identity or expression, religion, disability, national origin, protected veteran status, age, or any other status protected by applicable national, federal, state, or local law. We are also proud members of the Diversity Mark NI initiative as a Bronze Member.

See more jobs at slice

Apply for this job

2d

Senior Data Engineer

Nile Bits | Cairo, Egypt, Remote
agile, airflow, sql, design, docker, linux, python, AWS

Nile Bits is hiring a Remote Senior Data Engineer

Job Description

  • Designing and implementing core functionality within our data pipeline in order to support key business processes
  • Shaping the technical direction of the data engineering team
  • Supporting our Data Warehousing approach and strategy
  • Maintaining our data infrastructure so that our jobs run reliably and at scale
  • Taking responsibility for all parts of the data ecosystem, including data governance, monitoring and alerting, data validation, and documentation
  • Mentoring and upskilling other members of the team

Qualifications

  • Experience building data pipelines and/or ETL processes
  • Experience working in a Data Engineering role
  • Confident writing performant and readable code in Python, building upon the rich Python ecosystem wherever it makes sense to do so
  • Good software engineering knowledge and skills: OO programming, design patterns, SOLID design principles, and clean code
  • Confident writing SQL, with a good understanding of database design
  • Experience working with web APIs
  • Experience leading projects from a technical perspective
  • Knowledge of Docker, shell scripting, and working with Linux
  • Experience with a cloud data warehouse
  • Experience managing deployments and implementing observability and fault tolerance in cloud-based infrastructure (e.g. CI/CD, Infrastructure as Code, container-based infrastructure, auto-scaling, monitoring and alerting)
  • Proactive with a self-starter mindset; able to identify elegant solutions to difficult problems and to suggest new and creative approaches
  • Analytical and an effective problem-solver and communicator, leveraging technology and subject matter expertise to accelerate our roadmap
  • Able to lead technical discussions, shape the direction of the team, and identify opportunities for innovation and improvement
  • Able to lead and deliver projects, ensuring stakeholders are kept up to date through regular communication
  • Willing to support the rest of the team when necessary, sharing knowledge and best practices, documenting design decisions, etc.
  • Willing to step outside your comfort zone to broaden your skills and learn new technologies
  • Experience working with open-source orchestration frameworks like Airflow or data analytics tools such as dbt
  • Experience with AWS services or those of another cloud provider
  • Experience with Snowflake
  • Good understanding of Agile

See more jobs at Nile Bits

Apply for this job

8d

Data Engineer

Metiora | Madrid, Spain, Remote
nosql, sql, azure, git, c++, docker, linux, python, AWS

Metiora is hiring a Remote Data Engineer

Job Description

We are looking for an #exceptional Data Engineer who can understand our clients’ challenges, make them their own, and help us build long-term relationships with them, guaranteeing the success and execution of projects. Responsibilities include:

  • Developing integration processes with our clients so that their data can be exploited from our MINEO platform and the client’s cloud platforms (Azure, AWS, GCP)
  • Helping to improve our current integration (ETL) tools
  • Understanding the client’s data and anticipating problems before they arise
  • Analysing the changes and new features to be developed
  • Performing code reviews of work carried out by colleagues

What do we expect from your professional profile?

  • Degree in a STEM field: Mathematics, Statistics, Telecommunications Engineering, or Computer Science
  • Additional training in data science
  • 2-5 years of experience on real-world projects
  • Proactivity and a passion for technology
  • Eagerness to work in a team
  • Intellectual curiosity and persistence in solving problems

Requirements

  • Bachelor’s or Master’s degree in Computer Engineering or a similar qualification

  • Background in software development; you must be able to understand and write code, especially Python

  • Advanced command of SQL and NoSQL databases

  • Advanced knowledge of algorithms

  • Knowledge of Git

Soft skills:

  • Eagerness to work in a team

  • Focus on quality, scalability, and clean code

  • Intellectual curiosity and persistence in solving problems

  • Good level of English

  • Proactivity and a passion for technology

Highly valued:

  • Containerisation solutions (Docker)

  • Knowledge of algorithms for processing large volumes of data

  • Knowledge of Linux systems

See more jobs at Metiora

Apply for this job

8d

Data Engineer Intern

Progress | Hybrid Remote, Sofia, Bulgaria
sql

Progress is hiring a Remote Data Engineer Intern

Progress is an experienced, trusted provider of products designed with customers in mind, so they can develop the applications they need, deploy where and how they want and manage it all safely and securely. We take pride in what we do, always valuing the whole person—at work and in life. Our diverse life experiences enrich our culture because people power progress. 

The Data Warehouse team is responsible for extracting data from various business systems and consolidating it into a centralized location optimized for reporting and analysis. We closely collaborate with our reporting and analytical counterparts to ensure we have the necessary data to satisfy the needs of top management, finance, sales, marketing, and support organizations. You will have the opportunity to join a dynamic and collaborative team environment, working alongside experienced professionals and utilizing cutting-edge technologies.

As a Data Warehouse Intern, you will have the opportunity to gain valuable hands-on experience in managing, organizing, and analyzing data within our company's data warehouse. You will work closely with our data engineering and analytics teams to support various data-related initiatives and projects.  

Some of the key learning areas include:

  • Gain understanding of data warehousing principles, including data modeling, ETL processes, data storage, and retrieval mechanisms.
  • Enhance your SQL skills by writing and optimizing queries to extract, manipulate, and analyze data stored in the data warehouse.
  • Work with various database management systems (DBMS) and understand their functionalities, such as data indexing, partitioning, and optimization.
  • Learn how to ensure data accuracy, consistency, and completeness through quality assurance procedures and data validation techniques.
  • Strengthen your analytical and problem-solving skills by tackling real-world data challenges and finding innovative solutions.
  • Benefit from mentorship opportunities and exposure to real-world projects that will help you develop professionally and prepare for a successful career in data management and analytics.

What you’ll do in this role:   

  • Assist in the development and maintenance of data warehouse infrastructure and processes.
  • Collaborate with data engineers and analysts to understand data requirements and assist in designing data models.             
  • Support the implementation and integration of new data sources into the data warehouse.
  • Assist in creating and maintaining documentation related to data warehouse processes and procedures.
  • Collaborate with cross-functional teams to understand business needs and provide data-driven insights.

About you:  

  • Having or currently pursuing a degree in Computer Science, Information Systems, Data Science, or a related field.
  • Basic understanding of relational database concepts and SQL.
  • Strong verbal and written communication skills in English.
  • Good analytical and problem-solving skills.
  • Strong attention to detail and ability to work independently as well as part of a team.
  • Eagerness to learn and a proactive attitude towards tackling new challenges.

What we offer in return is the opportunity to join a talented team of bright and nice people and also to enjoy:  

  • A generously paid internship program   
  • 15 days’ vacation for 6 months + an extra day off for your birthday  
  • A dedicated mentor and a detailed onboarding plan to get up to speed;  
  • A possibility of future job opportunities based on performance and hiring needs after the internship 
  • Premium healthcare and dental care coverage  
  • A modern office with a well-equipped gym onsite 


Together, We Make Progress

Progress is an inclusive workplace where opportunities to succeed are available to everyone. As a multicultural company serving a global community, we encourage a wide range of points of view and celebrate our diverse backgrounds. Our unique combination of perspectives inspires innovation, connects us to our customers and positively affects our communities. It is only by working together and learning from each other that we make Progress. Join us!

See more jobs at Progress

Apply for this job

8d

Data Engineer Azure

4 years of experience, 2 years of experience, agile, Bachelor's degree, tableau, jira, scala, airflow, postgres, sql, oracle, design, mongodb, pytest, azure, mysql, jenkins, python, AWS

FuseMachines is hiring a Remote Data Engineer Azure

8d

Data Services Engineer

Devoteam | Warszawa, Poland, Remote
agile, nosql, design, azure, java, python, AWS

Devoteam is hiring a Remote Data Services Engineer

Job Description

We are looking for a highly motivated Data Engineer to use their experience in the public cloud to interpret our customers’ needs and extract value from their data using GCP and its suite of tools. You will primarily work as part of the data team.

You will be involved in pre-sales activities, building upon our customers’ cloud infrastructure and creating analytics in the cloud. You will design and construct data pipelines and architecture using GCP products, and help turn big data into valuable business insights using Machine Learning. Programming and preparing custom solutions optimised for each client will be an essential part of your job. We believe that working with data is a passion you’ll want to share with others.

We want you to know and advocate for Google Cloud Platform and its products. Knowledge of equivalent platforms such as AWS or Azure would be a valuable asset and make you a strong candidate.

It’s a customer-facing role, so you must be outgoing and confident in your ability to interact with clients, manage workshops and execute necessary training on your own.

Qualifications

  • BA/BS in Computer Science or related technical field, or equivalent experience
  • Experience in one or more development languages, with a strong preference for Python, Java
  • Good knowledge of Power BI 
  • Basic knowledge of Bash 
  • Experience with programming in agile methodology
  • Knowledge of database and data analysis technologies, including relational and NoSQL databases, data warehouse design, and ETL pipeline.
  • Fluency in English (both written and spoken)

Nice to have:

  • Experience in drawing UML diagrams
  • Experience in Hadoop and using platforms such as Apache Spark, Pig, or Hive
  • Fluency in Polish (both written and spoken)

See more jobs at Devoteam

Apply for this job

11d

Data Engineer

Legalist | Remote
agile, nosql, sql, design, c++, docker, kubernetes, AWS

Legalist is hiring a Remote Data Engineer

About Legalist:

Legalist is an institutional alternative asset management firm. Founded in 2016 and incubated at Y Combinator, the firm uses data-driven technology to invest in credit assets at scale. We are always looking for talented people to join our team.

As a highly collaborative organization, our data engineers work cross-functionally with software engineering, data science, and product management to optimize growth and strategy of our data pipeline. In this position, you will be joining the data engineering team in an effort to take our data pipeline to the next level.

Where you come in:

  • Design and develop scalable data pipelines to collect, process, and analyze large volumes of data efficiently.
  • Collaborate with cross-functional teams including data scientists, software engineers, and product managers to understand data requirements and deliver solutions that meet business needs.
  • Develop ELT processes to transform raw data into actionable insights, leveraging tools and frameworks such as Airbyte, BigQuery, Dagster, DBT or similar technologies.
  • Participate in agile development processes, including sprint planning, daily stand-ups, and retrospective meetings, to deliver iterative improvements and drive continuous innovation.
  • Apply best practices in data modeling and schema design to ensure data integrity, consistency, and efficiency.
  • Continuously monitor and optimize data pipelines and systems for performance, availability, scalability, and cost-effectiveness.

What you’ll be bringing to the team:

  • Bachelor’s degree (BA or BS) or equivalent.
  • A minimum of 2 years of work experience in data engineering or similar role.
  • Advanced SQL knowledge and experience working with a variety of databases (SQL, NoSQL, Graph, Multi-model).
  • A minimum of 2 years of professional experience with ETL/ELT, data modeling, and Python.
  • Familiarity with cloud environments like GCP, AWS, as well as cloud solutions like Kubernetes, Docker, BigQuery, etc.
  • You have a pragmatic, data-driven mindset and are not dogmatic or overly idealistic about technology choices and trade-offs.
  • You have an aptitude for learning new things quickly and have the confidence and humility to ask clarifying questions.

Even better if you have, but not necessary:

  • Experience with one or more of the following: data processing automation, data quality, data warehousing, data governance, business intelligence, data visualization.
  • Experience working with TB scale data.

See more jobs at Legalist

Apply for this job

13d

Senior Data Engineer

Indigo | Remote with Frequent Travel
sql, design, python, AWS, javascript, Node.js

Indigo is hiring a Remote Senior Data Engineer

Company Description

Healthcare providers spend roughly $20B annually on premiums for medical professional liability (“MPL”) insurance, $5-6B of which is spent for physicians. Incumbent carriers utilize outdated risk selection, underwriting, and sales processes. For the first time in 10 years, the MPL market is in a hardening cycle. While incumbent carriers raise rates to make up for underwriting losses, the environment is ripe for an innovative disruptor to capture market share by deploying artificial intelligence.

Rubicon Founders, an entrepreneurial firm focused on building and growing transformational healthcare companies, has launched Indigo to address this issue. Backed by Oak HC/FT, this company will disrupt the medical professional liability market by offering more competitive products to providers. Benefitting from significant potential cost savings as a result, providers may reallocate resources to invest more directly in patient or staff care. This company intends to fulfill Rubicon Founders’ mission of creating enduring value by impacting people in a measurable way.

Position Description

Are you an expert data engineer with a passion for making a difference in the healthcare industry? If so, we want you to join our team at Indigo, where we're using technology to transform medical professional liability insurance. Our AI driven underwriting process and modern technology streamline the insurance process and equip quality physicians and groups with the coverage they need.

As a senior data engineer you will own the various data components of our system, which includes our data model architecture, our data pipeline / ETL, and any ML data engineering needs. You will help us leverage existing data platforms and databases with best practices to ensure that the data strategy is coherent, that our data systems are highly available, secure, and scalable. You will work with cross-functional teams including business stakeholders, product managers, and data scientists to ensure our data pipeline system meets business requirements and adheres to best practices.  

 

We are a remote distributed team who gathers with intention. You will thrive here if you find energy working from home and getting together to build relationships. At Indigo, you'll have the opportunity to contribute to a meaningful mission that makes a difference in the lives of healthcare providers and their patients.

 

Responsibilities:

  • Design and build the Indigo data pipeline and ETL process.
  • Design and build the Indigo application datastore architecture and implementation 
  • Define a governance and lineage strategy.
  • Define and implement security across our data systems.
  • Participate in code reviews and ensure the code is of high quality.
  • Collaborate with cross-functional teams including product and engineering to understand business requirements and translate them into technical requirements.
  • Keep up to date with emerging trends in data engineering and contribute to improving our development processes.

Requirements:

  • Bachelor’s degree in CS or similar; masters advantageous
  • 5+ years of data engineering experience 
  • Expert SQL and Python or JavaScript (Node.js)
  • Experience with AWS services such as EC2, Lambda, RDS, DynamoDB, and S3.
  • Experience with Snowflake, dbt
  • Strong understanding of data design principles
  • Experience designing and building data products from concept to productization.
  • Experience with automation in the processes of data definition, migration.
  • Excellent design and problem-solving skills.
  • Self starter, highly autonomous, thrives in a remote distributed environment



See more jobs at Indigo

Apply for this job

13d

Senior Data Engineer

airflow, postgres, sql, oracle, design, docker, mysql, kubernetes, python, AWS

ReCharge Payments is hiring a Remote Senior Data Engineer

Who we are

In a world where acquisition costs are skyrocketing, funding is scarce, and ecommerce merchants are forced to do more with less, the most innovative DTC brands understand that subscription strategy is business strategy.

Recharge is simplifying retention and growth for innovative ecommerce brands. As the #1 subscription platform, Recharge is dedicated to empowering brands to easily set up and manage subscriptions, create dynamic experiences at every customer touchpoint, and continuously evaluate business performance. Powering everything from no-code customer portals, personalized offers, and customizable bundles, Recharge helps merchants seamlessly manage, grow, and delight their subscribers while reducing operating costs and churn. Today, Recharge powers more than 20,000 merchants serving 90 million subscribers, including brands such as Blueland, Hello Bello, CrunchLabs, Verve Coffee Roasters, and Bobbie—Recharge doesn’t just help you sell products, we help build buyer routines that last.

Recharge is recognized on the Technology Fast 500, awarded by Deloitte, (3rd consecutive year) and is Great Place to Work Certified.

Overview

The centralized Data and Analytics team at Recharge delivers critical analytic capabilities and insights for Recharge’s business and customers. 

As a Senior Data Engineer, you will build scalable data pipelines and infrastructure that power internal business analytics and customer-facing data products.  Your work will empower data analysts to derive deeper strategic insights from our data, and will  enable developers to build applications that surface data insights directly to our merchants. 

What you’ll do

  • Build data pipeline, ELT and infrastructure solutions to power internal data analytics/science and external, customer-facing data products.

  • Create automated monitoring, auditing and alerting processes that ensure data quality and consistency.

  • Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models

  • Design, develop, implement, and optimize existing ETL processes that merge data from disparate sources for consumption by data analysts, business owners, and customers

  • Seek ways to continually improve the operations, monitoring and performance of the data warehouse

  • Influence and communicate with all levels of stakeholders including analysts, developers, business users, and executives.

  • Live by and champion our values: #day-one, #ownership, #empathy, #humility.

What you’ll bring

  • Typically, 5+ years experience in a data engineering related role (Data Engineer, Data Platform Engineer, Analytics Engineer etc) with a track record of building scalable data pipeline, transformation, and platform solutions. 

  • 3+ years of hands-on experience designing and building data pipelines and models to ingest, transform, and deliver large amounts of data from multiple sources into a Dimensional (Star Schema) Data Warehouse or Data Lake.

  • Experience with a variety of data warehouse, data lake, and enterprise data management platforms (Snowflake {preferred}, Redshift, Databricks, MySQL, Postgres, Oracle, RDS, AWS, GCP)

  • Experience building data pipelines, models and infrastructure powering external, customer-facing (in addition to internal business facing) analytics applications.

  • Solid grasp of data warehousing methodologies like Kimball and Inmon

  • Experience working with a variety of ETL tools (Fivetran, dbt, Python, etc.)

  • Experience with workflow orchestration management engines such as Airflow & Cloud Composer

  • Hands on experience with Data Infra tools like Kubernetes, Docker

  • Expert proficiency in SQL

  • Strong Python proficiency

  • Experience with ML Operations is a plus.

Recharge | Instagram | Twitter | Facebook

Recharge Payments is an equal opportunity employer. In addition to EEO being the law, it is a policy that is fully consistent with our principles. All qualified applicants will receive consideration for employment without regard to status as a protected veteran or a qualified individual with a disability, or other protected status such as race, religion, color, national origin, sex, sexual orientation, gender identity, genetic information, pregnancy or age. Recharge Payments prohibits any form of workplace harassment. 

Transparency in Coverage

This link leads to the Anthem Blue Cross machine-readable files that are made available in response to the federal Transparency in Coverage Rule and includes network negotiated rates for all items and services; allowed amounts for OON items, services and prescription drugs; and negotiated rates and historical prices for network prescription drugs (delayed). EIN 80-6245138. This link leads to the Kaiser machine-readable files.


See more jobs at ReCharge Payments

Apply for this job

14d

Director, Data Engineering

Brightcove | US - Remote
agile, scala, nosql, java, c++, kubernetes, python

Brightcove is hiring a Remote Director, Data Engineering

The Data Engineering team designs, develops, and operates big data services and tools, from concept to delivery, to provide reporting and analytics for our customers and for the business. We do this on a massive scale, capturing billions of events a day from our video players across the globe. Our team is responsible for deploying to production while delivering differentiated value to our customers. We are seeking a seasoned leader and data practitioner with strong technical depth to drive engineering platforms, initiatives, and partnerships across the company. In this role, reporting to the SVP, Head of Data, you will be responsible for leading an organizational effort to build and scale state-of-the-art data platforms to solve various data-driven business use cases and needs. These data platforms will power data products and data initiatives across the company, including Video, Billing, Financials, Ad Monetization, Audience Insights, Product Analytics, Marketing, Customer Success, and Personalization.

Job Responsibilities 

  • Hire, develop, motivate, inspire and lead a team of data engineers and architects
  • Develop, architect, propose, integrate and implement core data engineering, business intelligence, and data warehouse frameworks, systems, tools, roadmaps, goals, and best practices in support of key company data initiatives
  • Provide technical thought leadership to cross-functional stakeholders and leaders to ensure team priorities and output directly impacts company strategy, direction and growth
  • Provide technical and architectural oversight for systems and projects that are required to be reliable, massively scalable, highly available (99.999% uptime) and maintainable.
  • Champion security and governance and ensure data engineering team adheres to all company guidelines
  • Manage data infrastructure/cloud cost and responsible for improving efficiency and performance
  • Introduce best practices and principles to enable consistent delivery and enable alignment with long term direction.
  • Communicate effectively and present to all levels of the organization and to technical and non-technical audiences.
  • Develop deep partnerships with cross-functional teams with varying backgrounds, from the highly technical to the highly creative

Qualifications/Experience 

  • Bachelor’s degree in Engineering, Computer Science or similar discipline. Advanced degree is a plus.
  • 10+ years of experience in Data Engineering or Engineering fields with increasing responsibility and strong track record of technical competencies and business strategy, both as an individual contributor and managing teams.
  • 5+ years of experience in building, managing, and leading a high-performing data engineering team
  • Ability to develop and execute roadmaps and state-of-the-art data platforms to drive and support decisions and initiative across the organization
  • Track record of implementing and scaling secure data platforms for startups, enterprises, or large/public companies
  • Strengths in at least a few programming languages - Python, Java, Scala 
  • Expertise in building and managing large volume data processing (both streaming and batch) platforms
  • Expertise in scaling data environments with distributed/RT systems, e.g. Apache Spark, Flink, and self-serve visualization environments
  • Expertise in cloud based and SaaS data warehouse (e.g. AWS/GCP, Snowflake, NoSQL) and developing ETL/ELT pipelines
  • Experience integrating and building data platform in support of BI (Tableau/Looker), Analytics, Data Science, and real-time applications
  • Thrive on empowering team members and view the team's success as your own
  • Expertise in stream processing systems such as Kafka, Kinesis, Pulsar or Similar is a plus
  • Experience in building micro services and managing containerized deployments, preferably using Kubernetes
  • Experience with agile methodologies
  • Video streaming experience across OTT, Live and Video on Demand platforms is highly desired
  • Experience with video streaming monetization platforms and data driven decision making for AVOD, SVOD, TVOD, FAST and Ad optimization is highly desired
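The streaming- and batch-processing expertise the qualifications above call for (Spark, Flink, Kafka) centers on patterns like windowed aggregation. As a hedged, stdlib-only sketch of the core idea — the event fields and window size here are illustrative, not from the posting:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed, non-overlapping time
    windows and count occurrences per key -- the basic pattern that
    engines like Spark, Flink, or Kafka Streams implement at scale."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical playback events: (seconds since start, event name)
events = [(3, "play"), (45, "pause"), (61, "play"), (119, "play"), (120, "stop")]
result = tumbling_window_counts(events, window_secs=60)
```

A real platform adds out-of-order handling, watermarks, and distributed state, but the window-alignment arithmetic is the same.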

About Brightcove 

Brightcove is a diverse, global team of smart, passionate people who are revolutionizing the way organizations deliver video. We’re hyped up about storytelling, and about helping organizations reach their audiences in bold and innovative ways. When video is done right, it can have a powerful and lasting effect. Hearts open. Minds change. 

Since 2004, Brightcove has been supporting customers that are some of the largest media companies, enterprises, events, and non-profit organizations in the world. There are over 600 Brightcovers globally, each of us representing our unique talents, and we have built a culture that values authenticity, individual empowerment, excellence, and collaboration. This culture enables us to harness the incredible power of video and create an environment where you will want to grow, stay, and thrive. Bottom line: We take our video seriously, and we take great pride in doing it as #oneteam.

WORKING AT BRIGHTCOVE 

We strive to provide our employees with an environment where they can do their best work and be their best selves. This includes a focus on our employees’ work experience, actively creating a culture where inclusion and growth are at the center, and hiring, recognizing, and promoting employees who are committed to living and breathing these same ideals.

While remote work arrangements are available for most positions we also offer hybrid or on-site working options in our Boston office, located in beautiful Fort Point harbor. Employees enjoy access to fully-stocked kitchens and social activities including: happy hours, trivia, ping pong tournaments, and events and celebrations of all kinds. You will have plenty of opportunities to meet your colleagues around the globe as we also celebrate a variety of personal interests with organized groups and clubs including an Employee Action Committee, Women of Brightcove, Pride of Brightcove, Parents of Brightcove … and more!

We recognize that no candidate is perfect and Brightcove would love to have the chance to get to know you. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender, gender identity or expression, or veteran status. Brightcove embraces diversity and seeks candidates who support persons of all identities and backgrounds. We strongly encourage individuals from underrepresented and/or marginalized identities to apply. If you need any accommodations for your interview, please email recruiting@brightcove.com.

The Brightcove Privacy Policy explains the processing and purposes of any personal information.

#LI-Remote

BC21065

At Brightcove, we believe that providing comprehensive and competitive compensation and benefits packages across the globe is essential to our employees. Base salary is just one component of Brightcove’s total rewards program. We offer a wide range of benefits and perks that may include bonus or commission, Brightcove stock, unlimited paid time off, 401(K) matching, health insurance (medical, dental, and vision), generous employer Health Savings Account (HSA) contributions, tuition reimbursement, 100% paid parental leave and more.

USA Brightcove Base Salary Range
$182,400 - $273,600 USD

See more jobs at Brightcove

Apply for this job

14d

Senior Data Engineer

InvocaRemote
Bachelor's degreesqlsalesforceDesignswiftazureapic++postgresqlmysqlpythonAWS

Invoca is hiring a Remote Senior Data Engineer

About Invoca:

Invoca is the industry leader and innovator in AI and machine learning-powered Conversation Intelligence. With over 300 employees, 2,000+ customers, and $100M in revenue, there are tremendous opportunities to continue growing the business. We are building a world-class SaaS company and have raised over $184M from leading venture capitalists including Upfront Ventures, Accel, Silver Lake Waterman, H.I.G. Growth Partners, and Salesforce Ventures.

About the Engineering Team:

You’ll join a team where everyone, including you, is striving to constantly improve their knowledge of software development tools, practices, and processes. We are an incredibly supportive team. We swarm when problems arise and give excellent feedback to help each other grow. Working on our close-knit, multi-functional teams is a chance to share and grow your knowledge of different domains from databases to front ends to telephony and everything in between.

We are passionate about many things: continuous improvement, working at a brisk but sustainable pace, writing resilient code, maintaining production reliability, paying down technical debt, hiring fantastic teammates; and we love to share these passions with each other.

Learn more about the Invoca development team on our blog and check out our open source projects.

You Will:

Invoca offers a unique opportunity to make massive contributions to machine learning and data science as it applies to conversation intelligence, marketing, sales and user experience optimization.

You are excited about this opportunity because you get to:

  • Design and develop highly performant and scalable data storage solutions
  • Extend and enhance the architecture of Invoca’s data infrastructure and pipelines
  • Deploy and fine-tune machine learning models within an API-driven environment, ensuring scalability, efficiency, and optimal performance.
  • Expand and optimize our Extract, Transform, and Load (ETL) processes to include various structured and unstructured data sources within the Invoca Platform.
  • Evaluate and implement new technologies as needed and work with technical leadership to drive adoption.
  • Collaborate with data scientists, engineering teams, analysts, and other stakeholders to understand data requirements and deliver solutions on behalf of our customers
  • Support diversity, equity and inclusion at Invoca

At Invoca, our Senior Data Engineers benefit from mentorship provided by experts spanning our data science, engineering, and architecture teams. Our dedicated data science team is at the forefront of leveraging a blend of cutting-edge technology, including our proprietary and patented solutions, along with tools from leading vendors, to develop an exceptionally scalable data modeling platform.

Our overarching objective is to seamlessly deliver models through our robust API platform, catering to both internal stakeholders and external clients. Your pivotal role will focus on optimizing model accessibility and usability, thereby expediting model integration within our feature engineering teams. Ultimately, this streamlined process ensures swift model adoption, translating to enhanced value for our customers.

 

You Have:

We are excited about you because you have:

  • 3+ years of professional experience in Data Engineering or a related area of data science or software engineering
  • Advanced proficiency in Python, including expertise in data processing libraries (e.g., spaCy, Pandas), data visualization libraries (e.g., Matplotlib, Plotly), and familiarity with machine learning frameworks
  • Advanced proficiency using Python API frameworks (e.g., FastAPI, Ray/AnyScale, AWS SageMaker) to build, host, and optimize machine learning model inference APIs.
  • Intermediate proficiency working with the Databricks platform (e.g., Unity Catalog, Jobs/Compute, Delta Lake) or a similar platform for data engineering and analytics tasks
  • Intermediate proficiency working with machine learning and Large Language Model (LLM) tools from AWS (e.g., SageMaker, Bedrock) or other cloud vendors such as Azure or Google Cloud Platform
  • Intermediate proficiency with big data technologies and frameworks  (e.g., Spark, Hadoop)
  • Intermediate proficiency with SQL and relational databases (e.g., MySQL, PostgreSQL)
  • Basic proficiency in several areas apart from pure coding, such as monitoring, performance optimization, integration testing, security and more
  • Basic proficiency with Kafka (or similar stream-processing software) is a plus
  • Bachelor's Degree or equivalent experience preferred
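The SQL and relational-database proficiency listed above comes down to comfort with joins and aggregates. A small, self-contained illustration using Python's built-in sqlite3 module — the `calls`/`campaigns` tables are invented for the example, not part of Invoca's schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE campaigns (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE calls (id INTEGER PRIMARY KEY, campaign_id INTEGER,
                        duration_secs INTEGER);
    INSERT INTO campaigns VALUES (1, 'spring_promo'), (2, 'brand_search');
    INSERT INTO calls VALUES (1, 1, 120), (2, 1, 300), (3, 2, 45);
""")

# Average call duration per campaign -- a typical analytics rollup.
rows = conn.execute("""
    SELECT c.name, AVG(k.duration_secs)
    FROM calls k JOIN campaigns c ON c.id = k.campaign_id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
```

The same join-and-aggregate shape carries over directly to MySQL or PostgreSQL, which the posting names.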

Salary, Benefits & Perks:

Teammates begin receiving benefits on the first day of the month following or coinciding with one month of employment. Offerings include:

  • Paid Time Off - Invoca encourages a work-life balance for our employees. We have an outstanding PTO policy starting at 20 days off for all full-time employees. We also offer 16 paid holidays, 10 days of Compassionate Leave, days of volunteer time, and more.
  • Healthcare - Invoca offers a healthcare program that includes medical, dental, and vision coverage. There are multiple plan options to choose from. You can make the best choice for yourself, your partner, and your family.
  • Retirement - Invoca offers a 401(k) plan through Fidelity with a company match of up to 4%.
  • Stock options - All employees are invited to ownership in Invoca through stock options.
  • Employee Assistance Program - Invoca offers well-being support on issues ranging from personal matters to everyday-life topics through the WorkLifeMatters program.
  • Paid Family Leave - Invoca offers up to 6 weeks of 100% paid leave for baby bonding, adoption, and caring for family members.
  • Paid Medical Leave - Invoca offers up to 12 weeks of 100% paid leave for childbirth and medical needs.
  • Sabbatical - We thank our long-term team members with an additional week of PTO and a bonus after 7 years of service.
  • Wellness Subsidy - Invoca provides a wellness subsidy applicable to a gym membership, fitness classes, and more.
  • Position Base Range - $139,500 to $175,000, plus bonus potential
  • Please note, per Invoca's COVID-19 policy, depending on your vaccine verification status, you may be required to work only from home / remotely. At this time, travel and in-person meetings will require verification. This policy is regularly reviewed and subject to change at any time.

Recently, we’ve noticed a rise in phishing attempts targeting individuals who are applying to our job postings. These fraudulent emails, posing as official communications from Invoca aim to deceive individuals into sharing sensitive information. These attacks have attempted to use our name and logo, and have tried to impersonate individuals from our HR team by claiming to represent Invoca. 

We will never ask you to send financial information or other sensitive information via email. 

 

DEI Statement

We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal opportunity workplace.

#LI-Remote

See more jobs at Invoca

Apply for this job

17d

Data Engineer

Alliance Animal HealthClaremore, OK, Remote
tableausqlDesignAWS

Alliance Animal Health is hiring a Remote Data Engineer

Job Description

Does managing data hygiene make you smile? Does architecting and loading a wide set of data sources to a single source of truth warehouse make your heart sing? Do you love animals? Then this is the role for you!  

This is the perfect opportunity for someone to drive and lead our data ecosystem for the company. This individual will partner closely with stakeholders across the organization and with external vendors to develop the data environment that powers and creates a single source of truth for all departments.

Your duties would include identifying our internal and external data sources, collaborating with department heads to determine their data storage and organizational needs, and using the information to create and maintain data infrastructure. You will be developing, optimizing, and maintaining our data systems including implementing APIs for accurate data extraction, ensuring data security, monitoring the system, ensuring data cleanliness and hygiene, maintaining our data dictionary, and supervising system migrations.  

The ideal candidate will thrive in a fast-paced, entrepreneurial environment with a no-job-too-big or small attitude. They will have a mix of inquisitiveness, learning agility, emotional intelligence, and drive for success.  

Key Responsibilities 

  • Building data integration and ETL pipelines using Redshift/S3, following data lake best practices 

  • Outlining data flows, i.e., which parts of the organization generate data, which require data to function, how data flows are managed, and how data changes in transition 

  • Overseeing the integration of new technologies and initiatives into data standards and structures  

  • Translating business requirements into technical specifications, including data streams, integrations, transformations, databases, and data warehouses 

  • Evaluating the design, selection, and implementation of new databases or database changes and recommending optimal data solutions for the organization 

  • Creating systems to keep data secure and ensuring data and information security by integrating and upholding digital security systems  

  • Designing, developing, and modifying data infrastructure to accelerate the processes of data analysis and reporting  

  • Publishing and/or presenting reports, timeline updates, or recommendations 

  • Identifying areas for improvement in current systems 

  • Coordinating with other team members to reach project milestones and deadlines 

  • Auditing databases regularly to maintain quality 

  • Ingesting new practices into our portfolio, mapping and connecting them appropriately to our data environment 

  • Educating staff members through training and individual support 
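The responsibilities above trace the classic extract-transform-load shape: pull raw data in, clean and normalize it, and land it in a single source of truth. A hedged, stdlib-only sketch — the field names and sample rows are invented for illustration; in production the extract step would read from S3 and the load step would write to Redshift:

```python
import csv
import io

# Hypothetical raw export with inconsistent formatting, as often arrives
# from practice management systems.
RAW = """practice_id,city,revenue
101, Claremore ,1200.50
102,tulsa,980.00
101, Claremore ,300.25
"""

def extract(text):
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: enforce hygiene -- trim whitespace, normalize casing,
    and cast types so downstream consumers see clean, typed data."""
    return [
        {"practice_id": int(r["practice_id"]),
         "city": r["city"].strip().title(),
         "revenue": float(r["revenue"])}
        for r in rows
    ]

def load(rows):
    """Load: aggregate into a per-practice view, standing in for the
    warehouse's single-source-of-truth table."""
    totals = {}
    for r in rows:
        totals[r["practice_id"]] = totals.get(r["practice_id"], 0.0) + r["revenue"]
    return totals

warehouse = load(transform(extract(RAW)))
```

Keeping extract, transform, and load as separate steps is what makes auditing and data-quality monitoring (also listed above) tractable.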

Qualifications

Desired experience 

  • 2+ years of experience in data architecture and/or engineering 

  • Strong competence in/with database technologies: AWS Redshift, Snowflake, SQL Server, ETL (AWS), Amazon S3, SQL, Python. 

  • Data Visualization and Reporting experience: Tableau  

  • Preferred experience in working with 3rd parties who manage MDM layers 

  • Preferred prior experience in a startup environment or client-facing/consulting experience  

This role must be familiar with database design and systems, database technology, and logical data analysis. To succeed, you will also need additional skills and qualifications, including:  

  • Excellent communication skills to translate complex problems using non-technical terms  

  • In-depth understanding of modern database and information technologies  

  • Excellent time management skills and the ability to work towards meeting multiple deadlines simultaneously  

  • Hands-on aptitude with a willingness to troubleshoot and solve complex problems 

  • Intellectual curiosity to discover new approaches and insights 

  • Ability to collaborate cross-functionally and function effectively as an individual leader 

  • Strong attention to detail 

See more jobs at Alliance Animal Health

Apply for this job

20d

Principal Data Engineer

GeminiRemote (USA)
remote-firstnosqlairflowsqlDesigncsspythonjavascript

Gemini is hiring a Remote Principal Data Engineer

About the Company

Gemini is a global crypto and Web3 platform founded by Tyler Winklevoss and Cameron Winklevoss in 2014. Gemini offers a wide range of crypto products and services for individuals and institutions in over 70 countries.

Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we help you buy, sell, and store your bitcoin and cryptocurrency. 

At Gemini, our mission is to unlock the next era of financial, creative, and personal freedom.

In the United States, we have a flexible hybrid work policy for employees who live within 30 miles of our office headquartered in New York City and our office in Seattle. Employees within the New York and Seattle metropolitan areas are expected to work from the designated office twice a week, unless there is a job-specific requirement to be in the office every workday. Employees outside of these areas are considered part of our remote-first workforce. We believe our hybrid approach for those near our NYC and Seattle offices increases productivity through more in-person collaboration where possible.

The Department: Analytics

The Role: Principal Data Engineer

As a member of our data engineering team, you'll set standards for data engineering solutions that have organizational impact. You'll provide architectural solutions that are efficient, robust, and extensible, and that are competitive within the business and industry context. You'll collaborate with senior data engineers and analysts, guiding them toward their career goals at Gemini. Communicating your insights to leaders across the organization is paramount to success.

Responsibilities:

  • Focused on technical leadership, defining patterns and operational guidelines for their vertical(s)
  • Independently scopes, designs, and delivers solutions for large, complex challenges
  • Provides oversight, coaching and guidance through code and design reviews
  • Designs for scale and reliability with the future in mind. Can do critical R&D
  • Successfully plans and delivers complex, multi-team or system, long-term projects, including ones with external dependencies
  • Identifies problems that need to be solved and advocates for their prioritization
  • Owns one or more large, mission-critical systems at Gemini or multiple complex, team level projects, overseeing all aspects from design through implementation through operation
  • Collaborates with coworkers across the org to document and design how systems work and interact
  • Leads and coordinates large initiatives across domains, even outside their core expertise
  • Designs, architects and implements best-in-class Data Warehousing and reporting solutions
  • Builds real-time data and reporting solutions
  • Develops new systems and tools to enable the teams to consume and understand data more intuitively

Minimum Qualifications:

  • 10+ years experience in data engineering with data warehouse technologies
  • 10+ years experience in custom ETL design, implementation and maintenance
  • 10+ years experience with schema design and dimensional data modeling
  • Experience building real-time data solutions and processes
  • Advanced skills with Python and SQL are a must
  • Experience and expertise in Databricks, Spark, Hadoop etc.
  • Experience with one or more MPP databases (Redshift, BigQuery, Snowflake, etc.)
  • Experience with one or more ETL tools (Informatica, Pentaho, SSIS, Alooma, etc.)
  • Strong computer science fundamentals including data structures and algorithms
  • Strong software engineering skills in any server-side language, preferably Python
  • Experienced in working collaboratively across different teams and departments
  • Strong technical and business communication skills

Preferred Qualifications:

  • Kafka, HDFS, Hive, Cloud computing, machine learning, LLMs, NLP & Web development experience is a plus
  • NoSQL experience a plus
  • Deep knowledge of Apache Airflow
  • Expert experience implementing complex, enterprise-wide data transformation and processing solutions
  • Experience with Continuous integration and deployment
  • Knowledge and experience of financial markets, banking or exchanges
  • Web development skills with HTML, CSS, or JavaScript
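Apache Airflow, noted in the preferred qualifications, schedules pipeline tasks by topologically ordering a dependency DAG. A minimal sketch of that core idea using Python's built-in graphlib — the task names are illustrative, not from the posting:

```python
from graphlib import TopologicalSorter

# Airflow-style DAG: each task maps to the set of upstream tasks it
# depends on. A scheduler may only run a task once all of its upstream
# tasks have completed.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
    "data_quality_check": {"load"},
}

# static_order() yields a valid execution order; independent tasks
# (report, data_quality_check) could run in parallel in a real scheduler.
run_order = list(TopologicalSorter(deps).static_order())
```

Airflow layers retries, scheduling intervals, and distributed executors on top, but dependency resolution is this same topological sort.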

It Pays to Work Here

The compensation & benefits package for this role includes:
  • Competitive starting salary
  • A discretionary annual bonus
  • Long-term incentive in the form of a new hire equity grant
  • Comprehensive health plans
  • 401K with company matching
  • Paid Parental Leave
  • Flexible time off

Salary Range: The base salary range for this role is between $172,000 - $215,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate’s compensation, we consider a number of factors including skillset, experience, job scope, and current market data.

At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.

#LI-AH1

Apply for this job

25d

Senior Data Engineer

HandshakeRemote (USA)
sqlDesignc++docker

Handshake is hiring a Remote Senior Data Engineer

Everyone is welcome at Handshake. We know diverse teams build better products and we are committed to creating an inclusive culture built on a foundation of respect for all individuals. We strongly encourage candidates from non-traditional backgrounds, historically marginalized or underrepresented groups to apply.

Your impact

At Handshake, we are assembling a team of dynamic engineers who are passionate about creating high-quality, impactful products. As a Senior Data Engineer, you will play a key role in driving the architecture, implementation, and evolution of our cutting-edge data platform. Your technical expertise will be instrumental in helping millions of students discover meaningful careers, irrespective of their educational background, network, or financial resources.

Our primary focus is on building a robust data platform that empowers all teams to develop data-driven features while ensuring that every facet of the business has access to the right data for making informed decisions. While this individual will work closely with our ML teams, they will also support the business's data needs as a whole.

Your role

  • Technical leadership: Taking ownership of the data engineering function and providing technical guidance to the data engineering team. Mentoring junior data engineers, fostering a culture of learning, and promoting best practices in data engineering.
  • Collaborating with cross-functional teams: Working closely with product managers, product engineers, and other stakeholders to define data requirements, design data solutions, and deliver high-quality, data-driven features.
  • Data architecture and design: Designing and implementing scalable and robust data pipelines, data services, and data products that meet business needs and adhere to best practices. Staying abreast of emerging technologies and tools in the data engineering space, evaluating their potential impact on the data platform, and making strategic recommendations.
  • Performance optimization: Identifying performance bottlenecks in data processes and implementing solutions to enhance data processing efficiency.
  • Data quality and governance: Ensuring data integrity, reliability, and security through the implementation of data governance policies and data quality monitoring.
  • Advancing our Generative AI strategy: Leveraging your Data Engineering knowledge to design and implement data pipelines that support our Generative AI initiatives, advising and working in collaboration with our ML teams.

Your experience

  • Extensive data engineering experience: A proven track record in designing and implementing large-scale, complex data pipelines, data warehousing solutions, and data services. Deep knowledge of data engineering technologies, tools, and frameworks.
  • Cloud platform proficiency: Hands-on experience with cloud-based data technologies, preferably Google Cloud Platform (GCP), including BigQuery, Dataflow, Bigtable, and more
  • Advanced SQL skills: Strong expertise in SQL and experience with data modeling and database design conventions.
  • Problem-solving abilities: Exceptional problem-solving skills, with the ability to tackle complex data engineering challenges and propose innovative solutions.
  • Collaborative mindset: A collaborative and team-oriented approach to work, with the ability to communicate effectively with both technical and non-technical stakeholders.

Bonus areas of expertise

  • Machine learning for data enrichment: Experience in applying machine learning techniques to data engineering tasks for data enrichment and augmentation.
  • End to end data service deployment, comfortable with product alignment of data-driven initiatives
  • Containerization and orchestration: Familiarity with containerization technologies like Docker and container orchestration platforms like Kubernetes.
  • dbt: Experience with dbt as a data transformation tool for orchestrating and organizing data pipelines.

Compensation range

$173,000-$213,580

For cash compensation, we set standard ranges for all U.S.-based roles based on function, level, and geographic location, benchmarked against similar stage growth companies. In order to be compliant with local legislation, as well as to provide greater transparency to candidates, we share salary ranges on all job postings regardless of desired hiring location. Final offer amounts are determined by multiple factors, including geographic location as well as candidate experience and expertise, and may vary from the amounts listed above.

About us

Handshake is the #1 place to launch a career with no connections, experience, or luck required. The platform connects up-and-coming talent with 750,000+ employers - from Fortune 500 companies like Google, Nike, and Target to thousands of public school districts, healthcare systems, and nonprofits. In 2022 we announced our $200M Series F funding round. This Series F fundraise and valuation of $3.5B will fuel Handshake’s next phase of growth and propel our mission to help more people start, restart, and jumpstart their careers.

When it comes to our workforce strategy, we’ve thought deeply about how work-life should look here at Handshake. With our Hub-Based Remote Working strategy, employees can enjoy the flexibility of remote work, whilst ensuring collaboration and team experiences in a shared space remains possible. Handshake is headquartered in San Francisco with offices in Denver, New York, London, and Berlin and teammates working globally. 

Check out our careers site to find a hub near you!

What we offer

At Handshake, we'll give you the tools to feel healthy, happy and secure.

Benefits below apply to employees in full-time positions.

  • Equity and ownership in a fast-growing company.
  • 16 weeks of paid parental leave for birth-giving parents & 10 weeks of paid parental leave for non-birth-giving parents.
  • Comprehensive medical, dental, and vision policies including LGBTQ+ coverage. We also provide resources for mental health assistance, Employee Assistance Programs, and counseling support.
  • Handshake offers a $500/£360 home office stipend for you to spend during your first 3 months to create a productive and comfortable workspace at home.
  • Generous learning & development opportunities and an annual $2,000/£1,500/€1,850 stipend for you to grow your skills and career.
  • Financial coaching through Origin to help you through your financial journey.
  • Monthly internet stipend and a brand new MacBook to allow you to do your best work.
  • Monthly commuter stipend for you to expense your travel to the office (for office-based employees).
  • Free lunch provided twice a week across all offices.
  • Referral bonus to reward you when you bring great talent to Handshake.

(US-specific benefits, in addition to the first section)

  • 401(k) Match: Handshake offers a dollar-for-dollar match on 1% of deferred salary, up to a maximum of $1,200 per year.
  • All full-time US-based Handshakers are eligible for our flexible time off policy to get out and see the world. In addition, we offer 8 standardized holidays and 2 additional days of flexible holiday time off. Lastly, we have a Winter #ShakeBreak, a one-week period of Collective Time Off.
  • Lactation support: Handshake partners with Milk Stork to provide comprehensive, 100% employer-sponsored lactation support to traveling parents and guardians.

(UK-specific benefits, in addition to the first section) 

  • Pension Scheme: Handshake will provide you with a workplace pension, where you will make contributions based on 5% of your salary. Handshake will pay the equivalent of 3% towards your pension plan, subject to qualifying earnings limits.
  • Up to 25 days of vacation to encourage people to reset, recharge, and refresh, in addition to 8 bank holidays throughout the year.
  • Regular offsites each year to bring the team together + opportunity to travel to our HQ in San Francisco.
  • Discounts across various high street retailers, cinemas and other social activities exclusively for Handshake UK employees.

(Germany-specific benefits, in addition to the first section)

  • 25 days of annual leave + we have a Winter #ShakeBreak, a one-week period of Collective Time Off across the company.
  • Regular offsites each year to bring the team together + opportunity to travel to our HQ in San Francisco once a year.
  • Urban sports club membership offering access to a diverse network of fitness and wellness facilities.
  • Discounts across various high street retailers, cinemas and other social activities exclusively for Handshake Germany employees.

For roles based in Romania: Please ask your recruiter about region specific benefits.

Looking for more? Explore our mission, values and comprehensive US benefits at joinhandshake.com/careers.

Handshake is committed to providing reasonable accommodations in our recruitment processes for candidates with disabilities, sincerely held religious beliefs or other reasons protected by applicable laws. If you need assistance or reasonable accommodation, please reach out to us at people-hr@joinhandshake.com.

See more jobs at Handshake

Apply for this job

26d

Staff Data Engineer

Procore TechnologiesBangalore, India, Remote
scalaairflowsqlDesignUXjavakubernetespython

Procore Technologies is hiring a Remote Staff Data Engineer

Job Description

We’re looking for a Staff Data Engineer to join Procore’s Data Division. In this role, you’ll help build Procore’s next-generation construction data platform for others to build upon, including Procore developers, analysts, partners, and customers. 

As a Staff Data Engineer, you’ll partner with other engineers and product managers across Product & Technology to develop data platform capabilities that enable the movement, transformation, and retrieval of data for use in analytics, machine learning, and service integration. To be successful in this role, you’re passionate about distributed systems including storage, streaming, and batch data processing technologies on the cloud, with a strong bias for action and outcomes. If you’re a seasoned data engineer comfortable and excited about building our next-generation data platform and translating problems into pragmatic solutions that open up the boundaries of technical possibilities—we’d love to hear from you!

This is a full-time position reporting to our Senior Manager of Software Engineering, based in our India office, though employees can choose to work remotely. We are looking for someone to join our team immediately.

What you’ll do: 

  • Participate in the design and implementation of our next-generation data platform for the construction industry
  • Define and implement operational and dimensional data models and transformation pipelines to support reporting and analytics
  • Actively participate with our engineering team in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing and roll-out, and support
  • Understand our current data models and infrastructure, proactively identify areas for improvement, and prescribe architectural recommendations with a focus on performance and accessibility
  • Work alongside our Product, UX, and IT teams, leveraging your expertise in the data space to influence our product roadmap, developing innovative solutions that add additional value to our platform
  • Help uplevel teammates by conducting code reviews, providing mentorship, pairing, and training opportunities
  • Stay up to date with the latest data technology trends
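As a sketch of the dimensional-modeling work described above, a minimal star schema pairs a fact table of measurements with dimension tables of descriptive attributes; reporting queries then join and aggregate across them. All table and column names below are invented for illustration, not any company's actual schema; sqlite3 is used only as a convenient stand-in for a warehouse engine.

```python
import sqlite3

# Minimal star-schema sketch: one dimension table and one fact table.
# Names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    name         TEXT,
    region       TEXT
);
CREATE TABLE fact_orders (
    order_id     INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount       REAL
);
""")
conn.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                 [(1, "Acme", "EMEA"), (2, "Globex", "AMER")])
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)",
                 [(10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0)])

# A typical reporting rollup: revenue by region via a fact-to-dimension join.
rows = conn.execute("""
    SELECT c.region, SUM(f.amount) AS revenue
    FROM fact_orders f
    JOIN dim_customer c USING (customer_key)
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('AMER', 75.0), ('EMEA', 150.0)]
```

The same shape scales to real warehouses: facts stay narrow and numeric, dimensions carry the descriptive attributes that reports slice by.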

What we’re looking for: 

  • Bachelor’s Degree in Computer Science or a related field (or comparable work experience) is preferred
  • 8+ years of experience building and operating cloud-based, highly available, and scalable data platforms and pipelines supporting vast amounts of data for reporting and analytics
  • 2+ years of experience building data warehouses in Snowflake or Redshift
  • Hands-on experience with MPP query engines like Snowflake, Presto, Dremio, and Spark SQL
  • Expertise in relational and dimensional data modeling
  • Understanding of data access patterns, streaming technology, data validation, performance optimization, and cost optimization
  • Strength in commonly used data technologies and languages such as Python, Java or Scala, Kafka, Spark, Flink, Airflow, Kubernetes, or similar
  • Strong passion for learning, always open to new technologies and ideas

Qualifications

See more jobs at Procore Technologies

Apply for this job

26d

Principal Data Engineer

Procore Technologies – Bangalore, India, Remote
scala, nosql, airflow, Design, azure, UX, java, docker, postgresql, kubernetes, jenkins, python, AWS

Procore Technologies is hiring a Remote Principal Data Engineer

Job Description

We’re looking for a Principal Data Engineer to join Procore’s Data Division. In this role, you’ll help build Procore’s next-generation construction data platform for others to build upon, including Procore developers, analysts, partners, and customers.

As a Principal Data Engineer, you’ll use your expert-level technical skills to craft innovative solutions while influencing and mentoring other senior technical leaders. To succeed in this role, you’re passionate about distributed systems, including caching, streaming, and indexing technologies in the cloud, and you have a strong bias for action and outcomes. If you’re an inspirational leader comfortable translating vague problems into pragmatic solutions that push the boundaries of what’s technically possible, we’d love to hear from you!

This position reports to the Senior Manager, Reporting and Analytics. It can be based in our Bangalore or Pune office, or performed remotely from a location in India. We’re looking for someone to join us immediately.

What you’ll do: 

  • Design and build the next-generation data platform for the construction industry
  • Actively participate with our engineering team in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing and roll-out, and support
  • Contribute to setting standards and development principles across multiple teams and the larger organization
  • Stay connected with other architectural initiatives and craft a data platform architecture that supports and drives our overall platform
  • Provide technical leadership to efforts around building a robust and scalable data pipeline to support billions of events
  • Help identify and propose solutions for technical and organizational gaps in our data pipeline by running proof of concepts and experiments working with Data Platform Engineers on implementation
  • Work alongside our Product, UX, and IT teams, leveraging your experience and expertise in the data space to influence our product roadmap, developing innovative solutions that add additional capabilities to our tools

What we’re looking for: 

  • Bachelor’s degree in Computer Science, a similar technical field of study, or equivalent practical experience is required; MS or Ph.D. degree in Computer Science or a related field is preferred
  • 10+ years of experience building and operating cloud-based, highly available, and scalable online serving or streaming systems utilizing large, diverse data sets in production
  • Expertise with diverse data technologies such as Databricks, PostgreSQL, GraphDB, NoSQL databases (MongoDB, Cassandra), Elasticsearch, Snowflake, etc.
  • Strength in the majority of commonly used data technologies and languages such as Python, Java or Scala, Kafka, Spark, Airflow, Kubernetes, Docker, Argo, Jenkins, or similar
  • Expertise with all aspects of data systems, including ETL, aggregation strategy, performance optimization, and technology trade-off
  • Understanding of data access patterns, streaming technology, data validation, data modeling, performance, and cost optimization
  • Experience defining data engineering/architecture best practices at a department and organizational level and establishing standards for operational excellence and code and data quality at a multi-project level
  • Strong passion for learning, always open to new technologies and ideas
  • AWS and Azure experience is preferred

Qualifications

See more jobs at Procore Technologies

Apply for this job

29d

Data Engineer PySpark AWS

2 years of experience, agile, Bachelor's degree, jira, terraform, scala, airflow, postgres, sql, oracle, Design, mongodb, java, mysql, jenkins, python, AWS

FuseMachines is hiring a Remote Data Engineer PySpark AWS

See more jobs at FuseMachines

Apply for this job

29d

Senior Data Engineer

Devoteam – Tunis, Tunisia, Remote
airflow, sql, scrum

Devoteam is hiring a Remote Senior Data Engineer

Job Description

Within the “Data Platform” department, the consultant will join a SCRUM team and focus on a specific functional scope.

Your role will be to contribute to data projects by bringing your expertise to the following tasks:

  • Design, develop, and maintain robust, scalable data pipelines on GCP, using tools such as BigQuery, Airflow, Looker, and DBT.
  • Collaborate with business teams to understand data requirements and design appropriate solutions.
  • Optimize the performance of SQL queries and ETL processes to ensure fast response times and scalability.
  • Implement data quality processes to guarantee data integrity and consistency.
  • Work closely with engineering teams to integrate data pipelines into existing applications and services.
  • Stay up to date with new technologies and best practices in data processing and analytics.

Qualifications

  • Master’s-level engineering degree (Bac+5) or equivalent university degree with a specialization in computer science.
  • At least 3 years of experience in data engineering, with significant experience in a GCP cloud environment.
  • GCP (Google Cloud Platform) certification is a plus.
  • Excellent written and verbal communication (high-quality deliverables and reporting).

See more jobs at Devoteam

Apply for this job

29d

Senior Data Engineer

Braze – Remote, Ontario
Bachelor's degree, sql, Design

Braze is hiring a Remote Senior Data Engineer

At Braze, we have found our people. We’re a genuinely approachable, exceptionally kind, and intensely passionate crew.

We seek to ignite that passion by setting high standards, championing teamwork, and creating work-life harmony as we collectively navigate rapid growth on a global scale while striving for greater equity and opportunity – inside and outside our organization.

To flourish here, you must be prepared to set a high bar for yourself and those around you. There is always a way to contribute: Acting with autonomy, having accountability and being open to new perspectives are essential to our continued success. Our deep curiosity to learn and our eagerness to share diverse passions with others gives us balance and injects a one-of-a-kind vibrancy into our culture.

If you are driven to solve exhilarating challenges and have a bias toward action in the face of change, you will be empowered to make a real impact here, with a sharp and passionate team at your back. If Braze sounds like a place where you can thrive, we can’t wait to meet you.

WHAT YOU’LL DO

Team Overview:

Join our dynamic team dedicated to revolutionizing data analytics for impactful decision-making. The team collaboratively shapes data strategies, optimizing analytics practices to drive business growth.

Responsibilities:

  • Lead the design, implementation, and monitoring of large-scale data warehouses.
  • Excel in SQL with proficiency in window functions, STRUCT/ARRAY manipulation, and query optimization.
  • Dive deep into product understanding, team roadmaps, technical architecture, and data flow.
  • Mentor data-savvy stakeholders on data best practices.
  • Expertly design data models (Snowflake, Star, Data Vault 2.0) for clean data structures.
  • Track downstream usage and feedback for continuous improvement.
  • Adhere to and promote performance best practices based on specific database engine requirements.
  • Embrace a passion for data cataloging, metadata management, and adherence to data governance principles.
  • Design systems with a test-driven approach for trapping bad-quality data and highlighting alerts.
  • Utilize tools like dbt for building efficient data transformation pipelines.
  • Partner effectively with engineering, data analysts, data scientists, and business stakeholders.
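The window-function proficiency called out in the responsibilities above can be sketched with a small example. The table and data are invented for illustration, and the snippet assumes an SQLite build with window-function support (3.25+, bundled with any recent Python); the same SQL pattern carries over to Snowflake and other warehouses.

```python
import sqlite3

# Illustrative only: a per-user running total computed with a window function.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, ts INTEGER, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("a", 1, 10.0), ("a", 2, 5.0), ("b", 1, 7.0), ("a", 3, 2.0),
])

# SUM() OVER a per-user window ordered by timestamp yields a cumulative total
# while keeping one output row per input row, unlike GROUP BY.
rows = conn.execute("""
    SELECT user_id, ts,
           SUM(amount) OVER (PARTITION BY user_id ORDER BY ts) AS running_total
    FROM events
    ORDER BY user_id, ts
""").fetchall()
print(rows)
```

The key distinction a reviewer would probe for: a window function annotates each row with an aggregate over its partition, whereas GROUP BY collapses the partition to a single row.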

WHO YOU ARE

The ideal candidate for this role possesses:

  • 4+ years of hands-on experience in Snowflake and other cloud data warehouses.
  • Proven expertise in SQL, data modeling, and data governance principles using dbt.
  • A track record of leading impactful data projects.
  • Effective collaboration skills with cross-functional teams.
  • In-depth understanding of technical architecture and data flow.
  • Ability to mentor and guide stakeholders on data best practices.
  • Proficiency in schema design (Snowflake, Star, Data Vault 2.0) and large-scale data warehouse implementation.
  • Passion for clean data structures and continuous improvement.
  • Strong analytical and problem-solving skills.
  • Enthusiasm for SQL frameworks like dbt.
  • Expertise in owning and managing dbt projects.
  • Dedication to applying data governance principles at scale.

#LI-REMOTE

WHAT WE OFFER

Details of these benefit plans will be provided if a candidate receives an offer of employment. Benefits may vary by location.

From offering comprehensive benefits to fostering flexible environments, we’ve got you covered so you can prioritize work-life harmony.

  • Competitive compensation that may include equity
  • Retirement and Employee Stock Purchase Plans
  • Flexible paid time off
  • Comprehensive benefit plans covering medical, dental, vision, life, and disability
  • Family services that include fertility benefits and equal paid parental leave
  • Professional development supported by formal career pathing, learning platforms, and tuition reimbursement
  • Community engagement opportunities throughout the year, including an annual company wide Volunteer Week
  • Employee Resource Groups that provide supportive communities within Braze
  • Collaborative, transparent, and fun culture recognized as a Great Place to Work®

ABOUT BRAZE

Braze is a leading customer engagement platform that powers lasting connections between consumers and brands they love. Braze allows any marketer to collect and take action on any amount of data from any source, so they can creatively engage with customers in real time, across channels from one platform. From cross-channel messaging and journey orchestration to AI-powered experimentation and optimization, Braze enables companies to build and maintain absolutely engaging relationships with their customers that foster growth and loyalty.

Braze is proudly certified as a Great Place to Work® in the U.S., the UK and Singapore. We ranked #3 on Great Place to Work UK’s 2024 Best Workplaces (Large), #3 on Great Place to Work UK’s 2023 Best Workplaces for Wellbeing (Medium), #4 on Great Place to Work’s 2023 Best Workplaces in Europe (Medium), #10 on Great Place to Work UK’s 2023 Best Workplaces for Women (Large), #19 on Fortune’s 2023 Best Workplaces in New York (Large). We were also featured in Built In's 2024 Best Places to Work, U.S. News Best Technology Companies to Work For, and Great Place to Work UK’s 2023 Best Workplaces in Tech.

You’ll find many of us at headquarters in New York City or around the world in Austin, Berlin, Chicago, Jakarta, London, Paris, San Francisco, Singapore, Sydney and Tokyo – not to mention our employees in nearly 50 remote locations.

BRAZE IS AN EQUAL OPPORTUNITY EMPLOYER

At Braze, we strive to create equitable growth and opportunities inside and outside the organization.

Building meaningful connections is at the heart of everything we do, and that includes our recruiting practices. We're committed to offering all candidates a fair, accessible, and inclusive experience – regardless of age, color, disability, gender identity, marital status, national origin, race, religion, sex, sexual orientation, or status as a protected veteran. When applying and interviewing with Braze, we want you to feel comfortable showcasing what makes you you.

We know that sometimes different circumstances can lead talented people to hesitate to apply for a role unless they meet 100% of the criteria. If this sounds familiar, we encourage you to apply, as we’d love to meet you.

Please see our Candidate Privacy Policy for more information on how Braze processes your personal information during the recruitment process and, if applicable based on your location, how you can exercise any privacy rights.

See more jobs at Braze

Apply for this job

+30d

Data Engineer

phData – LATAM, Remote
scala, sql, azure, java, python, AWS

phData is hiring a Remote Data Engineer


See more jobs at phData

Apply for this job