Data Engineer Remote Jobs

107 Results

18d

Senior Civil Engineer - Data Center

Olsson - Greenville, SC, Remote
Bachelor's degree, Design

Olsson is hiring a Remote Senior Civil Engineer - Data Center

Job Description

As a Senior Civil Engineer on our Data Center Civil Team, you will be a part of the firm’s largest and most complex projects. You will serve as a project manager on some projects and as lead design engineer on others. You will prepare planning and design documents, process design calculations, and develop and maintain team and client standards. You may lead quality assurance/quality control and act as an advisor on complex projects. You will also coordinate with other Olsson teams, professional staff, technical staff, clients, and other consultants.

You may travel to job sites for observation and attend client meetings.

*This role offers flexible work options, including remote and hybrid opportunities, to accommodate diverse working preferences and promote work-life balance. Candidates can work hybrid schedules, work remotely, or work out of any Olsson office location in these regions/areas.

Qualifications

You are passionate about:

  • Working collaboratively with others
  • Having ownership in the work you do
  • Using your talents to positively affect communities
  • Solving problems
  • Providing excellence in client service

You bring to the team:

  • Strong communication skills
  • Ability to contribute and work well on a team
  • Bachelor's Degree in civil engineering
  • At least 8 years of related civil engineering experience
  • Proficient in Civil 3D software
  • Must be a registered professional engineer

See more jobs at Olsson

Apply for this job

18d

Data Engineer (H/F)

CITECH - Paris, France, Remote
sql, ansible, git, python, PHP

CITECH is hiring a Remote Data Engineer (H/F)

Job Description

Your main responsibilities will be:

  • Supporting the application (particularly during monthly closings).
  • Contributing to ongoing maintenance work.
  • Helping design and implement new features.
  • Contributing to the technical overhaul.
  • Helping migrate from Talend to Spark/Scala.

Qualifications

You have a higher-education background in computer science and at least 5 years of experience in a similar role.

The expected skills are as follows:

  • You are proficient in Spark, Talend (Data Integration, Big Data), and Scala.
  • You have development skills (Unix shell, Perl, PHP, Python, git, GitHub).
  • You are comfortable with the following technical environment: Hadoop (Big Data), Hive, Microsoft Power BI, Microsoft SQL Server Analysis Services (OLAP), Integration Services, Reporting Services, scripting (GitHub, Ansible, AWX, shell, VBA), and SQL Server databases.

See more jobs at CITECH

Apply for this job

19d

Senior Data Engineer

QAD, Inc. - Barcelona, Spain, Remote
Bachelor's degree, 10 years of experience, terraform, mariadb, sql, Design, ansible, mongodb, postgresql, mysql, python, AWS

QAD, Inc. is hiring a Remote Senior Data Engineer

Job Description

We are seeking a highly skilled and experienced Senior Database Administrator (DBA) with expertise in managing and optimizing MariaDB databases in Amazon RDS environments, along with a strong background in database migration and conversion between different database technologies. The ideal candidate should possess deep expertise in database administration, performance tuning, troubleshooting, automation, and the ability to seamlessly migrate data between various database platforms.

What you’ll do:

  • Manage and administer MariaDB databases deployed on Amazon RDS, ensuring high availability, security, and performance.
  • Monitor and proactively respond to database alerts, incidents, and performance issues on Amazon RDS to minimize downtime and optimize performance.
  • Collaborate with cross-functional teams to design and implement database solutions that meet application requirements and scalability needs on Amazon RDS.
  • Conduct regular database capacity planning on Amazon RDS and provide recommendations for resource optimization.
  • Develop and implement backup, recovery, and disaster recovery strategies on Amazon RDS to ensure data integrity and availability.
  • Perform data migration and conversion between different database technologies, ensuring data accuracy, consistency, and minimal disruption to services.
  • Work on database schema design and optimization, including indexing, partitioning, and data modeling, on Amazon RDS.
  • Collaborate with developers to optimize query performance, troubleshoot slow-running queries, and suggest query optimization techniques on Amazon RDS.
  • Implement and maintain database security practices on Amazon RDS, including user access control, role management, and data encryption.
  • Maintain documentation for database configurations, procedures, migration strategies, and troubleshooting guides on Amazon RDS.
  • Automate routine database tasks using scripting and configuration management tools for Amazon RDS.
  • Stay up-to-date with the latest developments in MariaDB, Amazon RDS, and other relevant database technologies, and apply new knowledge to enhance database operations and migrations.
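The indexing and query-optimization responsibilities above can be shown in miniature. This is an illustrative sketch, not QAD's stack: it uses Python's built-in sqlite3 in place of MariaDB on Amazon RDS, but the technique of comparing query plans before and after adding an index carries over.

```python
import sqlite3

# Illustrative only: Python's built-in sqlite3 stands in for MariaDB on RDS.
# The technique -- compare the query plan before and after adding an index --
# is the same idea at any scale.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql):
    # The last column of EXPLAIN QUERY PLAN output is the human-readable detail.
    return [row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # full table scan, e.g. ['SCAN orders']
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # e.g. ['SEARCH orders USING INDEX idx_orders_customer (customer_id=?)']
print(before, after)
```

On MariaDB the equivalent check is `EXPLAIN SELECT ...` before and after `CREATE INDEX`.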

Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or a related field; relevant certifications are a plus.
  • Minimum of 10 years of experience as a Database Administrator, with a strong focus on MariaDB in production environments.
  • Strong expertise in managing MariaDB databases on Amazon RDS, including deployment, configuration, and optimization.
  • Proficiency in SQL query optimization, performance tuning, and troubleshooting on Amazon RDS.
  • Solid understanding of database security principles and best practices on Amazon RDS.
  • Experience with database backup, recovery, and replication strategies on Amazon RDS.
  • Proven experience in successful database migration and conversion projects between different database technologies.
  • Familiarity with automation and configuration management tools (e.g., Ansible, Terraform, CloudFormation) for Amazon RDS.
  • Strong scripting skills (e.g., Bash, Python) for automating database tasks on Amazon RDS.
  • Excellent problem-solving skills and the ability to diagnose and resolve complex database issues on Amazon RDS.
  • Knowledge of cloud computing concepts and experience with AWS services.
  • Strong communication skills and the ability to collaborate effectively with cross-functional teams.
  • Demonstrated ability to work independently, prioritize tasks, and manage time efficiently.
  • Experience with SkySQL is also a plus.
  • Experience with other database systems (e.g., MySQL, PostgreSQL, Progress Db, MongoDb, Cassandra) is a plus.
  • Knowledge of DevOps practices and CI/CD pipelines is advantageous.
  • Strong written and verbal English language skills.

See more jobs at QAD, Inc.

Apply for this job

20d

Data Engineer

PrismHR - Remote
golang, Master’s Degree, scala, ruby, c++

PrismHR is hiring a Remote Data Engineer


See more jobs at PrismHR

Apply for this job

20d

Sr Big Data Engineer

Ingenia Agency - Mexico, Remote
Bachelor's degree, sql, oracle, python

Ingenia Agency is hiring a Remote Sr Big Data Engineer

At Ingenia Agency we’re looking for a Data Engineer to join our team.

Responsible for creating and sustaining pipelines that allow for the analysis of data.

What will you be doing?

  • Conceptualizing and generating infrastructure that allows data to be accessed and analyzed in a global setting.
  • Loading raw data from our SQL Servers, manipulating it, and saving it into Google Cloud databases.
  • Detecting and correcting errors in data and writing scripts to clean it up.
  • Working with scientists and clients across the business to gather requirements and ensure an easy flow of data.

What are we looking for?

  • Age indifferent.
  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field.
  • Master's degree in a relevant field is advantageous.
  • Proven experience as a Data Engineer.
  • Expert proficiency in Python, ETL and SQL.
  • Familiarity with Google Cloud/ AWS/Azure or suitable equivalent.
  • Excellent analytical and problem-solving skills.
  • A knack for independent and group work.
  • Knowledge of Oracle and MDM Hub.
  • Capacity to successfully manage a pipeline of duties with minimal supervision.
  • Advanced English.
  • Be Extraordinary!

What are we offering?

  • Competitive salary
  • Contract for specific period of time
  • Law benefits:
    • 10 days of vacation upon completing your first year
    • IMSS
  • Additional benefits:
    • Contigo Membership (Insurance of minor medical expenses)
      • Personal accident policy.
      • Funeral assistance.
      • Dental and visual health assistance.
      • Emotional wellness.
      • Benefits & discounts.
      • Network of medical services and providers with a discount.
      • Medical network with preferential prices.
      • Roadside assistance with preferential price, among others.
    • 3 special half-day permits per year for personal errands and appointments
    • Half day off for birthdays
    • 5 additional vacation days in case of marriage
    • 50% scholarship for language courses at the Anglo
    • Partial scholarship for graduate or master's studies at the Tec. de Mty.
    • Agreement with a ticketing company for preferential rates on entertainment events.



See more jobs at Ingenia Agency

Apply for this job

20d

Sr Data Engineer GCP

Ingenia Agency - Mexico, Remote
Bachelor's degree, 5 years of experience, 3 years of experience, airflow, sql, api, python

Ingenia Agency is hiring a Remote Sr Data Engineer GCP


At Ingenia Agency we’re looking for a Sr Data Engineer to join our team.

Responsible for creating and sustaining pipelines that allow for the analysis of data.

What will you be doing?

  • Sound understanding of Google Cloud Platform.
  • Should have worked on Big Query, Workflow or Composer.
  • Should know how to reduce BigQuery costs by reducing the amount of data processed by the queries.
  • Should be able to speed up queries by using denormalized data structures, with or without nested repeated fields.
  • Exploring and preparing data using BigQuery.
  • Experience delivering artifacts: Python scripts, Dataflow components, SQL, Airflow, and Bash/Unix scripting.
  • Building and productionizing data pipelines using dataflow.
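The cost-reduction point above follows from BigQuery's on-demand billing model, which charges per byte scanned; partition pruning, clustering, and selecting fewer columns all shrink the scanned volume. A rough back-of-the-envelope sketch (the per-TiB rate is an illustrative assumption; check current GCP pricing):

```python
# Back-of-the-envelope sketch of BigQuery on-demand billing: cost scales with
# bytes scanned, so pruning partitions/columns directly cuts the bill.
# PRICE_PER_TIB_USD is an illustrative assumption -- check current GCP pricing.
TIB = 1024 ** 4
PRICE_PER_TIB_USD = 6.25

def query_cost_usd(bytes_processed: int) -> float:
    """Estimated on-demand cost of a query that scans `bytes_processed` bytes."""
    return bytes_processed / TIB * PRICE_PER_TIB_USD

full_scan = query_cost_usd(5 * TIB)       # e.g. SELECT * over a 5 TiB table
pruned = query_cost_usd(200 * 1024**3)    # same question against a 200 GiB partition
print(f"full scan ~${full_scan:.2f}, pruned ~${pruned:.2f}")
```

In practice a dry run (for example, setting the dry-run option in the google-cloud-bigquery client's query job configuration) reports the bytes a query would scan without executing it.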

What are we looking for?

  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field.
  • Age indifferent.
  • 3 to 5 years of experience in GCP is required.
  • Must have Excellent GCP, Big Query and SQL skills.
  • At least 3 years of experience with BigQuery and Dataflow, plus experience with Python and Google Cloud SDK API scripting to create reusable frameworks.
  • Strong hands-on experience in PowerCenter.
  • In depth understanding of architecture, table partitioning, clustering, type of tables, best practices.
  • Proven experience as a Data Engineer, Software Developer, or similar.
  • Expert proficiency in Python, R, and SQL.
  • Candidates with Google Cloud certification will be preferred
  • Excellent analytical and problem-solving skills.
  • A knack for independent and group work.
  • Capacity to successfully manage a pipeline of duties with minimal supervision.
  • Advanced English.
  • Be Extraordinary!

What are we offering?

  • Competitive salary
  • Law benefits:
    • 10 days of vacations to the first year fulfilled
    • IMSS
  • Additional benefits:
    • Contigo Membership (Insurance of minor medical expenses)
      • Personal accident policy.
      • Funeral assistance.
      • Dental and visual health assistance.
      • Emotional wellness.
      • Benefits & discounts.
      • Network of medical services and providers with a discount.
      • Medical network with preferential prices.
      • Roadside assistance with preferential price, among others.
    • 3 special permits a year, to go out to any type of procedure that you have to do half day equivalent
    • Half day off for birthdays
    • 5 days of additional vacations in case of marriage
    • 50% scholarship in language courses in the Anglo
    • Percentage scholarship in the study of graduates or masters with the Tec. de Mty.
    • Agreement with ticket company for preferential rates for events of entertainment.

See more jobs at Ingenia Agency

Apply for this job

20d

Data Engineer

In All Media Inc - Argentina, Remote
agile, sql, AWS

In All Media Inc is hiring a Remote Data Engineer

Data Engineer

In All Media

InallMedia.com is a global community in charge of allocating and administering complete teams according to our clients’ needs, always using an agile methodology.

At this moment, we are looking for a Data Engineer. This position is 100% remote and payable in USD.


Role Description

Our client is one of the biggest job boards in the world, with a presence in 62 countries.

We are looking for a Data Engineer with solid experience in Amazon Web Services.

Must have requirements

  • EMR (Elastic MapReduce) AWS compute infrastructure
  • AWS Cloud Tools
  • Data lake
  • ETL Pipelines building experience
  • Spark
  • SQL
  • Experience with data points and visualizations that can be shared externally (Wide range of audiences)
  • Need to be able to interact cross-culturally and work with sensitive information
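EMR's name, Elastic MapReduce, points at the underlying pattern behind several of the requirements above. A toy, single-process sketch of that pattern in plain Python (real EMR/Spark jobs distribute the map and reduce steps across a cluster):

```python
from collections import Counter
from functools import reduce

# Toy, single-process illustration of the map-reduce pattern EMR runs at
# cluster scale: map each record to partial results, then merge them in a
# reduce step. Here: word counts.
records = ["spark sql", "spark emr", "sql"]

mapped = [Counter(line.split()) for line in records]    # map step
totals = reduce(lambda a, b: a + b, mapped, Counter())  # reduce step
print(totals)  # Counter({'spark': 2, 'sql': 2, 'emr': 1})
```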

Nice to have requirements

  • Experience with analysis of Time and Geography information

Benefits

  • USD payment
  • 100% remote
  • Great community
  • Full-time, long-term
  • Growth opportunities

See more jobs at In All Media Inc

Apply for this job

21d

Staff Data Engineer

Life36 - Remote, Canada
agile, remote-first, terraform, sql, Design, mobile, c++, python, AWS

Life36 is hiring a Remote Staff Data Engineer

About Life360

Life360’s mission is to keep people close to the ones they love. Our category-leading mobile app and Tile tracking devices empower members to protect the people, pets, and things they care about most with a range of services, including location sharing, safe driver reports, and crash detection with emergency dispatch. Life360 serves approximately 66 million monthly active users (MAU) across more than 150 countries.

Life360 delivers peace of mind and enhances everyday family life with seamless coordination for all the moments that matter, big and small. By continuing to innovate and deliver for our customers, we have become a household name and the must-have mobile-based membership for families (and those friends that basically are family). 

Life360 has more than 500 (and growing!) remote-first employees. For more information, please visit life360.com.

Life360 is a Remote First company, which means a remote work environment will be the primary experience for all employees. All positions, unless otherwise specified, can be performed remotely (within Canada) regardless of any specified location above. 

About the Job

At Life360, we collect a lot of data: 60 billion unique location points, 12 billion user actions, 8 billion miles driven every single month, and so much more. As a Staff Data Engineer, you will contribute to enhancing and maintaining our data processing and storage pipelines/workflows for a robust and secure finance data lake. You should have a strong engineering background and even more importantly a desire to take ownership of our data systems to make them world class.

The Canada-based salary range for this position is $190,000 to $240,000 CAD. We take into consideration an individual's background and experience in determining final salary; therefore, base pay offered may vary considerably depending on geographic location, job-related knowledge, skills, and experience. The compensation package includes a wide range of medical, dental, vision, financial, and other benefits, as well as equity.

What You’ll Do

Primary responsibilities include, but are not limited to:

  • Design, implement, and manage scalable data processing platforms used for real-time analytics and exploratory data analysis.
  • Manage our financial data from ingestion through ETL to storage and batch processing.
  • Automate, test and harden all data workflows.
  • Architect logical and physical data models to ensure the needs of the business are met.
  • Collaborate with finance and analytics teams, while applying best practices
  • Architect and develop systems and algorithms for distributed real-time analytics and data processing
  • Implement strategies for acquiring data to develop new insights
  • Mentor junior engineers, imparting best practices and institutionalizing efficient processes to foster growth and innovation within the team

 

What We’re Looking For

  • At least 5 years of experience working with high-volume data infrastructure.
  • Experience with Databricks, AWS, dbt, ETL and Job orchestration tooling.
  • Extensive experience programming in one of the following languages: Python / Java.
  • Experience in data modeling, optimizing SQL queries, and system performance tuning.
  • You are proficient with SQL, AWS, Databases, Apache Spark, Spark Streaming, EMR, and Kinesis/Kafka
  • Experience working with freemium models, tiered subscriptions, Subscriptions Billing Systems (Apple, Google, Recurly, Chargebee), integrations with systems like NetSuite Financials, and tax district and currency conversion providers.
  • Experience in modern development lifecycle including Agile methodology, CI/CD, automated deployments using Terraform, GitHub Actions etc.
  • Knowledge and proficiency in the latest open source and data frameworks, modern data platform tech stacks and tools.
  • Always be learning and staying up to speed with the fast moving data world.
  • You have good communication skills and can work independently
  • BS in Computer Science, Software Engineering, Mathematics, or equivalent experience

 

Our Benefits

  • Competitive pay and benefits
  • Medical, dental, vision, life and disability insurance plans 
  • RRSP plan with DPSP company matching program
  • Employee Assistance Program (EAP) for mental well being
  • Flexible PTO, several company wide days off throughout the year
  • Winter and Summer Week-long Synchronized Company Shutdowns
  • Learning & Development programs
  • Equipment, tools, and reimbursement support for a productive remote environment
  • Free Life360 Platinum Membership for your preferred circle
  • Free Tile Products

Life360 Values

Our company’s mission driven culture is guided by our shared values to create a trusted work environment where you can bring your authentic self to work and make a positive difference 

  • Be a Good Person - We have a team of high integrity people you can trust. 
  • Be Direct With Respect - We communicate directly, even when it’s hard.
  • Members Before Metrics - We focus on building an exceptional experience for families. 
  • High Intensity High Impact - We do whatever it takes to get the job done. 

Our Commitment to Diversity

We believe that different ideas, perspectives and backgrounds create a stronger and more creative work environment that delivers better results. Together, we continue to build an inclusive culture that encourages, supports, and celebrates the diverse voices of our employees. It fuels our innovation and connects us closer to our customers and the communities we serve. We strive to create a workplace that reflects the communities we serve and where everyone feels empowered to bring their authentic best selves to work.

We are an equal opportunity employer and value diversity at Life360. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, disability status or any legally protected status.  

We encourage people of all backgrounds to apply. We believe that a diversity of perspectives and experiences create a foundation for the best ideas. Come join us in building something meaningful.Even if you don’t meet 100% of the below qualifications, you should still seriously consider applying!

 

#LI-Remote

____________________________________________________________________________

See more jobs at Life36

Apply for this job

21d

Data Engineer

agile, jira, scala, sql, Design, python, AWS

FuseMachines is hiring a Remote Data Engineer


See more jobs at FuseMachines

Apply for this job

25d

Principal Data Engineer

slice - Belfast or Remote, UK
scala, nosql, sql, Design, java, python

slice is hiring a Remote Principal Data Engineer

UK Remote or Belfast

Serial tech entrepreneur Ilir Sela started Slice in 2010 with the belief that local pizzerias deserve all of the advantages of major franchises without compromising their independence. Starting with his family’s pizzerias, we now empower over 18,000 restaurants (that’s nearly triple Domino’s U.S. network!) with the technology, services, and collective power that owners need to better serve their digitally minded customers and build lasting businesses. We’re growing and adding more talent to help fulfil this valuable mission. That’s where you come in.

 

The Challenge to Solve

Provide Slice with up-to-date data to grow the business, and empower independent pizzeria owners to make the best data-driven decisions through insights that ensure future success.

 

The Role

You will be responsible for leading data modelling and dataset development across the team. You’ll be at the forefront of our data strategy, partnering closely with business and product teams to fuel data-driven decisions throughout the company. Your leadership will guide our data architecture expansion, ensuring smooth data delivery and maintaining top-notch data quality. Drawing on your expertise, you’ll steer our tech choices and keep us at the cutting edge of the field. You’ll get to code daily and share your insights into best practices with the rest of the team.

 

The Team

You’ll work with a team of skilled data engineers daily, providing your expertise to their reviews, as well as working on your own exciting projects with teams across the business. You’ll have a high degree of autonomy and the chance to impact many areas of the business. You will optimise data flow and collection for cross-functional teams and support software developers, business intelligence, and data scientists on data initiatives, using this work to support product launches and Marketing efforts to grow the business. This role reports to the Director of Data Engineering.

 

The Winning Recipe

We’re looking for creative, entrepreneurial engineers who are excited to build world-class products for small business counters. These are the core competencies this role calls for:

  • Strong track record of designing and implementing modern cloud data processing architectures using programming languages such as Java, Scala, or Python and technologies like Spark
  • Expert-level SQL skills
  • Extensive experience in data modelling and design, building out the right structures to deliver for various business and product domains
  • Strong analytical abilities and a history of using data to identify opportunities for improvement and where data can help drive the business towards its goals
  • Experience with message queuing, stream processing using frameworks such as Flink or KStreams and highly scalable big data data stores, as well as storage and query pattern design with NoSQL stores
  • Proven leadership skills, with a track record of successfully leading complex engineering projects and mentoring junior engineers, as well as working with cross-functional teams and external stakeholders in a dynamic environment
  • Familiarity with serverless technologies and the ability to design and implement scalable and cost-effective data processing architectures
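The stream-processing competency above (Flink, KStreams) boils down to aggregating unbounded event streams over windows. A minimal single-process sketch of a tumbling window in plain Python, stripped of everything the real frameworks add (state backends, watermarks, fault tolerance):

```python
from collections import defaultdict

# Minimal sketch of windowed stream aggregation -- the core idea behind
# frameworks like Flink or Kafka Streams, minus state, watermarks, and
# fault tolerance. Events are bucketed into 60-second tumbling windows.
events = [(3, "order"), (42, "order"), (61, "order"), (125, "refund")]  # (epoch_seconds, kind)

counts_per_window = defaultdict(int)
for ts, _kind in events:
    window_start = ts - ts % 60  # floor the timestamp to its window boundary
    counts_per_window[window_start] += 1

print(dict(counts_per_window))  # {0: 2, 60: 1, 120: 1}
```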

 

The Extras

Working at Slice comes with a comprehensive set of benefits, but here are some of the unexpected highlights:

  • Access to medical, dental, and vision plans
  • Flexible working hours
  • Generous time off policies
  • Annual conference attendance and training/development budget
  • Market leading maternity and paternity schemes
  • Discounts for local pizzerias (of course)

 

The Hiring Process

Here’s what we expect the hiring process for this role to be, should all go well with your candidacy. This entire process is expected to take 1-3 weeks to complete and you’d be expected to start on a specific date.

  1. 30 minute introductory meeting
  2. 30 minute hiring manager meeting
  3. 60 minute pairing interview
  4. 60 minute technical interview
  5. 30 minute CTO interview
  6. Offer!

Pizza brings people together. Slice is no different. We’re an Equal Opportunity Employer and embrace a diversity of backgrounds, cultures, and perspectives. We do not discriminate on the basis of race, colour, gender, sexual orientation, gender identity or expression, religion, disability, national origin, protected veteran status, age, or any other status protected by applicable national, federal, state, or local law. We are also proud members of the Diversity Mark NI initiative as a Bronze Member.

Privacy Notice Statement of Acknowledgment

When you apply for a job on this site, the personal data contained in your application will be collected by Slice. Slice keeps your data safe and secure. Once we have received your personal data, we put in place reasonable and appropriate measures and controls to prevent any accidental or unlawful destruction, loss, alteration, or unauthorised access. If selected, we will process your personal data for hiring/employment processes, as well as our legal obligations. If you are not selected for the job position and you have given consent on the question below (by selecting "Give consent"), we will store and process your personal data and submitted documents (CV) to consider your eligibility for employment for up to 365 days (one year). You have the right to withdraw your previously given consent to storing your personal data and CV in the Slice database at any time. For additional information and/or to exercise your rights to the protection of personal data, you can contact our Data Protection Officer by e-mail: privacy@slicelife.com

See more jobs at slice

Apply for this job

26d

Senior Data Engineer

Nile Bits - Cairo, Egypt, Remote
agile, airflow, sql, Design, docker, linux, python, AWS

Nile Bits is hiring a Remote Senior Data Engineer

Job Description

  • Designing and implementing core functionality within our data pipeline in order to support key business processes
  • Shaping the technical direction of the data engineering team
  • Supporting our Data Warehousing approach and strategy
  • Maintaining our data infrastructure so that our jobs run reliably and at scale
  • Taking responsibility for all parts of the data ecosystem, including data governance, monitoring and alerting, data validation, and documentation
  • Mentoring and upskilling other members of the team

Qualifications

  • Experience building data pipelines and/or ETL processes
  • Experience working in a Data Engineering role
  • Confident writing performant and readable code in Python, building upon the rich Python ecosystem wherever it makes sense to do so.
  • Good software engineering knowledge & skills: OO programming, design patterns, SOLID design principles and clean code
  • Confident writing SQL and good understanding of database design.
  • Experience working with web APIs.
  • Experience leading projects from a technical perspective
  • Knowledge of Docker, shell scripting, working with Linux
  • Experience with a cloud data warehouse
  • Experience in managing deployments and implementing observability and fault tolerance in cloud based infrastructure (i.e. CI/CD, Infrastructure as Code, container-based infrastructure, auto-scaling, monitoring and alerting)
  • Pro-active with a self-starter mindset; able to identify elegant solutions to difficult problems and able to suggest new and creative approaches.
  • Analytical, problem-solving and an effective communicator; leveraging technology and subject matter expertise in the business to accelerate our roadmap.
  • Able to lead technical discussions, shape the direction of the team, identify opportunities for innovation and improvement
  • Able to lead and deliver projects, ensuring stakeholders are kept up-to-date through regular communication
  • Willing to support the rest of the team when necessary, sharing knowledge and best practices, documenting design decisions, etc.
  • Willing to step outside your comfort zone to broaden your skills and learn new technologies.
  • Experience working with open source orchestration frameworks like Airflow or data analytics tools such as dbt
  • Experience with AWS services or those of another cloud provider
  • Experience with Snowflake
  • Good understanding of Agile
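The pipeline and ETL experience asked for above follows a recurring shape. A minimal, illustrative extract-transform-load flow in plain Python (sqlite3 and the `payments` table are stand-ins invented for this sketch, not Nile Bits' actual stack):

```python
import csv
import io
import sqlite3

# Minimal, illustrative ETL flow: extract rows from CSV text, transform
# (validate and normalise, dropping bad records), load into a database.
# sqlite3 and the `payments` table are stand-ins made up for this sketch.
RAW = "user,amount\nalice,10.5\nbob,not-a-number\ncarol,7\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    clean = []
    for row in rows:
        try:
            clean.append((row["user"], float(row["amount"])))
        except ValueError:
            continue  # real pipelines would route this to a dead-letter store
    return clean

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 17.5
```

Orchestrators such as Airflow then schedule and monitor steps like these rather than replace them.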

See more jobs at Nile Bits

Apply for this job

+30d

Data Engineer

Metiora - Madrid, Spain, Remote
nosql, sql, azure, git, c++, docker, linux, python, AWS

Metiora is hiring a Remote Data Engineer

Job Description

We are looking for an #exceptional Data Engineer who can understand our clients' challenges, make them their own, and help us build long-term relationships with them, guaranteeing the success and delivery of projects. Responsibilities include:

  • Developing integration processes with our clients so that their data can be exploited from our MINEO platform and from the client's cloud platforms (Azure, AWS, GCP)
  • Helping improve our current integration (ETL) tools
  • Understanding the client's data and anticipating problems that may arise
  • Analyzing changes and new features before they are developed
  • Performing code reviews of work done by colleagues

What do we expect from your professional profile?

  • Degree in a STEM field: Mathematics, Statistics, Telecommunications Engineering, or Computer Science
  • Additional training in data science
  • Between 2 and 5 years of experience on real projects
  • Proactivity and a passion for technology
  • Eagerness to work in a team
  • Intellectual curiosity and persistence in solving problems

Requirements

  • Bachelor's or Master's degree in Computer Engineering or a similar qualification

  • A background in software development; you must be able to understand and write code, especially Python

  • Advanced command of SQL and NoSQL databases

  • Advanced knowledge of algorithms

  • Knowledge of Git

Soft skills:

  • Eagerness to work in a team

  • A focus on quality, scalability, and clean code

  • Intellectual curiosity and persistence in solving problems

  • Good level of English

  • Proactivity and a passion for technology

Highly valued:

  • Containerization solutions (Docker)

  • Knowledge of algorithms for processing large volumes of data

  • Knowledge of Linux

See more jobs at Metiora

Apply for this job

+30d

Data Engineer Azure

4 years of experience, 2 years of experience, agile, Bachelor's degree, tableau, jira, scala, airflow, postgres, sql, oracle, Design, mongodb, pytest, azure, mysql, jenkins, python, AWS

FuseMachines is hiring a Remote Data Engineer Azure

+30d

Data Services Engineer

Devoteam, Warszawa, Poland, Remote
agile, NoSQL, Design, Azure, Java, Python, AWS

Devoteam is hiring a Remote Data Services Engineer

Job Description

We are looking for a highly motivated Data Engineer to use their experience in the public cloud to interpret our customers’ needs and extract value from their data using GCP and its suite of tools. You will primarily work as part of the data team.

You will be involved in pre-sales activities, building upon our customers’ cloud infrastructure and creating analytics in the cloud. You will design and construct data pipelines and architecture using GCP products, and help turn big data into valuable business insights using machine learning. Programming and preparing custom solutions optimised for each client will be an essential part of your job. We believe working with data is a passion you want to share with others.

We want you to know and advocate for Google Cloud Platform and its products. Knowledge of equivalent platforms, such as AWS or Azure, is a valuable asset that makes you a stronger candidate.

It’s a customer-facing role, so you must be outgoing and confident in your ability to interact with clients, manage workshops and execute necessary training on your own.

Qualifications

  • BA/BS in Computer Science or related technical field, or equivalent experience
  • Experience in one or more development languages, with a strong preference for Python, Java
  • Good knowledge of Power BI 
  • Basic knowledge of Bash 
  • Experience with programming in agile methodology
  • Knowledge of database and data analysis technologies, including relational and NoSQL databases, data warehouse design, and ETL pipeline.
  • Fluency in English (both written and spoken)

Nice to have:

  • Experience in drawing UML diagrams
  • Experience in Hadoop and using platforms such as Apache Spark, Pig, or Hive
  • Fluency in Polish (both written and spoken)

See more jobs at Devoteam

Apply for this job

+30d

Data Engineer

Legalist, Remote
agile, NoSQL, SQL, Design, C++, Docker, Kubernetes, AWS

Legalist is hiring a Remote Data Engineer

Intro description:

Legalist is an institutional alternative asset management firm. Founded in 2016 and incubated at Y Combinator, the firm uses data-driven technology to invest in credit assets at scale. We are always looking for talented people to join our team.

As a highly collaborative organization, our data engineers work cross-functionally with software engineering, data science, and product management to optimize growth and strategy of our data pipeline. In this position, you will be joining the data engineering team in an effort to take our data pipeline to the next level.

Where you come in:

  • Design and develop scalable data pipelines to collect, process, and analyze large volumes of data efficiently.
  • Collaborate with cross-functional teams including data scientists, software engineers, and product managers to understand data requirements and deliver solutions that meet business needs.
  • Develop ELT processes to transform raw data into actionable insights, leveraging tools and frameworks such as Airbyte, BigQuery, Dagster, DBT or similar technologies.
  • Participate in agile development processes, including sprint planning, daily stand-ups, and retrospective meetings, to deliver iterative improvements and drive continuous innovation.
  • Apply best practices in data modeling and schema design to ensure data integrity, consistency, and efficiency.
  • Continuously monitor and optimize data pipelines and systems for performance, availability, scalability, and cost-effectiveness.
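
The ELT pattern described above (load raw data first, then transform inside the warehouse) can be sketched in miniature. This is a hypothetical illustration using the standard library's sqlite3 as a stand-in for a warehouse like BigQuery; the table and column names are made up for the example.

```python
import sqlite3

# Hypothetical ELT sketch: raw records are loaded as-is, then the
# transform step runs as SQL inside the database, the way tools like
# dbt build models in a warehouse such as BigQuery or Snowflake.
raw_orders = [
    ("2024-01-03", "alice", "12.50"),
    ("2024-01-03", "bob", "7.25"),
    ("2024-01-04", "alice", "3.00"),
]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_orders (order_date TEXT, customer TEXT, amount TEXT)")
con.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw_orders)

# Transform in-warehouse: cast types and aggregate into an analytics-ready model.
con.execute("""
    CREATE TABLE daily_revenue AS
    SELECT order_date, SUM(CAST(amount AS REAL)) AS revenue
    FROM raw_orders
    GROUP BY order_date
    ORDER BY order_date
""")
rows = con.execute("SELECT * FROM daily_revenue").fetchall()
print(rows)  # [('2024-01-03', 19.75), ('2024-01-04', 3.0)]
```

The same shape scales up: the extract/load step stays dumb and lossless, while all business logic lives in versioned SQL transformations.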

What you’ll be bringing to the team:

  • Bachelor’s degree (BA or BS) or equivalent.
  • A minimum of 2 years of work experience in data engineering or similar role.
  • Advanced SQL knowledge and experience working with a variety of databases (SQL, NoSQL, Graph, Multi-model).
  • A minimum of 2 years professional experience with ETL/ELT, data modeling, and Python.
  • Familiarity with cloud environments like GCP, AWS, as well as cloud solutions like Kubernetes, Docker, BigQuery, etc.
  • You have a pragmatic, data-driven mindset and are not dogmatic or overly idealistic about technology choices and trade-offs.
  • You have an aptitude for learning new things quickly and have the confidence and humility to ask clarifying questions.

Even better if you have, but not necessary:

  • Experience with one or more of the following: data processing automation, data quality, data warehousing, data governance, business intelligence, data visualization.
  • Experience working with TB scale data.

See more jobs at Legalist

Apply for this job

+30d

Sr. Data Engineer

Verisk, Jersey City, NJ, Remote
4 years of experience, agile, Bachelor's degree, SQL, Design, C++, Python

Verisk is hiring a Remote Sr. Data Engineer

Job Description

Verisk Insurance Solutions is a leading source of information about property/casualty insurance risk. For a broad spectrum of commercial and personal lines of insurance, Verisk provides statistical, actuarial, underwriting, and claims information and analytics; compliance and fraud identification tools; policy language; information about specific locations; and technical services. Verisk serves insurers, reinsurers, agents and brokers, insurance regulators, risk managers, and other participants in the property/casualty insurance marketplace. 

The Analytical Data Operations (ADO) team is responsible for managing the lifeblood of our position within the US P&C Insurance industry, our data. The Team ensures that our product teams have the data they need, when they need it, and with the trust in our systems and processes to ensure they have their hands on the pulse of the industry. Our team is a combination of data analysts, data engineers, BI developers, and actuarial-engineers that are responsible for the design and implementation of the country’s largest database of P&C policy and claims information from data ingestion, data integration, data transformation, data analysis, to BI development.  

The ADO team is looking to hire an exceptional data analyst interested in a career as a Data Engineer, ideally having a good combination of an analytical/innovative mindset, technical aptitude, business acumen, communication skills, and a passion for data and technology. For internal candidates, this position is open to grade level 14.

This role entails blending insurance expertise, data analysis proficiency, and effective communication skills, with an emphasis on project management. We are looking for a candidate who can manage projects across multiple work streams with an array of internal customers while also supporting the development of modern data applications, including database architecture, data pipelines, and data analysis.

As an aspiring Data Engineer working with Insurance data, you will... 

  • Build and maintain robust data engineering processes to develop and implement self-serve data 
  • Support product implementation on a new and innovative technology platform with product requirements, actuarial calculations, methods, and validation 
  • Design, build, and launch efficient & reliable data pipelines to move data (both large and small amounts) in/out of our Snowflake Data Lake 
  • Assist internal stakeholders, including data modelers, with assumption development, product development, and implementation  
  • Execute data engineering projects ranging from small to large either individually or as part of a project team 
  • Perform other tasks on R&D, data governance, system infrastructure, and other cross team functions on an as-needed basis 
  • Adopt an agile framework to schedule work, adjust as needed, and continuously improve performance 
  • Work with data analysts to develop reports and visualizations 
  • Assist in scoping and designing analytic data assets 
  • Find opportunities to create, automate, and scale repeatable analyses or build self-service tools for business users 

You will be part of a culture that embraces learning and innovation, values teamwork, recognizes and rewards achievements and excellence, and provides personal and professional enrichment opportunities. 

Qualifications

  • Bachelor's degree in a STEM major or with STEM coursework learned in associated majors (Actuarial Science, Computer Science, Data Engineering, Data Science, Mathematics, Applied Mathematics, Statistics, Finance, Economics)  
  • Proficient in SQL, Database Architectures, Data Pipelines, and at least one scripting/analytics language (Python preferred) 
  • 3-4 years of experience in an actuarial, data analysis, or data engineering role 
  • Experience in the property & casualty insurance industry preferred
  • Experience with a business intelligence tool (other than Excel) and an understanding of data visualization theory 
  • Homeowners and/or Dwelling experience is a plus
  • Excellent communication skills (both oral and written) are required, with a desire to improve presentation and persuasion skills 
  • A self-starter with a commitment to innovation and pro-active problem solving 
  • A sincere interest in transforming the insurance industry through data and analytics to improve people’s lives 

#LI-MC1 #LI-Hybrid

See more jobs at Verisk

Apply for this job

+30d

Senior Data Engineer

Indigo, Remote with Frequent Travel
SQL, Design, Python, AWS, JavaScript, Node.js

Indigo is hiring a Remote Senior Data Engineer

Company Description

Healthcare providers spend roughly $20B annually on premiums for medical professional liability (“MPL”) insurance, $5-6B of which is spent for physicians. Incumbent carriers utilize outdated risk selection, underwriting, and sales processes. For the first time in 10 years, the MPL market is in a hardening cycle. While incumbent carriers are increasing rates to make up for underwriting losses, the environment is ripe for an innovative disruptor to capture market share by deploying artificial intelligence.

Rubicon Founders, an entrepreneurial firm focused on building and growing transformational healthcare companies, has launched Indigo to address this issue. Backed by Oak HC/FT, this company will disrupt the medical professional liability market by offering more competitive products to providers. Benefiting from significant potential cost savings as a result, providers may reallocate resources to invest more directly in patient or staff care. This company intends to fulfill Rubicon Founders' mission of creating enduring value by impacting people in a measurable way.

Position Description

Are you an expert data engineer with a passion for making a difference in the healthcare industry? If so, we want you to join our team at Indigo, where we're using technology to transform medical professional liability insurance. Our AI driven underwriting process and modern technology streamline the insurance process and equip quality physicians and groups with the coverage they need.

As a senior data engineer you will own the various data components of our system, which includes our data model architecture, our data pipeline / ETL, and any ML data engineering needs. You will help us leverage existing data platforms and databases with best practices to ensure that the data strategy is coherent, that our data systems are highly available, secure, and scalable. You will work with cross-functional teams including business stakeholders, product managers, and data scientists to ensure our data pipeline system meets business requirements and adheres to best practices.  

 

We are a remote distributed team who gathers with intention. You will thrive here if you find energy working from home and getting together to build relationships. At Indigo, you'll have the opportunity to contribute to a meaningful mission that makes a difference in the lives of healthcare providers and their patients.

 

Responsibilities:

  • Design and build the Indigo data pipeline and ETL process.
  • Design and build the Indigo application datastore architecture and implementation 
  • Define a governance and lineage strategy.
  • Define and implement security across our data systems.
  • Participate in code reviews and ensure the code is of high quality.
  • Collaborate with cross-functional teams including product and engineering to understand business requirements and translate them into technical requirements.
  • Keep up to date with emerging trends in data engineering and contribute to improving our development processes.

Requirements:

  • Bachelor’s degree in CS or similar; a master's degree is advantageous
  • 5+ years of data engineering experience 
  • Expert-level SQL and Python or JavaScript (Node.js)
  • Experience with AWS services such as EC2, Lambda, RDS, DynamoDB, and S3.
  • Experience with Snowflake, dbt
  • Strong understanding of data design principles
  • Experience designing and building data products from concept to productization.
  • Experience with automation in the processes of data definition, migration.
  • Excellent design and problem-solving skills.
  • Self-starter, highly autonomous, thrives in a remote distributed environment



See more jobs at Indigo

Apply for this job

+30d

Senior Data Engineer

Airflow, Postgres, SQL, Oracle, Design, Docker, MySQL, Kubernetes, Python, AWS

ReCharge Payments is hiring a Remote Senior Data Engineer

Who we are

In a world where acquisition costs are skyrocketing, funding is scarce, and ecommerce merchants are forced to do more with less, the most innovative DTC brands understand that subscription strategy is business strategy.

Recharge is simplifying retention and growth for innovative ecommerce brands. As the #1 subscription platform, Recharge is dedicated to empowering brands to easily set up and manage subscriptions, create dynamic experiences at every customer touchpoint, and continuously evaluate business performance. Powering everything from no-code customer portals, personalized offers, and dynamic bundles, Recharge helps merchants seamlessly manage, grow, and delight their subscribers while reducing operating costs and churn. Today, Recharge powers more than 20,000 merchants serving 100 million subscribers, including brands such as Blueland, Hello Bello, LOLA, Chamberlain Coffee, and Bobbie—Recharge doesn’t just help you sell products, we help build buyer routines that last.

Recharge is recognized on the Technology Fast 500, awarded by Deloitte, (3rd consecutive year) and is Great Place to Work Certified.

Overview

The centralized Data and Analytics team at Recharge delivers critical analytic capabilities and insights for Recharge’s business and customers. 

As a Senior Data Engineer, you will build scalable data pipelines and infrastructure that power internal business analytics and customer-facing data products. Your work will empower data analysts to derive deeper strategic insights from our data, and will enable developers to build applications that surface data insights directly to our merchants. 

What you’ll do

  • Build data pipeline, ELT and infrastructure solutions to power internal data analytics/science and external, customer-facing data products.

  • Create automated monitoring, auditing and alerting processes that ensure data quality and consistency.

  • Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models

  • Design, develop, implement, and optimize existing ETL processes that merge data from disparate sources for consumption by data analysts, business owners, and customers

  • Seek ways to continually improve the operations, monitoring and performance of the data warehouse

  • Influence and communicate with all levels of stakeholders including analysts, developers, business users, and executives.

  • Live by and champion our values: #day-one, #ownership, #empathy, #humility.
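
The automated monitoring and auditing described above often boils down to a set of small, composable data-quality checks. Below is a hypothetical sketch (the check functions and sample records are invented for illustration): each check returns a list of failures, and a non-empty result would trigger an alert in a real pipeline.

```python
# Hypothetical data-quality checks of the kind an automated audit runs
# after each pipeline load; failures would feed an alerting system.
def check_not_null(rows, field):
    """Flag rows where a required field is missing or empty."""
    return [r for r in rows if r.get(field) in (None, "")]

def check_unique(rows, field):
    """Flag values that repeat in a supposedly unique field."""
    seen, dupes = set(), []
    for r in rows:
        v = r[field]
        if v in seen:
            dupes.append(v)
        seen.add(v)
    return dupes

subscriptions = [
    {"id": 1, "merchant": "blueland"},
    {"id": 2, "merchant": ""},
    {"id": 2, "merchant": "lola"},
]

null_failures = check_not_null(subscriptions, "merchant")
dupe_failures = check_unique(subscriptions, "id")
print(len(null_failures), dupe_failures)  # 1 [2]
```

Keeping each check tiny and pure makes the audit suite easy to extend and to run consistently across every table a pipeline touches.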

What you’ll bring

  • Typically, 5+ years experience in a data engineering related role (Data Engineer, Data Platform Engineer, Analytics Engineer etc) with a track record of building scalable data pipeline, transformation, and platform solutions. 

  • 3+ years of hands-on experience designing and building data pipelines and models to ingest, transform, and deliver large amounts of data from multiple sources into a dimensional (star schema) data warehouse or data lake.

  • Experience with a variety of data warehouse, data lake, and enterprise data management platforms (Snowflake {preferred}, Redshift, Databricks, MySQL, Postgres, Oracle, RDS, AWS, GCP)

  • Experience building data pipelines, models and infrastructure powering external, customer-facing (in addition to internal business facing) analytics applications.

  • Solid grasp of data warehousing methodologies like Kimball and Inmon

  • Experience working with a variety of ETL tools (Fivetran, dbt, Python, etc.)

  • Experience with workflow orchestration management engines such as Airflow & Cloud Composer

  • Hands on experience with Data Infra tools like Kubernetes, Docker

  • Expert proficiency in SQL

  • Strong Python proficiency

  • Experience with ML Operations is a plus.
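
The Kimball-style dimensional (star schema) modeling mentioned above can be shown in a toy example. This is a hypothetical sketch using sqlite3, with invented table and column names: one fact table keyed to dimension tables, queried with the joins a BI tool would issue.

```python
import sqlite3

# Toy star schema (hypothetical): a central fact table of measurements
# surrounded by dimension tables that describe each business entity.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
    CREATE TABLE fact_orders (
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        amount REAL
    );
    INSERT INTO dim_customer VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO dim_date VALUES (10, '2024-01-03');
    INSERT INTO fact_orders VALUES (1, 10, 12.5), (2, 10, 7.25), (1, 10, 3.0);
""")

# Slice the facts by a dimension attribute, as a report or dashboard would.
result = con.execute("""
    SELECT c.name, SUM(f.amount)
    FROM fact_orders f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
print(result)  # [('alice', 15.5), ('bob', 7.25)]
```

The design choice is the point: facts stay narrow and additive, dimensions carry the descriptive attributes, and every report is a join-and-aggregate over that shape.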

Recharge | Instagram | Twitter | Facebook

Recharge Payments is an equal opportunity employer. In addition to EEO being the law, it is a policy that is fully consistent with our principles. All qualified applicants will receive consideration for employment without regard to status as a protected veteran or a qualified individual with a disability, or other protected status such as race, religion, color, national origin, sex, sexual orientation, gender identity, genetic information, pregnancy or age. Recharge Payments prohibits any form of workplace harassment. 

Transparency in Coverage

This link leads to the Anthem Blue Cross machine-readable files that are made available in response to the federal Transparency in Coverage Rule and includes network negotiated rates for all items and services; allowed amounts for OON items, services and prescription drugs; and negotiated rates and historical prices for network prescription drugs (delayed). EIN 80-6245138. This link leads to the Kaiser machine-readable files.

#LI-Remote

See more jobs at ReCharge Payments

Apply for this job

+30d

Senior Data Engineer

Invoca, Remote
Bachelor's degree, SQL, Salesforce, Design, Swift, Azure, API, C++, PostgreSQL, MySQL, Python, AWS

Invoca is hiring a Remote Senior Data Engineer

About Invoca:

Invoca is the industry leader and innovator in AI and machine learning-powered Conversation Intelligence. With over 300 employees, 2,000+ customers, and $100M in revenue, there are tremendous opportunities to continue growing the business. We are building a world-class SaaS company and have raised over $184M from leading venture capitalists including Upfront Ventures, Accel, Silver Lake Waterman, H.I.G. Growth Partners, and Salesforce Ventures.

About the Engineering Team:

You’ll join a team where everyone, including you, is striving to constantly improve their knowledge of software development tools, practices, and processes. We are an incredibly supportive team. We swarm when problems arise and give excellent feedback to help each other grow. Working on our close-knit, multi-functional teams is a chance to share and grow your knowledge of different domains from databases to front ends to telephony and everything in between.

We are passionate about many things: continuous improvement, working at a brisk but sustainable pace, writing resilient code, maintaining production reliability, paying down technical debt, hiring fantastic teammates; and we love to share these passions with each other.

Learn more about the Invoca development team on our blog and check out our open source projects.

You Will:

Invoca offers a unique opportunity to make massive contributions to machine learning and data science as it applies to conversation intelligence, marketing, sales and user experience optimization.

You are excited about this opportunity because you get to:

  • Design and develop highly performant and scalable data storage solutions
  • Extend and enhance the architecture of Invoca’s data infrastructure and pipelines
  • Deploy and fine-tune machine learning models within an API-driven environment, ensuring scalability, efficiency, and optimal performance.
  • Expand and optimize our Extract, Transform, and Load (ETL) processes to include various structured and unstructured data sources within the Invoca Platform.
  • Evaluate and implement new technologies as needed and work with technical leadership to drive adoption.
  • Collaborate with data scientists, engineering teams, analysts, and other stakeholders to understand data requirements and deliver solutions on behalf of our customers
  • Support diversity, equity and inclusion at Invoca

At Invoca, our Senior Data Engineers benefit from mentorship provided by experts spanning our data science, engineering, and architecture teams. Our dedicated data science team is at the forefront of leveraging a blend of cutting-edge technology, including our proprietary and patented solutions, along with tools from leading vendors, to develop an exceptionally scalable data modeling platform.

Our overarching objective is to seamlessly deliver models through our robust API platform, catering to both internal stakeholders and external clients. Your pivotal role will focus on optimizing model accessibility and usability, thereby expediting model integration within our feature engineering teams. Ultimately, this streamlined process ensures swift model adoption, translating to enhanced value for our customers.

 

You Have:

We are excited about you because you have:

  • 3+ years of professional experience in Data Engineering or a related area of data science or software engineering
  • Advanced proficiency in Python, including expertise in data processing libraries (e.g.,  spaCy, Pandas), data visualization libraries (e.g., Matplotlib, Plotly), and familiarity with machine learning frameworks
  • Advanced proficiency using Python API frameworks (e.g., FastAPI, Ray/Anyscale, AWS SageMaker) to build, host, and optimize machine learning model inference APIs.
  • Intermediate proficiency working with the Databricks platform (e.g., Unity Catalog, Jobs/Compute, Delta Lake) or a similar platform for data engineering and analytics tasks
  • Intermediate proficiency working with machine learning and Large Language Model (LLM) tools from AWS (e.g., SageMaker, Bedrock) or other cloud vendors such as Azure or Google Cloud Platform
  • Intermediate proficiency with big data technologies and frameworks  (e.g., Spark, Hadoop)
  • Intermediate proficiency with SQL and relational databases (e.g., MySQL, PostgreSQL)
  • Basic proficiency in several areas apart from pure coding, such as monitoring, performance optimization, integration testing, security and more
  • Basic proficiency with Kafka (or similar stream-processing software) is a plus
  • Bachelor's Degree or equivalent experience preferred

Salary, Benefits & Perks:

Teammates begin receiving benefits on the first day of the month following or coinciding with one month of employment. Offerings include:

  • Paid Time Off - Invoca encourages a work-life balance for our employees. We have an outstanding PTO policy starting at 20 days off for all full-time employees. We also offer 16 paid holidays, 10 days of Compassionate Leave, days of volunteer time, and more.
  • Healthcare - Invoca offers a healthcare program that includes medical, dental, and vision coverage. There are multiple plan options to choose from. You can make the best choice for yourself, your partner, and your family.
  • Retirement - Invoca offers a 401(k) plan through Fidelity with a company match of up to 4%.
  • Stock options - All employees are invited to ownership in Invoca through stock options.
  • Employee Assistance Program - Invoca offers well-being support on issues ranging from personal matters to everyday-life topics through the WorkLifeMatters program.
  • Paid Family Leave - Invoca offers up to 6 weeks of 100% paid leave for baby bonding, adoption, and caring for family members.
  • Paid Medical Leave - Invoca offers up to 12 weeks of 100% paid leave for childbirth and medical needs.
  • Sabbatical - We thank our long-term team members with an additional week of PTO and a bonus after 7 years of service.
  • Wellness Subsidy - Invoca provides a wellness subsidy applicable to a gym membership, fitness classes, and more.
  • Position Base Range - $139,500 to $175,000 base salary, plus bonus potential
  • Please note, per Invoca's COVID-19 policy, depending on your vaccine verification status, you may be required to work only from home / remotely. At this time, travel and in-person meetings will require verification. This policy is regularly reviewed and subject to change at any time

Recently, we’ve noticed a rise in phishing attempts targeting individuals who are applying to our job postings. These fraudulent emails, posing as official communications from Invoca aim to deceive individuals into sharing sensitive information. These attacks have attempted to use our name and logo, and have tried to impersonate individuals from our HR team by claiming to represent Invoca. 

We will never ask you to send financial information or other sensitive information via email. 

 

DEI Statement

We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal opportunity workplace.

#LI-Remote

See more jobs at Invoca

Apply for this job

+30d

Principal Data Engineer

Gemini, Remote (USA)
remote-first, NoSQL, Airflow, SQL, Design, CSS, Python, JavaScript

Gemini is hiring a Remote Principal Data Engineer

About the Company

Gemini is a global crypto and Web3 platform founded by Tyler Winklevoss and Cameron Winklevoss in 2014. Gemini offers a wide range of crypto products and services for individuals and institutions in over 70 countries.

Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we help you buy, sell, and store your bitcoin and cryptocurrency. 

At Gemini, our mission is to unlock the next era of financial, creative, and personal freedom.

In the United States, we have a flexible hybrid work policy for employees who live within 30 miles of our office headquartered in New York City and our office in Seattle. Employees within the New York and Seattle metropolitan areas are expected to work from the designated office twice a week, unless there is a job-specific requirement to be in the office every workday. Employees outside of these areas are considered part of our remote-first workforce. We believe our hybrid approach for those near our NYC and Seattle offices increases productivity through more in-person collaboration where possible.

The Department: Analytics

The Role: Principal Data Engineer

As a member of our data engineering team, you'll be setting standards for data engineering solutions that have organizational impact. You'll provide architectural solutions that are efficient, robust, extensible, and competitive within their business and industry context. You'll collaborate with senior data engineers and analysts, guiding them towards their career goals at Gemini. Communicating your insights to leaders across the organization is paramount to success.

Responsibilities:

  • Focused on technical leadership, defining patterns and operational guidelines for their vertical(s)
  • Independently scopes, designs, and delivers solutions for large, complex challenges
  • Provides oversight, coaching and guidance through code and design reviews
  • Designs for scale and reliability with the future in mind. Can do critical R&D
  • Successfully plans and delivers complex, multi-team or system, long-term projects, including ones with external dependencies
  • Identifies problems that need to be solved and advocates for their prioritization
  • Owns one or more large, mission-critical systems at Gemini or multiple complex, team level projects, overseeing all aspects from design through implementation through operation
  • Collaborates with coworkers across the org to document and design how systems work and interact
  • Leads large initiatives across domains, even outside their core expertise. Coordinates large initiatives
  • Designs, architects and implements best-in-class Data Warehousing and reporting solutions
  • Builds real-time data and reporting solutions
  • Develops new systems and tools to enable the teams to consume and understand data more intuitively
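
The real-time data and reporting work above typically rests on small streaming primitives. Here is a hypothetical sketch (class name and sizes invented for illustration) of one such building block: a sliding-window aggregator that maintains a rolling sum over the last N events.

```python
from collections import deque

# Hypothetical real-time reporting primitive: a fixed-size sliding window
# that keeps a rolling sum in O(1) per event, the kind of building block
# a streaming metrics pipeline composes into live dashboards.
class SlidingWindowSum:
    def __init__(self, size):
        self.size = size
        self.window = deque()
        self.total = 0.0

    def add(self, value):
        """Add one event and return the sum over the most recent `size` events."""
        self.window.append(value)
        self.total += value
        if len(self.window) > self.size:
            self.total -= self.window.popleft()
        return self.total

w = SlidingWindowSum(size=3)
totals = [w.add(v) for v in [10, 20, 30, 40]]
print(totals)  # [10.0, 30.0, 60.0, 90.0]
```

Because each event only appends once and evicts once, the aggregator stays constant-time per update regardless of window size, which is what makes it viable at exchange-scale event rates.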

Minimum Qualifications:

  • 10+ years experience in data engineering with data warehouse technologies
  • 10+ years experience in custom ETL design, implementation and maintenance
  • 10+ years experience with schema design and dimensional data modeling
  • Experience building real-time data solutions and processes
  • Advanced skills with Python and SQL are a must
  • Experience and expertise in Databricks, Spark, Hadoop etc.
  • Experience with one or more MPP databases (Redshift, BigQuery, Snowflake, etc.)
  • Experience with one or more ETL tools (Informatica, Pentaho, SSIS, Alooma, etc.)
  • Strong computer science fundamentals including data structures and algorithms
  • Strong software engineering skills in any server-side language, preferably Python
  • Experienced in working collaboratively across different teams and departments
  • Strong technical and business communication skills

Preferred Qualifications:

  • Kafka, HDFS, Hive, Cloud computing, machine learning, LLMs, NLP & Web development experience is a plus
  • NoSQL experience a plus
  • Deep knowledge of Apache Airflow
  • Expert experience implementing complex, enterprise-wide data transformation and processing solutions
  • Experience with Continuous integration and deployment
  • Knowledge and experience of financial markets, banking or exchanges
  • Web development skills with HTML, CSS, or JavaScript

It Pays to Work Here

The compensation & benefits package for this role includes:
  • Competitive starting salary
  • A discretionary annual bonus
  • Long-term incentive in the form of a new hire equity grant
  • Comprehensive health plans
  • 401K with company matching
  • Paid Parental Leave
  • Flexible time off

Salary Range: The base salary range for this role is between $172,000 - $215,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate’s compensation, we consider a number of factors including skillset, experience, job scope, and current market data.

At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.

#LI-AH1

Apply for this job