Scala Remote Jobs

123 Results

9h

Senior Software Engineer (Scala/Java/Angular/Python)

Tech9 - Remote
kotlin, scala, Design, scrum, typescript, AWS

Tech9 is hiring a Remote Senior Software Engineer (Scala/Java/Angular/Python)


See more jobs at Tech9

Apply for this job

1d

Principal Analyst/Engineer, Attack Surface Management

SecurityScorecard - Remote (United States)
Bachelor's degree, scala, Design, c++, python

SecurityScorecard is hiring a Remote Principal Analyst/Engineer, Attack Surface Management

About SecurityScorecard:

SecurityScorecard is the global leader in cybersecurity ratings, with over 12 million companies continuously rated, operating in 64 countries. Founded in 2013 by security and risk experts Dr. Alex Yampolskiy and Sam Kassoumeh and funded by world-class investors, SecurityScorecard’s patented rating technology is used by over 25,000 organizations for self-monitoring, third-party risk management, board reporting, and cyber insurance underwriting; making all organizations more resilient by allowing them to easily find and fix cybersecurity risks across their digital footprint. 

Headquartered in New York City, our culture has been recognized by Inc Magazine as a "Best Workplace," by Crain's NY as one of the "Best Places to Work in NYC," and as one of the 10 hottest SaaS startups in New York for two years in a row. Most recently, SecurityScorecard was named to Fast Company's annual list of the World's Most Innovative Companies for 2023 and to the Achievers 50 Most Engaged Workplaces in 2023 award, recognizing "forward-thinking employers for their unwavering commitment to employee engagement." SecurityScorecard is proud to be funded by world-class investors including Silver Lake Waterman, Moody's, Sequoia Capital, GV and Riverwood Capital.

About the Role:

This role is crucial for maintaining the continuous accuracy and completeness of our customers' digital footprint data. The position demands an in-depth understanding of networking protocols such as TCP/IP, DNS, BGP, and SSL, as well as an understanding of the fundamentals of how the Internet works. Responsibilities include validating the attribution of digital assets, managing asset claims, addressing inaccuracies, and promptly updating the digital footprint as necessary. The ideal candidate will have a background in researching, designing and deploying Internet-facing technologies, preferably in telcos. The candidate will proactively identify and resolve discrepancies and propose directional improvements to the digital attribution system. This role requires a proactive approach and a deep understanding of how digital assets are managed and assigned by Telcos/ISPs.

Job Responsibilities:

  • Validate and Maintain Digital Footprint data: Regularly review and validate the accuracy of how digital assets are attributed to organizations. Ensure that all internet-facing assets are correctly attributed and reflect the current status.
  • Asset Management & Discovery: Research and design new methods to correctly discover and attribute digital assets to organizations. You will also work with key stakeholders to understand customer needs and the nuances on how their organization is reflected on the Internet.
  • Issue Resolution: Address and resolve issues found within the digital footprint, such as misattributions or outdated information. Work closely with the cybersecurity team to understand the impact of these issues on security ratings.
  • Collaboration and Reporting: Work collaboratively with technical and non-technical teams to gather asset data, clarify asset status, and report on footprint changes and their impacts. Provide insights and recommendations based on digital footprint analysis.
  • Continuous Improvement: Contribute to the improvement of methodologies for digital footprint analysis and management. Participate in the development of new tools and processes to enhance the team’s capabilities, while keeping an eye on associated engineering costs.

Required Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, Cybersecurity, or a related field.
  • 2+ years of experience in cybersecurity, IT asset management, or a related field.
  • Familiarity with the various internet registries such as ARIN, RIPE NCC, APNIC, etc.
  • Strong understanding of network infrastructure, BGP, DNS, WHOIS, and IP management.
  • Proficient in data analysis and capable of interpreting complex data related to network security.
  • Experience with cybersecurity tools and platforms, especially those related to asset management and network scanning.
  • Strong problem-solving skills and the ability to operate effectively under tight deadlines.
  • Experience with distributed data processing frameworks (Spark and/or Flink)
  • Proficient in Scala, Python or Golang
  • Experience with various data formats

Preferred Qualifications:

  • Certifications such as CISSP, CISM, or related credentials.
  • Experience with scripting languages for data manipulation and automation.
  • Knowledge of regulatory compliance standards relevant to cybersecurity and data protection.

Additional Skills:

  • Excellent communication skills, both written and verbal.
  • Strong organizational skills with the ability to manage multiple priorities.
  • Proactive attitude and a strong team player.
  • Experience with Kafka

Benefits:

Specific to each country, we offer a competitive salary, stock options, health benefits, unlimited PTO, parental leave, tuition reimbursement, and much more!

The estimated salary range for this position is $225,000-240,000. Actual compensation for the position is based on a variety of factors, including, but not limited to, affordability, skills, qualifications and experience, and may vary from the range. In addition to base salary, employees may also be eligible for annual performance-based incentive compensation awards and equity, among other company benefits.

SecurityScorecard is committed to Equal Employment Opportunity and embraces diversity. We believe that our team is strengthened through hiring and retaining employees with diverse backgrounds, skill sets, ideas, and perspectives. We make hiring decisions based on merit and do not discriminate based on race, color, religion, national origin, sex or gender (including pregnancy), gender identity or expression (including transgender status), sexual orientation, age, marital, veteran, disability status or any other protected category in accordance with applicable law.

We also consider qualified applicants regardless of criminal histories, in accordance with applicable law. We are committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need assistance or accommodation due to a disability, please contact talentacquisitionoperations@securityscorecard.io.

Any information you submit to SecurityScorecard as part of your application will be processed in accordance with the Company’s privacy policy and applicable law. 

SecurityScorecard does not accept unsolicited resumes from employment agencies.  Please note that we do not provide immigration sponsorship for this position. 

See more jobs at SecurityScorecard

Apply for this job

2d

Senior Lead Backend Engineer - (LS)

ITScout - Argentina, AR (Remote)
agile, scala, postgres, Design, ruby, AWS, backend

ITScout is hiring a Remote Senior Lead Backend Engineer - (LS)

⚠️Only available for #residents of #Argentina ⚠️

Our client is a fresh and open-minded nearshore software development company that emerged as a spin-off from UNICEN, one of the most renowned universities in Argentina in the technology sector. From nationwide digital clinical records management systems to tailored e-commerce solutions, and from distributed learning platforms with cross-country implementation to custom IoT applications and cloud solutions, their developments embrace all sectors and scales, ensuring industry quality standards.
Their software development, quality assurance processes, UX/UI knowledge and global market experience are mixed with their R&D DNA, which allows them to explore new roads and build amazing things every day.
Expanding and sharing their knowledge and collaboration along with their flexibility and an inclusive working environment are core values. They focus on enjoying what they do and learning from every experience while delivering on their commitments.
Innovation and passion are their drivers, so they are invested in solving complex business scenarios to reach new milestones with partners.


Fintech enthusiasts, we're looking for you!

Join a high-caliber team using GoLang, Scala & Ruby on Rails to build the future of a customer-facing platform.

Be a leader in the trenches:

  • Co-architect, design, and implement the full backend stack.
  • Collaborate with product & design to ensure exceptional user experiences.
  • Own software delivery, from concept to production.
  • Drive agile development within a cross-functional team.
  • Lead code reviews, championing best practices.

Bonus points for:

  • Plaid integration experience.
  • Building applications at scale.
  • Microservices and event-based architectures.⚡
  • Sidekiq, Postgres, Redis, Kafka, AWS familiarity.
  • Mentorship experience.

    The biggest plus? GoLang expertise and experience with 3rd-party Bank or Card Linking providers like Plaid!


See more jobs at ITScout

Apply for this job

4d

Senior Data Engineer, UK 2024

Aimpoint Digital - London, GB (Remote)
scala, sql, Design, azure, git, java, c++, docker, kubernetes, python, AWS

Aimpoint Digital is hiring a Remote Senior Data Engineer, UK 2024

What you will do

Do you enjoy working with clients from different industries to investigate complex business problems and to design end-to-end analytical solutions that will improve their existing processes and ability to derive data-driven insights?

Aimpoint Digital is a dynamic and fully remote data and analytics consultancy. We work alongside the most innovative software providers in the data engineering space to solve our clients' toughest business problems. At Aimpoint Digital, we believe in blending modern tools and techniques with tried-and-true principles to deliver optimal data engineering solutions.

You will:

  • Become a trusted advisor working together with our clients, from data owners and analytic users to C-level executives
  • Work independently as part of a small team to solve complex data engineering use-cases across a variety of industries
  • Design and develop the analytical layer, building cloud data warehouses, data lakes, ETL/ELT pipelines, and orchestration jobs
  • Work with modern tools such as Snowflake, Databricks, Fivetran, and dbt and credentialize your skills with certifications
  • Write code in SQL, Python, and Spark, and use software engineering best-practices such as Git and CI/CD
  • Support the deployment of data science and ML projects into production
    • Note: You will not be developing machine learning models or algorithms

Who you are:

We are building a diverse team of talented and motivated people who deeply understand business problems and enjoy solving them. You are a self-starter who loves working with data to build analytical tools that business users can leverage daily to do their jobs better. You are passionate about contributing to a growing team and establishing best practices.

As a Senior Data Engineer, you will be expected to work independently on client engagements, take part in the development of our practice, aid in business development, and contribute innovative ideas and initiatives to our company.

  • Degree educated in Computer Science, Engineering, Mathematics, or equivalent experience
  • Experience with managing stakeholders and collaborating with customers
  • Strong written and verbal communication skills required
  • 3+ years working with relational databases and query languages
  • 3+ years building data pipelines in production and ability to work across structured, semi-structured and unstructured data
  • 3+ years data modeling (e.g. star schema, entity-relationship)
  • 3+ years writing clean, maintainable, and robust code in Python, Scala, Java, or similar coding languages
  • Ability to manage an individual workstream independently
  • Expertise in software engineering concepts and best practices
  • DevOps experience preferred
  • Experience working with cloud data warehouses (Snowflake, Google BigQuery, AWS Redshift, Microsoft Synapse)  preferred
  • Experience working with cloud ETL/ELT tools (Fivetran, dbt, Matillion, Informatica, Talend, etc.)  preferred
  • Experience working with cloud platforms (AWS, Azure, GCP) and container technologies (Docker, Kubernetes) preferred
  • Experience working with Apache Spark preferred
  • Experience preparing data for analytics and following a data science workflow to drive business results preferred
  • Consulting experience strongly preferred
  • Willingness to travel

This position is fully remote within the United Kingdom.

See more jobs at Aimpoint Digital

Apply for this job

4d

Lead Data Engineer, UK 2024

Aimpoint Digital - London, GB (Remote)
scala, sql, Design, azure, git, java, c++, docker, kubernetes, python, AWS

Aimpoint Digital is hiring a Remote Lead Data Engineer, UK 2024

What you will do

Do you enjoy working with clients from different industries to investigate complex business problems and to design end-to-end analytical solutions that will improve their existing processes and ability to derive data-driven insights?

Aimpoint Digital is a dynamic and fully remote data and analytics consultancy. We work alongside the most innovative software providers in the data engineering space to solve our clients' toughest business problems. At Aimpoint Digital, we believe in blending modern tools and techniques with tried-and-true principles to deliver optimal data engineering solutions.

You will:

  • Become a trusted advisor working together with our clients, from data owners and analytic users to C-level executives
  • Work independently or manage small teams to solve complex data engineering use-cases across a variety of industries
  • Design and develop the analytical layer, building cloud data warehouses, data lakes, ETL/ELT pipelines, and orchestration jobs
  • Work with modern tools such as Snowflake, Databricks, Fivetran, and dbt and credentialize your skills with certifications
  • Write code in SQL, Python, and Spark, and use software engineering best-practices such as Git and CI/CD
  • Support the deployment of data science and ML projects into production
    • Note: You will not be developing machine learning models or algorithms

Who you are:

We are building a diverse team of talented and motivated people who deeply understand business problems and enjoy solving them. You are a self-starter who loves working with data to build analytical tools that business users can leverage daily to do their jobs better. You are passionate about contributing to a growing team and establishing best practices.

As a Lead Data Engineer, you will be expected to work independently on client engagements or to lead small team project delivery, take part in the development of our practice, aid in business development, and contribute innovative ideas and initiatives to our company.

  • Degree educated in Computer Science, Engineering, Mathematics, or equivalent experience
  • Experience with managing stakeholders and collaborating with customers
  • Strong written and verbal communication skills required
  • 5+ years working with relational databases and query languages
  • 5+ years building data pipelines in production and ability to work across structured, semi-structured and unstructured data
  • 5+ years data modeling (e.g. star schema, entity-relationship)
  • 5+ years writing clean, maintainable, and robust code in Python, Scala, Java, or similar coding languages
  • Ability to manage an individual workstream independently and to lead a small 1-2 person team
  • Expertise in software engineering concepts and best practices
  • DevOps experience preferred
  • Experience working with cloud data warehouses (Snowflake, Google BigQuery, AWS Redshift, Microsoft Synapse) preferred
  • Experience working with cloud ETL/ELT tools (Fivetran, dbt, Matillion, Informatica, Talend, etc.)  preferred
  • Experience working with cloud platforms (AWS, Azure, GCP) and container technologies (Docker, Kubernetes) preferred
  • Experience working with Apache Spark preferred
  • Experience preparing data for analytics and following a data science workflow to drive business results preferred
  • Consulting experience strongly preferred
  • Willingness to travel


This position is fully remote within the United Kingdom.

See more jobs at Aimpoint Digital

Apply for this job

4d

Principal Data Engineer

slice - Belfast or remote UK
scala, nosql, sql, Design, java, python

slice is hiring a Remote Principal Data Engineer

UK Remote or Belfast

Serial tech entrepreneur Ilir Sela started Slice in 2010 with the belief that local pizzerias deserve all of the advantages of major franchises without compromising their independence. Starting with his family’s pizzerias, we now empower over 18,000 restaurants (that’s nearly triple Domino’s U.S. network!) with the technology, services, and collective power that owners need to better serve their digitally minded customers and build lasting businesses. We’re growing and adding more talent to help fulfil this valuable mission. That’s where you come in.

 

The Challenge to Solve

Provide Slice with up-to-date data to grow the business and empower independent pizzeria owners to make the best data-driven decisions through insights that ensure future success.

 

The Role

You will be responsible for leading data modelling and dataset development across the team. You'll be at the forefront of our data strategy, partnering closely with business and product teams to fuel data-driven decisions throughout the company. Your leadership will guide our data architecture expansion, ensuring smooth data delivery and maintaining top-notch data quality. Drawing on your expertise, you'll steer our tech choices and keep us at the cutting edge of the field. You'll get to code daily and provide your insights into best practices to the rest of the team.

 

The Team

You’ll work with a team of skilled data engineers daily, providing your expertise to their reviews, as well as working on your own exciting projects with teams across the business. You’ll have a high degree of autonomy and the chance to impact many areas of the business. You will optimise data flow and collection for cross-functional teams and support software developers, business intelligence, and data scientists on data initiatives, helping to support product launches and marketing efforts to grow the business. This role reports to the Director of Data Engineering.

 

The Winning Recipe

We’re looking for creative, entrepreneurial engineers who are excited to build world-class products for small business counters. These are the core competencies this role calls for:

  • Strong track record of designing and implementing modern cloud data processing architectures using programming languages such as Java, Scala, or Python and technologies like Spark
  • Expert-level SQL skills
  • Extensive experience in data modelling and design, building out the right structures to deliver for various business and product domains
  • Strong analytical abilities and a history of using data to identify opportunities for improvement and where data can help drive the business towards its goals
  • Experience with message queuing, stream processing using frameworks such as Flink or KStreams, and highly scalable big data stores, as well as storage and query pattern design with NoSQL stores (see the minimal stream-processing sketch after this list)
  • Proven leadership skills, with a track record of successfully leading complex engineering projects and mentoring junior engineers, as well as working with cross-functional teams and external stakeholders in a dynamic environment
  • Familiarity with serverless technologies and the ability to design and implement scalable and cost-effective data processing architectures
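
To illustrate the stream-processing experience called out above, here is a minimal, hypothetical Kafka Streams sketch in Scala that keeps a running order count per pizzeria. The topic names and the key/value layout are assumptions made for the sketch, not Slice's actual pipeline.

```scala
// Hypothetical sketch only: count orders per pizzeria with the Kafka Streams
// Scala DSL. Topics and record layout are illustrative assumptions.
import java.util.Properties

import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}
import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.scala.serialization.Serdes._

object OrderCountsSketch extends App {
  val props = new Properties()
  props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-counts-sketch")
  props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")

  val builder = new StreamsBuilder()

  // Assumed input: key = pizzeria id, value = serialized order payload.
  builder
    .stream[String, String]("orders")
    .groupByKey          // group records by pizzeria id
    .count()             // materialize a running count per key
    .toStream
    .to("order-counts-by-pizzeria") // publish counts for downstream consumers

  new KafkaStreams(builder.build(), props).start()
}
```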

 

The Extras

Working at Slice comes with a comprehensive set of benefits, but here are some of the unexpected highlights:

  • Access to medical, dental, and vision plans
  • Flexible working hours
  • Generous time off policies
  • Annual conference attendance and training/development budget
  • Market leading maternity and paternity schemes
  • Discounts for local pizzerias (of course)

 

The Hiring Process

Here’s what we expect the hiring process for this role to be, should all go well with your candidacy. This entire process is expected to take 1-3 weeks to complete and you’d be expected to start on a specific date.

  1. 30 minute introductory meeting
  2. 30 minute hiring manager meeting
  3. 60 minute pairing interview
  4. 45 minute interview
  5. 30 minute CTO interview
  6. Offer!

Pizza brings people together. Slice is no different. We’re an Equal Opportunity Employer and embrace a diversity of backgrounds, cultures, and perspectives. We do not discriminate on the basis of race, colour, gender, sexual orientation, gender identity or expression, religion, disability, national origin, protected veteran status, age, or any other status protected by applicable national, federal, state, or local law. We are also proud members of the Diversity Mark NI initiative as a Bronze Member.

Privacy Notice Statement of Acknowledgment

When you apply for a job on this site, the personal data contained in your application will be collected by Slice. Slice is keeping your data safe and secure. Once we have received your personal data, we put in place reasonable and appropriate measures and controls to prevent any accidental or unlawful destruction, loss, alteration, or unauthorised access. If selected, we will process your personal data for hiring/employment processes, as well as our legal obligations. If you are not selected for the job position and you have given consent on the question below (by selecting "Give consent") we will store and process your personal data and submitted documents (CV) to consider eligibility for employment for up to 365 days (one year). You have the right to withdraw your previously given consent for storing your personal data and CV in the Slice database considering eligibility for employment for a year. You have the right to withdraw your consent at any time. For additional information and/or to exercise your rights to the protection of personal data, you can contact our Data Protection Officer, e-mail: privacy@slicelife.com

See more jobs at slice

Apply for this job

4d

AI Data Science Associate Director

Blend36 - Montevideo, Uruguay, Remote
scala, sql, azure, python, AWS

Blend36 is hiring a Remote AI Data Science Associate Director

Job Description

 

  • Work with practice leaders and clients to understand business problems, industry context, data sources, potential risks, and constraints 
  • Problem-solve with practice leaders to translate the business problem into a workable Data Science solution; propose different approaches and their pros and cons  
  • Work with practice leaders to get stakeholder feedback, get alignment on approaches, deliverables, and roadmaps 
  • You will work as part of our global Data Science team to provide data-driven AI solutions for our customers using state-of-the-art Machine Learning methods and tools.
  • Create and maintain efficient data pipelines, often within clients’ architecture. Typically, data come from a wide variety of sources, internal and external, and are manipulated using SQL, Spark, and cloud big data technologies
  • Assemble large, complex data sets from client and external sources that meet functional business requirements. 
  • Build analytics tools to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.  
  • Perform data cleaning/hygiene, data QC, and integrate data from both client internal and external data sources on Advanced Data Science Platform. Be able to summarize and describe data and data issues  
  • Conduct statistical data analysis, including exploratory data analysis, data mining, and document key insights and findings toward decision making 
  • Train, validate, and cross-validate predictive models and machine learning algorithms using state of the art Data Science techniques and tools 
  • Document predictive models/machine learning results that can be incorporated into client-deliverable documentation 
  • Assist client to deploy models and algorithms within their own architecture 

Qualifications

 

  • MS degree in Statistics, Math, Data Analytics, or a related quantitative field 
  • 5+ years Professional experience in Advanced Data Science, such as predictive modeling, statistical analysis, machine learning, text mining, geospatial analytics, time series forecasting, optimization 
  • Demonstrated Experience with NLP and other components of AI
  • Experience implementing AI solutions
  • Experience with one or more Advanced Data Science software languages (Python, R, SAS)  
  • Proven ability to deploy machine learning models from the research environment (Jupyter Notebooks) to production via procedural or pipeline approaches 
  • Experience with SQL and relational databases, query authoring and tuning as well as working familiarity with a variety of databases including Hadoop/Hive 
  • Experience with Spark and DataFrames in PySpark or Scala
  • Strong problem-solving skills; ability to pivot complex data to answer business questions. Proven ability to visualize data for influencing. 
  • Comfortable with cloud-based platforms (AWS, Azure, Google) 
  • Experience with Google Analytics, Adobe Analytics, Optimizely a plus 

See more jobs at Blend36

Apply for this job

4d

Software Engineer (Scala/Kotlin/Typescript)

Tech9 - Remote
kotlin, scala, Design, scrum, typescript, angular, AWS

Tech9 is hiring a Remote Software Engineer (Scala/Kotlin/Typescript)


See more jobs at Tech9

Apply for this job

5d

Software Engineer, Compliance

Gemini - Remote (USA)
remote-first, scala, Design, AWS

Gemini is hiring a Remote Software Engineer, Compliance

About the Company

Gemini is a global crypto and Web3 platform founded by Tyler Winklevoss and Cameron Winklevoss in 2014. Gemini offers a wide range of crypto products and services for individuals and institutions in over 70 countries.

Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we help you buy, sell, and store your bitcoin and cryptocurrency. 

At Gemini, our mission is to unlock the next era of financial, creative, and personal freedom.

In the United States, we have a flexible hybrid work policy for employees who live within 30 miles of our office headquartered in New York City and our office in Seattle. Employees within the New York and Seattle metropolitan areas are expected to work from the designated office twice a week, unless there is a job-specific requirement to be in the office every workday. Employees outside of these areas are considered part of our remote-first workforce. We believe our hybrid approach for those near our NYC and Seattle offices increases productivity through more in-person collaboration where possible.

The Department: Service Fundamentals (Compliance)

The Role: Software Engineer

Responsibilities:

  • Develop new products and product features on the Gemini platform, as part of a tight knit team of seven to eight developers.
  • Write automated tests to ensure the operation and correctness of new product features.
  • Provide technical input and knowledge to the planning, design, and requirements process for new products and features.
  • Review other software engineers’ code for correctness, style, and information security concerns.
  • Improve the performance, maintainability, and operations of the Gemini codebase by engaging in occasional refactoring and upgrade projects.
  • Support your team’s production software by responding to an occasional alert or bug report.

Minimum Qualifications:

  • 2+ years of software engineering experience.
  • Proficiency with the JVM (Scala preferred).
  • The ability to adapt and handle multiple competing priorities in collaboration with peers.
  • A customer and product-focused mindset, with the ability to make well-reasoned tradeoffs between speed and quality.
  • A proven track record of working with distributed systems.
  • Familiarity with writing highly observable, well-monitored code.

Preferred Qualifications:

  • Familiarity with AWS cloud infrastructure.
  • Interest in working with Functional Programming paradigms.
  • Prior experience working with gRPC and/or protobuf.

It Pays to Work Here
 
The compensation & benefits package for this role includes:
  • Competitive starting salary
  • A discretionary annual bonus
  • Long-term incentive in the form of a new hire equity grant
  • Comprehensive health plans
  • 401K with company matching
  • Paid Parental Leave
  • Flexible time off

Salary Range: The base salary range for this role is between $120,000 - $150,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate’s compensation, we consider a number of factors including skillset, experience, job scope, and current market data.

At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.

Apply for this job

5d

[CSR] Test Automation Engineer

Software Mind - Bucharest, Romania, Remote
agile, scala, sql, RabbitMQ, ui, api, qa, java, python

Software Mind is hiring a Remote [CSR] Test Automation Engineer

Job Description

Project – the aim you’ll have

If you like QA and enjoy building scripts that make things click - we might have a spot for you. One of our projects is looking for a QA Automation Engineer with a focus on API Automation. You'll be expected to know your way around the service layer of an application, understand how automation works in theory and in code, and know the QA principles needed to contribute to an ever-growing automation test suite for business-level applications. In return, we'll provide the perfect technical playground in which you can enhance your skills and knowledge, so together we can bring really interesting projects to life.

Position - how you'll contribute

  • Participate in daily Agile team meetings.
  • Maintain and expand existing automated test scenarios.
  • Review and analyze results from nightly test runs.
  • Investigate and resolve failures in test runs, report defects as necessary.
  • Develop automated tests for new features.
  • Engage in code reviews, coverage assessments, and peer review sessions.
  • Analyze requirements for new features to fully understand expected behavior and interactions.
  • When possible, refactor and enhance the quality of existing automation frameworks.
  • Define test coverage by decomposing complex business features into testable automation scenarios.

Qualifications

Expectations - the experience you need

  • Minimum of two years of hands-on experience in Automation testing, preferably focused on API Automation using REST Assured.
  • Proven programming knowledge, adhering to coding best practices, with proficiency in at least one of the following languages: Java, Scala, Python, or a similar programming language.
  • Strong understanding of automation best practices and the role of automation in Quality Assurance, as well as CI and CD principles.
  • Demonstrated experience and proficiency in testing REST APIs, preferably with practical experience in testing APIs in previous projects.
  • Familiarity with Postman or other similar tools for API interaction.
  • Experience working with version control tools, particularly GIT.
  • Proficiency in database technologies, including SQL and/or NoSQL.

Additional skills - the edge you have

  • Basic knowledge of HTML and CSS.
  • Familiarity with UI Automation.
  • Experience working with Microservice architecture in conjunction with a message queue (e.g., Kafka, RabbitMQ, or similar technologies).
  • Excellent communication skills, both written and verbal, in English. Ability to effectively communicate with internal stakeholders at various levels and empathize with their perspectives.
  • Analytical and logical thinking skills, with a natural curiosity to explore the bigger picture. Ability to ask insightful questions to understand problems and assess the strengths and weaknesses of systems.

See more jobs at Software Mind

Apply for this job

5d

[VAD] Senior Data Engineer

Software Mind - Buenos Aires, Argentina, Remote
scala, postgres, AWS

Software Mind is hiring a Remote [VAD] Senior Data Engineer

Job Description

Project – the aim you have

We are looking for a Senior Data Engineer to help us build a solid and scalable data infrastructure and analytical environment. The ideal candidate enjoys data wrangling, optimizing data pipelines, and building solutions for data collection and analysis in parallelized and scaled-out environments. Our infrastructure currently processes billions of data points and hosts databases well into terabyte-size ranges, making this an interesting and challenging opportunity.

If you enjoy working with cutting-edge technologies in a fast-paced environment this opportunity is for you!

Qualifications

Expectations – the experience you need

  • 5+ years of experience in a Data Engineer role.
  • BS/BA in Computer Science.
  • Experience developing cloud-based pipelines.
  • Strong experience with Spark.
  • Experience with Scala, and exposure to Python.
  • Experience with developing and deploying cloud-based data solutions.
  • Experience in any of the following is considered to be an asset: streaming technology such as Kafka or Google Pub/Sub, Databricks, AWS / GCP, Postgres, Protobuf, Snowflake or other Data Warehousing solutions, and Test Driven Development.
  • Ability to operate with strong ownership and independence.

 

Our benefits 

  • Educational resources
  • Flexible schedule and Work From Anywhere
  • Referral Program 
  • Supportive and chill atmosphere

 

Position at: Software Mind Latam

See more jobs at Software Mind

Apply for this job

7d

Manager Engineering

Vericast - Austin, TX, Remote
scala, postgres, java, python

Vericast is hiring a Remote Manager Engineering

Job Description

Vericast is seeking a Manager of Software Engineering for our Advertising Optimization and Advertising Integrations and Data teams. You will provide technical and people leadership, be hands-on, own delivery of critical digital advertising features, and contribute to the roadmap for our adtech platform and overall company goals. If you want to work with great teams, have high-level input to your own roadmap, and low-level control of delivering a top-notch Digital Advertising Platform, we want to talk to you. If you want to leverage great ideas across many teams of developers to improve digital marketing product delivery, value, and profit, we want to talk to you. If you have ever worked on services and platforms and thought "we could make this more efficient", we want to talk to you.

The Ad Optimization team develops models and systems to optimize programmatic bidding, analyzing vast amounts of data (over 100 billion signals per day) to enhance ad serving efficiency. The team works alongside data scientists to create algorithms that make real-time bidding decisions and build controllers to measure success and adapt to industry changes. Using Scala and Python with Apache Spark, the team leverages extensive data sources to secure optimal ad placements.

The Ad Integrations and Data team builds and maintains pipelines handling data flowing into and out of our digital advertising system, processing large volumes of data from partners and vendors daily. We help make multiple terabytes of data and intelligence available to our digital advertising operations teams as well as clients. Our tech stack consists of Python with PySpark, Scala, Apache Flink, and Postgres for our reporting and dashboarding solutions.
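
As a rough illustration of the batch pipelines described above, here is a minimal, hypothetical Spark job in Scala that rolls partner delivery logs up into a daily per-campaign summary and lands it in Postgres for reporting. The input path, column names, target table, and credentials are assumptions made for the sketch, not Vericast's actual code.

```scala
// Hypothetical sketch only: aggregate partner ad-delivery logs with Spark and
// write a daily per-campaign summary to Postgres. All names are illustrative.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyCampaignRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-campaign-rollup")
      .getOrCreate()

    // Raw delivery events dropped by partners (assumed Parquet layout).
    val events = spark.read.parquet("s3://example-bucket/ad-delivery/2024-01-01/")

    // Roll impressions and spend up to one row per campaign per day.
    val rollup = events
      .groupBy(col("campaign_id"), to_date(col("event_ts")).as("event_date"))
      .agg(
        count(lit(1)).as("impressions"),
        sum(col("spend_micros")).as("spend_micros")
      )

    // Land the summary in the reporting Postgres instance used for dashboards.
    rollup.write
      .format("jdbc")
      .option("url", "jdbc:postgresql://reporting-db:5432/adtech")
      .option("dbtable", "daily_campaign_rollup")
      .option("user", sys.env.getOrElse("DB_USER", "reporting"))
      .option("password", sys.env.getOrElse("DB_PASSWORD", ""))
      .mode("append")
      .save()

    spark.stop()
  }
}
```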

 

Responsibilities:

  • Manage and lead teams of passionate data engineers responsible for the product development, maintenance, performance optimization, and scaling of our optimization algorithms, data ingestion, and ETL pipelines.
  • Set strategy and contribute to the roadmap for our adtech platform and overall company goals.
  • Monitor the performance of both teams and their respective components of the platform, and work to continuously improve performance and scalability.
  • Engage with Product Management and other Engineering teams to ensure client success.
  • Manage the hiring and development of team members, including setting performance goals, providing feedback and coaching, and conducting performance reviews.
  • Foster a culture of innovation and collaboration that encourages engineers to share ideas and feedback with each other.
  • Act as a technical advisor and mentor for team members, providing guidance and support as needed.
  • Remain current with industry trends and developments and leverage this knowledge to inform the direction of both teams and their respective components of the platform.

Qualifications

EDUCATION:

BA/BS in Computer Science or other technical discipline (e.g., Engineering, Mathematics, or Physics) and/or equivalent relevant and high-performing work experience

 

EXPERIENCE:

  • 3+ years of experience managing high-performing software teams. 
  • 5+ years of experience in software development and delivery.

 

KNOWLEDGE/SKILLS/ABILITIES:

  • Experience in coaching/mentoring software engineers
  • Strong organizational skills and ability to keep track of multiple streams of work at once.
  • Experience working with cross-functional teams, including product management and data science/analytics.
  • Familiarity with industry-standard tools and technologies such as Scala, Python, Java, Apache Spark, Postgres

See more jobs at Vericast

Apply for this job

10d

Principal Data Platform Architect (Data Applications), Field CTO

scala, sql, Design, azure, ruby, java, docker, kubernetes, python, AWS, javascript

snowflakecomputing is hiring a Remote Principal Data Platform Architect (Data Applications), Field CTO

Build the future of data. Join the Snowflake team.

There is only one Data Cloud. Snowflake’s founders started from scratch and designed a data platform built for the cloud that is effective, affordable, and accessible to all data users. But it didn’t stop there. They engineered Snowflake to power the Data Cloud, where thousands of organizations unlock the value of their data with near-unlimited scale, concurrency, and performance. This is our vision: a world with endless insights to tackle the challenges and opportunities of today and reveal the possibilities of tomorrow. 

Our Sales Engineering organization is seeking a Principal Data Platform Architect focused in Data Apps to join our Field CTO Office who can provide leadership in working with both technical and business executives in the design and architecture of the Snowflake Cloud Data Platform as a critical component of their enterprise data architecture and overall ecosystem.

In this role you will work with sales teams, product management, and technology partners to leverage best practices and reference architectures highlighting Snowflake’s Cloud Data Platform as a core technology enabling platform for the emerging Data Application workload throughout an organization.

As a Principal Data Platform Architect focused in Data Apps, you must share our passion and vision in helping our customers and partners drive faster time to insight through Snowflake’s Cloud Data Platform, thrive in a dynamic environment, and have the flexibility and willingness to jump in and get things done. You are equally comfortable in both a business and technical context, interacting with executives and talking shop with technical audiences. 

IN THIS ROLE YOU WILL GET TO: 

  • Apply your data application architecture expertise while presenting Snowflake technology and vision to executives and technical contributors at strategic prospects, customers, and partners
  • Work with our product management, partners, and sales teams in order to drive innovation in our Cloud Data Platform 
  • Partner with sales team and channel partners to understand the needs of our customers,  strategize on how to navigate and accelerate winning sales cycles, provide compelling value-based enterprise architecture deliverables and working sessions to ensure customers are set up for success operationally, and support strategic enterprise pilots / proof-of-concepts 
  • Collaborate closely with our Product team to effectively influence the Cloud Data Platform product roadmaps based on field team and customer feedback
  • Partner with Product Marketing teams to  spread awareness and support pipeline building via customer roundtables,  conferences, events, blogs, webinars, and whitepapers
  • Contribute to the creation of reference architectures, blueprints, best practices, etc. based on field experience to continue to up-level and enable our internal stakeholders and Snowflake Powered by Program

ON DAY ONE, WE WILL EXPECT YOU TO HAVE: 

  • 10+ years of full stack application architecture experience with a deep understanding of how to scale applications with large data sets
  • 3+ years of Cloud Provider (AWS, Azure, GCP) customer facing application development experience
  • 1+ years of Snowflake experience
  • 2+ years of experience in building/utilizing various data connectivity tools like connectors & drivers which enable access to a library of data sources.
  • Outstanding presentation skills to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos
  • Broad range of experience with on-prem and/or cloud databases
  • Experience with decomposing application components and determining how to leverage various technologies found within modern cloud services.
  • Knowledge of SQL, JavaScript, Python, Java, Scala, Go, Ruby, or other languages
  • Knowledge of platform and container technologies, including Docker, Kubernetes, or Infrastructure as Code (IaC)
  • Knowledge of data formats or languages that manage data, including SQL, JSON, AVRO, Apache Parquet or XML
  • Experience with open data table formats like Iceberg and data processing engines like Spark is a plus.
  • Bachelor’s degree required; Master’s degree in computer science, engineering, mathematics, or related fields, or equivalent experience, preferred

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. 

How do you want to make your impact?

Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.

See more jobs at snowflakecomputing

Apply for this job

12d

Data Intelligence Consultant - M/F - Permanent Contract (CDI)

Talan - Paris, France, Remote
tableau, scala, azure, AWS

Talan is hiring a Remote Data Intelligence Consultant - M/F - Permanent Contract (CDI)

Job Description

If you are looking for a 100% Data environment, welcome to the Data Intelligence Solutions practice - THE Data practice of the Talan group!

YOUR ROLE ON OUR PROJECTS:

Within project teams, you may be in charge of the following tasks:

- Leading large-scale projects in Data/Big Data/Cloud ecosystems

- Scoping requirements, design, and functional and technical analysis

- Benchmarking solutions and advising our clients on the technologies to adopt, in line with their needs

- Advising on data architecture, operations, and governance

- Full command of the platform, including installation and sizing

- Building applications, manipulating data, modelling, and developing reports and dashboards

- Writing functional and/or technical documentation

- Training users and project teams

- Taking part in pre-sales activities

YOUR ROLE AT TALAN:

- Building POCs (Proofs of Concept)

- Acting as an ambassador by taking part in partner events

- Contributing to internal projects and sharing knowledge within our teams.

Together, let's deliver new Talan-ted projects!

Qualifications

YOUR PROFILE:

With a Master's-level degree (Bac+5), you have at least 5 years of proven experience in data and the following skills:

- You have developed technical and functional expertise in one of the following technologies: MSBI, Power BI, Tableau, Informatica, Talend, Dbt, Hadoop, Spark, Scala, Hive, Kafka, Snowflake, Azure DB, AWS, GCP, Redshift, Athena, TM1, Jedox, Big Query, Looker…

- You master the fundamentals of business intelligence (modelling, data loading, reporting...) and/or big data

- You have already prepared and led business workshops

- You are able to grasp the business context

- Knowledge of management accounting is a plus

- You have trained users and project teams

- One or more certifications would be highly appreciated

- Fluent French (reading, writing, speaking)

- English is a plus.

YOUR DESIRED CAREER PATH:

If you are passionate about innovation and want to broaden your technical and functional data skills, move into project and team management roles, take part in business and organisational development, or simply have your initiatives valued and discover new playgrounds, then join us!

See more jobs at Talan

Apply for this job

12d

Data Engineer Azure

4 years of experience, 2 years of experience, agile, Bachelor's degree, tableau, jira, scala, airflow, postgres, sql, oracle, Design, mongodb, pytest, azure, mysql, jenkins, python, AWS

FuseMachines is hiring a Remote Data Engineer Azure

13d

Senior MLOps Engineer

EquipmentShare - Remote; Columbia, MO
Master’s Degree, scala, Design, azure, git, java, c++, docker, kubernetes, jenkins, python, AWS

EquipmentShare is hiring a Remote Senior MLOps Engineer

EquipmentShare is Hiring a Senior MLOps Engineer.

EquipmentShare is searching for a Senior MLOps Engineer for a remote opportunity supporting the Data Platform team as it continues to grow.  

Primary Responsibilities

  • Collaborate with scientists and data engineers to design, develop, and deploy scalable ML models and pipelines
  • Work with scientists to build and maintain end-to-end ML pipelines for data ingestion, preprocessing, feature engineering, model training, evaluation, and deployment.
  • Automate continuous integration and continuous deployment (CI/CD) pipelines for scientists to self-service model and feature deployment
  • Develop monitoring and logging solutions to track model performance, data quality, and system health in production environments
  • Work with Scientists to optimize ML workflows for efficiency, scalability, and cost-effectiveness.
  • Automate infrastructure provisioning, configuration management, and resource scaling using infrastructure-as-code (IaC) tools
  • Troubleshoot issues related to data, models, infrastructure, and deployments in production environments
  • Mentor junior members of the Data Platform team and provide technical guidance and best practices
  • Stay updated on the latest advancements and trends in MLOps, machine learning, and cloud technologies

Why We’re a Better Place to Work

  • Competitive salary.
  • Medical, Dental and Vision coverage for full-time employees.
  • 401(k) and company match.
  • Unlimited paid time off (PTO) plus company paid holidays.
  • Generous paid parental leave
  • State-of-the-art onsite gym (Corporate HQ) with instructor-led courses / gym stipend for remote employees.
  • Seasonal and year-round wellness challenges.
  • Company sponsored events (annual family gatherings, happy hours and more).
  • Volunteering and local charity initiatives that help you nurture and grow the communities you call home. Employees receive 16 hours of paid volunteer time per year. 
  • Opportunities for career and professional development with conferences, events, seminars and continued education. 

About You 

Our mission to change an entire industry is not easily achieved, so we only hire people who are inspired by the goal and up for the challenge. In turn, our employees have every opportunity to grow with us, achieve personal and professional success and enjoy making a tangible difference in an industry that’s long been resistant to change. 

Skills & Qualifications 

  • Bachelor’s or Master’s degree, or equivalent practical experience, in computer science, machine learning, software engineering, data engineering, DevOps, or related field
  • 5+ years of experience working in MLOps, machine learning engineering, software engineering, data engineering, DevOps, or related roles
  • Demonstrated knowledge and experience in building a modern data science, experimentation and machine learning technology stack for a data-driven company
  • Proficiency in programming languages such as Python, Java, or Scala
  • Strong understanding of cloud computing platforms (e.g., AWS, Azure, GCP) and containerization technologies (e.g., Docker, Kubernetes)
  • Knowledge of DevOps practices, including infrastructure automation, configuration management, and monitoring
  • Excellent problem-solving skills and attention to detail
  • Effective communication and collaboration skills, with the ability to work in a fast-paced, team-oriented environment
  • Experience with version control systems (e.g., Git) and CI/CD tools (e.g., Jenkins, GitLab CI/CD)
  • Experience with ML frameworks and libraries (e.g., TensorFlow, PyTorch, scikit-learn).
  • Experience with Snowflake is preferred but not required
  • Must be qualified to work in the United States or the United Kingdom - we are not sponsoring any candidates at this time

EquipmentShare is committed to a diverse and inclusive workplace. EquipmentShare is an equal opportunity
employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation,
protected veteran status, disability, age, or other legally protected status.

#LI-Remote

 

See more jobs at EquipmentShare

Apply for this job

14d

Machine Learning Platform Architect (US)

Signifyd - United States (Remote); New York City, NY
Bachelor's degree, 5 years of experience, 10 years of experience, scala, airflow, sql, Design, java, python, AWS

Signifyd is hiring a Remote Machine Learning Platform Architect (US)

Who Are You

We are seeking a highly skilled and experienced ML Platform Architect to join our dynamic and growing data platform team. As an ML Platform Architect, you will play a crucial role in strengthening and expanding the core of our data products. We want you to help us scale our business, drive data-driven decisions, and contribute to our overall data strategy. You will work alongside talented data platform engineers to envision how all the data elements from multiple sources should fit together and then execute that plan. The ideal candidate must:

  • Effectively communicate complex data problems by tailoring the message to the audience and presenting it clearly and concisely. 
  • Balance multiple perspectives, disagree, and commit when necessary to move key company decisions and critical priorities forward.
  • Have a profound comprehension of data quality, governance, and analytics.
  • Have the ability to work independently in a dynamic environment and proactively approach problem-solving.
  • Be committed to driving positive business outcomes through expert data handling and analysis.
  • Be an example for fellow engineers by showcasing customer empathy, creativity, curiosity, and tenacity.
  • Have strong analytical and problem-solving skills, with the ability to innovate and adapt to fast-paced environments.
  • Design and build clear, understandable, simple, clean, and scalable solutions.

What You'll Do

  • Modernize Signifyd’s Machine Learning (ML) Platform to scale for resiliency, performance, and operational excellence working closely with Engineering and Data Science teams across Signifyd’s R&D group.
  • Create and deliver a technology roadmap focused on advancing our data processing capabilities, which will support the evolution of our real-time data processing and analysis capabilities.
  • Work alongside ML Engineers, Data Scientists, and other Software Engineers to develop innovative big data processing solutions for scaling our core product for eCommerce fraud prevention.
  • Take full ownership of significant portions of our data processing products, including collaborating with stakeholders on machine learning models, designing large-scale data processing solutions, creating additional processing facets and mechanisms, and ensuring the support of low-latency, high-quality, high-scale decisioning for Signifyd’s flagship product.
  • Architect, deploy, and optimize Databricks solutions on AWS, developing scalable data processing solutions to streamline data operations and enhance data solution deployments.
  • Implement data processing solutions using Spark, Java, Python, Databricks, Tecton, and various AWS services (S3, Redshift, EMR, Athena, Glue).
  • Mentor and coach fellow engineers on the team, fostering an environment of growth and continuous improvement.
  • Identify and address gaps in team capabilities and processes to enhance team efficiency and success.

What You'll Need

  • Ideally has over 10 years of experience in data engineering, including at least 5 years as a data or machine learning architect or lead, and has successfully navigated the challenges of working with large-scale data processing systems.
  • Deep understanding of data processing, comfortable working with multi-terabyte datasets, and skilled in high-scale data ingestion, transformation, and distributed processing, with strong Apache Spark or Databricks experience.
  • Experience in building low-latency, high-availability data stores for use in real-time or near-real-time data processing with programming languages such as Python, Scala, Java, or JavaScript/TypeScript, as well as data retrieval using SQL and NoSQL.
  • Hands-on expertise in data technologies with proficiency in technologies such as Spark, Airflow, Databricks, AWS services (SQS, Kinesis, etc.), and Kafka. Understand the trade-offs of various architectural approaches and recommend solutions suited to our needs.
  • Experience with the latest technologies and trends in Data, ML, and Cloud platforms.
  • Demonstrable ability to lead and mentor engineers, fostering their growth and development. 
  • You have successfully partnered with Product, Data Engineering, Data Science and Machine Learning teams on strategic data initiatives.
  • Commitment to quality: you take pride in delivering work that excels in data accuracy, performance, and reliability, setting a high standard for the team and the organization.

#LI-Remote

Benefits in our US offices:

  • Discretionary Time Off Policy (Unlimited!)
  • 401K Match
  • Stock Options
  • Annual Performance Bonus or Commissions
  • Paid Parental Leave (12 weeks)
  • On-Demand Therapy for all employees & their dependents
  • Dedicated learning budget through Learnerbly
  • Health Insurance
  • Dental Insurance
  • Vision Insurance
  • Flexible Spending Account (FSA)
  • Short Term and Long Term Disability Insurance
  • Life Insurance
  • Company Social Events
  • Signifyd Swag

We want to provide an inclusive interview experience for all, including people with disabilities. We are happy to provide reasonable accommodations to candidates in need of individualized support during the hiring process.

Signifyd provides a base salary, bonus, equity and benefits to all its employees. Our posted job may span more than one career level, and offered level and salary will be determined by the applicant’s specific experience, knowledge, skills, and abilities, as well as internal equity and alignment with market data.

USA Base Salary Pay Range
$230,000 – $250,000 USD

See more jobs at Signifyd

Apply for this job

14d

Analista de Desenvolvimento de Software Java Sênior

ExperianSão Paulo, Brazil, Remote
scalanosqlpostgresDesignmongodbscrumgitjavadockerelasticsearchmysqljenkinsAWS

Experian is hiring a Remote Analista de Desenvolvimento de Software Java Sênior

Job Description

Area: Marketing Services Development
Subarea: Targeting/EDQ

What will your day-to-day look like?

  • You will work in a squad, actively participating in ceremonies, discussions, decision-making support, and conflict resolution;
  • You will co-create new solutions together with our product team;
  • You will be responsible for ensuring the quality and security of the delivered software;
  • Communicate the design in a way that other team members can understand;
  • Integrate the system with newly produced or modified software components.

What will your main deliverables be?

  • Develop software to meet internal needs;
  • Maintain existing solutions and propose improvements to them;
  • Participate in technical discussions to create high-quality, high-performance software;
  • Help design and architect software applications;
  • Implement technical best practices with quality and security;
  • Perform unit testing, functional testing, and test automation for the solutions you develop;
  • Follow the reference architecture guidelines;
  • Promote good practices and continuous learning;
  • Document software projects;
  • Reuse components.

Qualifications

What we are looking for in you!

  • Experience with and advanced knowledge of Java;
  • Experience with microservices architecture;
  • Experience with the Spring stack (Spring Framework 4.0+, Spring Boot, Spring Data, etc.);
  • Experience with relational databases (Postgres and MySQL);
  • Experience with AWS solutions and resources;
  • Experience with Maven;
  • Experience with version control using Git;
  • Experience with queues;
  • Experience with CI/CD (Jenkins);
  • Knowledge of automated testing;
  • Knowledge of Veracode;
  • Knowledge of/experience with agile frameworks (Scrum, Kanban, Lean);
  • Familiarity with containerization (Docker, Kubernetes);
  • Familiarity with monitoring (Grafana, Kibana);
  • Familiarity with responsive development;
  • Good written and verbal communication;
  • Someone who keeps up with what is new in the field, is curious, and is responsible.

Desirable knowledge of:

  • Marketing tools/solutions;
  • NoSQL databases (MongoDB and Elasticsearch);
  • Spark and/or Scala 2;
  • Hadoop (HDFS, MapReduce, Spark, Hive, HBase);

See more jobs at Experian

Apply for this job

16d

Embedded Escalation Engineer (EEE) - COSMOSDB - (GS)

ITScoutLATAM, AR Remote
agilescalanosqlsqlDesignmobileazurejavac++.netpythonjavascript

ITScout is hiring a Remote Embedded Escalation Engineer (EEE) - COSMOSDB - (GS)

⚠️Only available for #residents of #Latinamerica⚠️


Our client builds smart technology solutions by combining artificial intelligence, mobile, and web development for companies in the United States, Canada, and LATAM. It is a technology company headquartered in Costa Rica, with operations throughout LATAM. Their core focus is building intelligent tech solutions that help their customers become more efficient at optimizing internal digital processes.

About the job Embedded Escalation Engineer (EEE) - COSMOSDB

Overview

Interested in being on the cutting edge of Cloud Services helping build a NoSQL database service with limitless elastic scale? Then come join us as an Embedded Escalation Engineer (EEE) working with Azure Cosmos DB. This is a great opportunity to be part of a diverse, inclusive, agile team and have high impact.

Key Responsibilities:

As an Embedded Escalation Engineer (EEE), you will have the following key responsibilities:

  • Lead engineering investigations to bring quicker issue resolution to support incidents impacting our customers and improve customer experience.
  • Build solutions, help create tools, and help automate issue detection and diagnosis so that customers or support can self-resolve issues.
  • Identify emerging trends or recurring escalation scenarios and drive engineering opportunities to mitigate and/or eliminate them from the workflow. This can include a range of potential work item categories, such as self-healing mechanisms, self-serve, transparency, automation, and/or increasing the capabilities of Azure support.
  • Contribute to product improvements by filing impactful bugs, design change requests and helping developers to fix and ship them to production, preventing customers from being impacted.
  • As a trusted advisor to the Microsoft Azure engineering team and the Serviceability Technology Lead, you will suggest changes to future versions to better equip our support teams as well as our partners and customers and help influence in-market solutions today.
  • As a customer ambassador, you will also partner with engineering leadership for strategic technical, architectural and design discussions, and drive strategic thought leadership for Azure Diagnostics tools creation and usage worldwide bringing the customer voice to the center of impactful decisions. These strategic areas of focus will target our highest impact pain points for our partners, customers and team members.
  • Able to work well in challenging situations while exhibiting flexibility and the ability to tolerate and manage through ambiguity and uncertainty.
  • Beyond extensive technical and product focus, this role requires the ability to frame and communicate issues and recommendations clearly and concisely, show exceptional attention to detail, and demonstrate the ability to build broad relationships with the right influencers, leveraging those relationships to impact key business results.

Required Qualifications:

  • 3+ years of experience in a customer-facing or support role in any of the following: technical escalation support, product support, developer support, IT DevOps, IT Admin/support, Systems Development, or Consulting or IT/Network Operations.
  • 2+ years of experience in one or more of the following:
    • Previous experience working with NoSQL
    • Experience with Hadoop or another Big Data/Analytics technology
  • Microsoft Azure Platform:
    • Cloud Computing
    • Microsoft Azure architecture and its components (Fabric, Compute, Storage, RDOS, Management Portal)
  • Microsoft Big Data services
    • Java, JavaScript, Python, R, Scala, REST concepts, C/C++, and debugging (a small Scala example follows this list)
  • Familiarity with development: tools, language, process, methods, troubleshooting
  • Experience with Data Integration solutions and services
  • Experience with Open Source technology preferred
  • Development/Coding:
    • Experience with C#, Java, .NET, PowerShell, CLI, Microsoft Azure SQL
    • Service engineering and/or DevOps experience at internet scale involving user data and/or software development for an enterprise level product.
  • Superior problem solving and troubleshooting skills, an ability to use various data collection tools and methodologies to analyze problems and develop solutions
  • Excellent spoken and written English
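
Purely as an illustration of the Cosmos DB and big-data skills listed above, here is a minimal sketch that loads a Cosmos DB container into a Spark DataFrame using the Azure Cosmos DB Spark 3 OLTP connector. The endpoint, database, and container names are placeholders, and the connector option keys are quoted from memory, so check them against the connector documentation for the version in use.

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch; assumes the azure-cosmos-spark connector is on the classpath.
// Endpoint, key, database, and container values are placeholders.
object CosmosReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("cosmos-read-sketch")
      .getOrCreate()

    val cosmosConfig = Map(
      "spark.cosmos.accountEndpoint" -> "https://<account>.documents.azure.com:443/",
      "spark.cosmos.accountKey"      -> sys.env.getOrElse("COSMOS_KEY", ""),
      "spark.cosmos.database"        -> "exampleDatabase",
      "spark.cosmos.container"       -> "exampleContainer"
    )

    // Read the container into a DataFrame and inspect a few documents.
    val df = spark.read.format("cosmos.oltp").options(cosmosConfig).load()
    df.show(10, truncate = false)

    spark.stop()
  }
}
```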

Preferred Qualifications:

  • Experience in a Tier 2/3 environment is preferred.
  • A BS in computer science or engineering, or equivalent industry experience, is preferred.

Soft Skills:

  • Passion for technology and customer supportability.
  • Leadership - handle technically challenging and politically hot customer situations.
  • Strong communication skills - excellent spoken and written English and the ability to present complex technical issues clearly and concisely to a general audience.
  • Ability to drive meetings and discussions remotely with authority.
  • Ability to develop and nurture relationships over long distances and remote technologies like Skype.
  • Ability to partner within virtual teams and execute multiple technical initiatives simultaneously.
  • Ability to work collaboratively with the Engineering teams to drive architectural changes to improve stability of environments.
  • Ability to prioritize core role responsibilities vs. other work requests received.
  • Logical and critical thinking.
  • Ability to deal with ambiguity under continual deadline constraints.


See more jobs at ITScout

Apply for this job

16d

Machine Learning Engineer, New Grad

SamsaraRemote - US
scalajavac++pythonbackend

Samsara is hiring a Remote Machine Learning Engineer, New Grad

Who we are

Samsara (NYSE: IOT) is the pioneer of the Connected Operations™ Cloud, which is a platform that enables organizations that depend on physical operations to harness Internet of Things (IoT) data to develop actionable insights and improve their operations. At Samsara, we are helping improve the safety, efficiency and sustainability of the physical operations that power our global economy. Representing more than 40% of global GDP, these industries are the infrastructure of our planet, including agriculture, construction, field services, transportation, and manufacturing — and we are excited to help digitally transform their operations at scale.

Working at Samsara means you’ll help define the future of physical operations and be on a team that’s shaping an exciting array of product solutions, including Video-Based Safety, Vehicle Telematics, Apps and Driver Workflows, Equipment Monitoring, and Site Visibility. As part of a recently public company, you’ll have the autonomy and support to make an impact as we build for the long term. 

Recent awards we’ve won include:

Glassdoor's Best Places to Work 2024

Best Places to Work by Built In 2024

Great Place To Work Certified™ 2023

Fast Company's Best Workplaces for Innovators 2023

Financial Times The Americas’ Fastest Growing Companies 2023

We see a profound opportunity for data to improve the safety, efficiency, and sustainability of operations, and hope you consider joining us on this exciting journey. 

Click here to learn more about Samsara's cultural philosophy.

About the role:

The Samsara AI team builds end-to-end ML applications to power different product pillars at Samsara. As a Machine Learning Engineer, you will be responsible for developing ML solutions to increase the safety, efficiency and sustainability of the physical operations. You will work closely with various engineering teams across ML, full-stack, firmware as well as cross functional partners to deliver core infrastructure, services, and optimizations.

This role is open to candidates residing in the US except the San Francisco Bay Area (125 mi. radius from 1 De Haro St, San Francisco) and NYC Metro Area (50 mi. radius from 131 W 55th St, New York).

You should apply if:

  • You want to impact the industries that run our world: The software, firmware, and hardware you build will result in real-world impact—helping to keep the lights on, get food into grocery stores, and most importantly, ensure workers return home safely.
  • You want to build for scale: With over 2.3 million IoT devices deployed to our global customers, you will work on a range of new and mature technologies, driving scalable innovation for customers across the industries that power the world's physical operations.
  • You are a life-long learner: We have ambitious goals. Every Samsarian has a growth mindset as we work with a wide range of technologies, challenges, and customers that push us to learn on the go.
  • You believe customers are more than a number: Samsara engineers enjoy a rare closeness to the end user, and you will have the opportunity to participate in customer interviews, collaborate with customer success and product managers, and use metrics to ensure our work is translating into better customer outcomes.
  • You are a team player: Working on our Samsara Engineering teams requires a mix of independent effort and collaboration. Motivated by our mission, we’re all racing toward our connected operations vision, and we intend to win—together.

Click here to learn about what we value at Samsara.

In this role, you will: 

  • Build and improve the accuracy of ML models, including retraining and optimizing open-source models to solve Samsara-specific problems
  • Work with petabyte-scale data from Samsara camera and sensor devices to develop new models
  • Optimize models for inference on the backend and/or on edge devices
  • Partner with hardware and full-stack teams to deploy models for optimal performance and cost
  • Stay connected to industry and academic research and adopt novel technology that suits Samsara’s needs.
  • Collaborate with PM to translate customer needs to ML solutions
  • Champion, role model, and embed Samsara’s cultural principles (Focus on Customer Success, Build for the Long Term, Adopt a Growth Mindset, Be Inclusive, Win as a Team) as we scale globally and across new offices

Minimum requirements for the role:

  • BS or MS in Computer Science or other relevant field
  • Proficiency in one or more common languages (e.g., C++, Golang, Java, Python, Scala) 
  • Proficiency with common ML tools (e.g., Spark, TensorFlow, PyTorch); a small Spark MLlib sketch follows this list
  • Comfortable with full-stack/backend development, with enough depth to build a strong understanding of the underlying data structures and other dependencies
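
To make the "common ML tools" line concrete, here is a small, hypothetical sketch of training a logistic-regression classifier with Spark MLlib in Scala; the input path, feature columns, and label are made up and simply stand in for whatever labeled data a real pipeline would use.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.VectorAssembler

// Minimal Spark MLlib sketch; input path, columns, and label are hypothetical.
object TrainEventClassifier {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("train-event-classifier")
      .getOrCreate()

    // Load labeled sensor features (hypothetical schema: speed, accel, label).
    val data = spark.read.parquet("s3://example-bucket/labeled_events/")

    // Assemble raw numeric columns into the single feature vector MLlib expects.
    val assembled = new VectorAssembler()
      .setInputCols(Array("speed", "accel"))
      .setOutputCol("features")
      .transform(data)

    val Array(train, test) = assembled.randomSplit(Array(0.8, 0.2), seed = 42)

    // Fit a simple classifier and check accuracy on the held-out split.
    val model = new LogisticRegression().setLabelCol("label").fit(train)
    val accuracy = model.transform(test)
      .selectExpr("avg(cast(prediction = label as double)) as acc")
      .first()
      .getDouble(0)

    println(s"held-out accuracy = $accuracy")
    spark.stop()
  }
}
```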

An ideal candidate also has:

  • Experience building, deploying, and optimizing ML models on the edge
  • Experience building end-to-end ML applications from scratch 
  • Expertise in optimizing distributed model training with GPUs
  • Ph.D. in Computer Science or quantitative discipline (e.g., Applied Math, Physics, Statistics)

#LI-DNI

Samsara’s Compensation Philosophy: Samsara’s compensation program is designed to deliver Total Direct Compensation (based on role, level, and geography) that is at or above market. We do this through our base salary + bonus/variable + restricted stock unit awards (RSUs) for eligible roles. For eligible roles, a new hire RSU award may be awarded at the time of hire, and additional RSU refresh grants may be awarded annually.

We pay for performance, and top performers in eligible roles may receive above-market equity refresh awards which allow employees to achieve higher market positioning.

The range of annual base salary for full-time employees for this position is below. Please note that base pay offered may vary depending on factors including your city of residence, job-related knowledge, skills, and experience.
$109,480 – $149,000 USD

At Samsara, we welcome everyone regardless of their background. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, gender, gender identity, sexual orientation, protected veteran status, disability, age, and other characteristics protected by law. We depend on the unique approaches of our team members to help us solve complex problems. We are committed to increasing diversity across our team and ensuring that Samsara is a place where people from all backgrounds can make an impact.

Benefits

Full time employees receive a competitive total compensation package along with employee-led remote and flexible working, health benefits, Samsara for Good charity fund, and much, much more. Take a look at our Benefits site to learn more.

Accommodations 

Samsara is an inclusive work environment, and we are committed to ensuring equal opportunity in employment for qualified persons with disabilities. Please email accessibleinterviewing@samsara.com or click here if you require any reasonable accommodations throughout the recruiting process.

Flexible Working 

At Samsara, we embrace a flexible working model that caters to the diverse needs of our teams. Our offices are open for those who prefer to work in-person and we also support remote work where it aligns with our operational requirements. For certain positions, being close to one of our offices or within a specific geographic area is important to facilitate collaboration, access to resources, or alignment with our service regions. In these cases, the job description will clearly indicate any working location requirements. Our goal is to ensure that all members of our team can contribute effectively, whether they are working on-site, in a hybrid model, or fully remotely. All offers of employment are contingent upon an individual’s ability to secure and maintain the legal right to work at the company and in the specified work location, if applicable.

Fraudulent Employment Offers

Samsara is aware of scams involving fake job interviews and offers. Please know we do not charge fees to applicants at any stage of the hiring process. Official communication about your application will only come from emails ending in ‘@samsara.com’ or ‘@us-greenhouse-mail.io’. For more information regarding fraudulent employment offers, please visit our blog post here.

Apply for this job