Job Description
NextSense is a cloud-based engineering organization with all of our data infrastructure in Google Cloud (GCP), leveraging BigQuery, Compute Engine, and Kubernetes. This is a great opportunity to come in and help us redesign and rework our existing data infrastructure.
Responsibilities:
- Understand the data needs of different engineering teams at NextSense, including HWE, Clinical, and Research
- Understand end-to-end data interactions and dependencies across our existing data pipelines and data transformations, and how they impact business decisions.
- Design best practices for big data processing, data modeling, and warehouse development at NextSense.
- Build integration and automation across different systems to scale data operations and increase efficiency.
Qualifications:
- 3+ years building data warehouses and data pipelines, or 5+ years in data-intensive engineering roles
- Solid understanding of databases (relational, key/value, document, columnar, OLAP, graph)
- Strong software engineering principles and object-oriented programming experience
- Experience with GCP, Kubernetes
- Experience building and running large-scale distributed systems in production (24x7 environments)
- Experience working with cloud distributed storage/databases and/or data technologies that power analytics (e.g., Pinot, Druid, Redshift, Hadoop, Spark, Presto, Kafka, Flink, or similar)
- Experience building microservices and cloud platforms on AWS, Azure, etc.
- Experience with open source project management and governance
- Hands-on experience developing distributed systems, databases, or other large-scale data systems