Job Description
We are currently looking for an IT Lead ETL Engineer to join our growing team! In this role, you will develop modern data pipelines to meet key business objectives. We are looking for an enthusiastic, motivated ETL engineer who can demonstrate a proven ability to organize and prioritize tasks, resolve technical issues, and think proactively, along with extensive experience building scalable data platforms.
The Data Warehouse team uses a wide range of technologies, including ETL tools such as Informatica, as well as Python, Hadoop/Hive, Spark, BigQuery, and Snowflake.
As an IT Lead ETL Engineer, your duties and responsibilities will include:
- Design and implement ETL processes and controls to ingest, organize, and curate large volumes of data from a variety of sources
- Design ETL solutions for Data Warehouse/Data Mart/Data Lake
- Build streaming and batch data pipelines
- Work closely with cross-functional teams in IT and with business partners to build ETL solutions
- Implement ETL solutions for large volumes of data by applying data warehouse modeling techniques such as Star Schema
- Build reusable code, components, and services that incorporate versioning, reconciliation, and exception handling
- Communicate effectively with technical and non-technical stakeholders
Qualifications
WHAT IT TAKES TO CATCH OUR EYE:
- Bachelor’s Degree in Computer Science or a relevant field plus 8+ years of progressive experience in designing, developing, and deploying Data Warehouse and Big Data solutions
- 5+ years of experience designing, building, and supporting real-time and batch data pipelines and analytical solutions with Hadoop (Hive, Kafka, Spark, Spark Streaming, HBase), Teradata, ETL tools such as Informatica, Talend, and Data Fusion, Python/Java, and Oracle/Exadata
- 1+ years of experience in delivering data solutions on cloud platforms (GCP or Azure)
- Exceptional communication and stakeholder management skills, with the ability to drive actionable insights that facilitate cross-functional initiatives and business outcomes
- Excellent planning skills, including the ability to organize, prioritize and control job responsibilities to meet deadlines in an environment with overlapping and potentially conflicting priorities
- Analytical, troubleshooting, and problem-solving abilities
- Experience with Hadoop cluster creation and management
- Ability to expertly solve complex technical problems
- Expert knowledge of non-functional requirements including security, scalability, and usability
- Expert knowledge of data design patterns, principles, and practices
- Demonstrated ability to develop and support a scalable data platform
- Comfortable working in a dynamic, start-up culture
- Telecom experience desired
See more jobs at Brightspeed