We are seeking a Snowflake Engineer to join our team as we expand our data architecture and engineering capabilities. This role focuses on creating and maintaining data warehouses, ETL pipelines, and robust data models on Snowflake, with a particular emphasis on AWS cloud infrastructure. The engineer will support multiple projects that leverage Snowflake and associated cloud technologies for scalable data growth and analytics.
As part of our dynamic data engineering team, you will design and implement scalable data solutions that improve data flow and accessibility across enterprise systems, supporting efficient, data-driven decision-making.
Key Responsibilities
- Develop and manage Snowflake data warehouses to support business analytics and data-driven initiatives.
- Design, build, and optimize data ingestion pipelines using Snowpipe and other Snowflake-native tools to ensure reliable ETL processes.
- Convert data architectures into physical designs, maintaining data structures and processing pipelines for data extraction, transformation, and loading (ETL).
- Collaborate closely with cross-functional teams, including Data Architects, Business Stakeholders, and QA, to ensure data quality and optimize ETL processes.
- Consult on project requirements, translating technical and non-functional requirements into efficient data models.
- Utilize tools such as Informatica, DataStage, or other ETL platforms to design source-to-target mappings and optimize data load performance.
- Troubleshoot, tune, and optimize SQL queries and database performance in collaboration with DBA teams.
- Mentor junior team members, providing guidance on Snowflake best practices, SQL optimization, and cloud data solutions.
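To illustrate the kind of ingestion work this role involves, the following is a minimal Snowpipe sketch, not a project deliverable; all object names (raw_events, events_stage, events_pipe) and the S3 path are hypothetical:

```sql
-- Illustrative sketch only: table, stage, pipe, and bucket names are hypothetical.
-- Landing table that holds semi-structured JSON payloads as-is.
CREATE OR REPLACE TABLE raw_events (
  payload   VARIANT,
  loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

-- External stage pointing at the source files (credentials/storage
-- integration would be configured separately in a real environment).
CREATE OR REPLACE STAGE events_stage
  URL = 's3://example-bucket/events/'
  FILE_FORMAT = (TYPE = JSON);

-- Snowpipe that continuously copies newly staged files into the table.
CREATE OR REPLACE PIPE events_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_events (payload)
  FROM @events_stage
  FILE_FORMAT = (TYPE = JSON);
```

In practice, AUTO_INGEST also requires cloud event notifications (e.g. S3 event notifications via SQS) to be wired up so Snowpipe is triggered as files arrive.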
Requirements
- Experience: 7+ years in data engineering, including experience in enterprise-level data strategy, modeling, and integration.
- Technical Skills:
  - Proven expertise in Snowflake data warehousing and AWS cloud solutions.
  - Proficiency in data integration, profiling, and data quality management.
  - Strong SQL skills with experience in complex query tuning, DDL execution, and database performance optimization.
  - Hands-on experience with ETL platforms such as Informatica, DataStage, or similar tools.
  - Experience in data modeling for logical, physical, and dimensional models.
- Nice-to-Have:
  - Databricks experience and familiarity with machine learning workflows.
Additional Qualifications
- Bachelor’s degree in Computer Science, Data Engineering, or a related field; a Master’s degree is a plus.
- Strong communication skills, with an ability to translate technical concepts for business stakeholders and cross-functional team members.
- Demonstrated adaptability, curiosity for new technology, and collaborative team engagement.
Equal opportunity employer including disability/veterans.