Company:
Karkidi
Location: Snowflake
Closing Date: 21/10/2024
Salary: £125 - £150 Per Annum
Hours: Full Time
Type: Permanent
Job Requirements / Description
What you will do:
- Design, build, and operate highly performant and scalable batch and stream data processing infrastructure and solutions to support day-to-day ML operations, including training, serving, evaluation, and experimental systems.
- Design and develop Moveworks’ foundational data models, data warehouse, and real-time and offline processing pipelines using AWS EMR (Spark), Apache Kafka, AWS Athena, Snowflake, Airflow, Apache Hudi, etc.
- Work closely with the machine learning and data science teams to understand their data needs, influence the data team’s roadmap, and both lead and execute various projects.
- Build a data lake and implement a data cataloging platform for easy data discovery and availability.
- Architect and implement data anonymization and data access control frameworks that support policy-based masking of, and access to, data for different use cases.
- Build out platform and data services/APIs to make data available to various stakeholders and for customer-facing data products.
What you bring to the table:
- 6+ years of experience as a software engineer, including at the senior level
- Experience with Python, Golang, Java, or C++
- Experience with cloud infrastructure like AWS, GCP, or Azure
- Experience with relational or non-relational data stores such as Postgres, AWS data lake/S3, DynamoDB, or Snowflake
- BS or higher in Computer Science or a related field.