Company: Robert Half
Location: Troy
Closing Date: 15/10/2024
Hours: Full Time
Type: Standard
Job Requirements / Description
Job Summary:
As a Data Engineer, you will play a key role in building a Unified Data Platform that empowers our clients with advanced analytics and business intelligence. You will design, develop, and maintain data pipelines, data lakes, and platforms using technologies such as Spark, Kafka, AWS, Azure, and Kubernetes to tackle large-scale data challenges. Collaborating with cross-functional teams, you will ensure data quality, reliability, and usability to support data-driven decision-making.
Key Responsibilities:
- Build automated pipelines to extract and process data from legacy systems, primarily using SQL Server and AWS Glue.
- Implement business logic on modern data platforms (AWS Glue, Databricks, Azure) using best practices.
- Create vector databases, data marts, and design data models to support business needs.
- Optimize and monitor the performance, reliability, and security of data systems.
- Integrate and transform data from structured, unstructured, streaming, and batch sources.
- Develop and maintain data quality checks, tests, and comprehensive documentation.
- Support data analysis and visualization efforts using tools such as SQL, Python, Tableau, and Amazon QuickSight.
- Stay informed on new data technologies and trends to continuously improve our data platforms.
Qualifications and Skills:
- Bachelor’s degree or higher in Computer Science, Engineering, Mathematics, or a related field.
- 5+ years of experience in data engineering or a similar role (DBA experience is a plus).
- Proficiency with big data frameworks and tools like Spark, Hadoop, Kafka, and Hive.
- Expert knowledge of SQL, including query optimization, schema design, DDL, and stored procedures.
- Proficient in at least one programming language (Python, Go, or Java).
- Experience with CI/CD, containerization (Docker, Kubernetes), and orchestration tools (Airflow).
- Hands-on experience with modern ETL/ELT systems and data platforms (AWS Glue, Databricks, Snowflake, Elastic, Azure Cognitive Search).
- Experience deploying data infrastructure on cloud platforms (AWS, Azure, or GCP).
- Strong understanding of data governance, data security, and data quality principles.
- Excellent communication, collaboration, and problem-solving skills.