Company: Bayone
Location: Pleasanton
Closing Date: 07/11/2024
Hours: Full Time
Type: Permanent
Job Requirements / Description
Key Responsibilities:
- Assist in the design, development, and maintenance of ETL (Extract, Transform, Load) pipelines to move data from various sources into data warehouses and databases.
- Collaborate with senior data engineers to improve and optimize existing data systems and architectures.
- Develop and maintain data models and structures to ensure efficient querying and reporting.
- Troubleshoot data-related issues, including discrepancies, delays, or failures in data pipelines.
- Participate in data cleansing efforts to ensure data quality and integrity.
- Implement data integration processes to support the requirements of business teams and data analysts.
- Work with cloud data platforms (e.g., AWS, Azure, GCP) to deploy data pipelines and manage data storage.
- Help monitor data flows, logs, and system performance to ensure smooth data operations.
- Contribute to the creation of technical documentation, including pipeline workflows and data schemas.
- Support senior team members in ongoing data engineering projects and initiatives.
Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Engineering, Mathematics, or a related field.
- Basic knowledge of SQL and experience with relational databases (e.g., MySQL, PostgreSQL, or similar).
- Familiarity with data processing tools (e.g., Apache Spark, Hadoop) and programming languages like Python or Java.
- Exposure to cloud platforms like AWS, Google Cloud, or Azure is a plus.
- Familiarity with version control systems (e.g., Git).
- Basic understanding of ETL concepts and pipeline design.
- Strong problem-solving skills and attention to detail.
- Eagerness to learn new tools and technologies in data engineering.