Company:
Serigor Inc
Location: Washington
Closing Date: 20/10/2024
Salary: £125 - £150 Per Annum
Hours: Full Time
Type: Permanent
Job Requirements / Description
Job Title: ETL Data Engineer (ONSITE)
Location: Washington, DC
Duration: 12 Months+
The Enterprise Data team at the client requires an ETL data engineer to support data operations for its Cloud Data Exchange. The resource will use native Azure tools to perform ETL, data loading, and data transformation tasks.
The ETL data engineer will support the client's Enterprise Data team with data curation, processing, and transformation. Specifically, the ETL data engineer will be responsible for the following:
Responsibilities:
- Analyzes, designs, develops, implements, replicates, and supports complex enterprise data projects.
- Interfaces with other agencies, consults with and informs user departments on system requirements, advises on environment constraints and operating difficulties for the current state, and resolves problems using cloud solutions.
- Applies strong knowledge of Extract, Transform, and Load (ETL) processes using frameworks such as Azure Data Factory, Synapse, Databricks, and Informatica.
- Establishes cloud and on-premises connectivity across systems such as ADLS, ADF, Synapse, and Databricks.
- Applies hands-on experience with Azure cloud services such as Azure Data Factory, Azure Synapse, MS SQL Server, Azure SQL Database, Azure Data Lake Storage Gen2, and Blob Storage, as well as Python.
- Creates end-to-end pipelines to load data from multiple sources into landing layers or SQL tables.
- Familiarity/experience with data integration and data pipeline tools (e.g., Informatica, Synapse, Apache NiFi, Apache Airflow).
- Familiarity/experience with various data formats, including database-specific (Oracle, SQL Server, DB2, Quickbase), text (CSV, XML), and binary (Parquet, Avro).
- Develops, standardizes, and optimizes existing data workflow/pipelines adhering to best practices.
- Adheres and contributes to enterprise data governance standards by ensuring data accuracy, consistency, security, and reliability.
- Automates, monitors, alerts, and manages data pipelines and workflows.
- Analyzes and evaluates system changes to determine feasibility, provides alternative solutions, back-up, and rollback procedures.
- Works on the development of new systems, upgrades, and enhancements to existing systems.
- Develops complex programs and reports in database query language.
- Familiarity/experience with data visualization tools.
- Familiarity/experience handling and securing sensitive data.
Minimum Education/Certification Requirements:
- Bachelor’s degree in Information Technology or related field or equivalent experience.
Skills:
Skill | Required / Desired | Experience |
Strong knowledge of Extract-Transform-Load (ETL) process development, including end-to-end pipelines loading data from multiple sources. | Required | 15 Years |
Ability to gather and document requirements for data extraction, transformation, and load processes. | Required | 15 Years |
Understanding of data warehousing, data lake, business intelligence, and information management concepts and standards. | Required | 15 Years |
Ability to advise internal and external customers on appropriate tools and systems for complex data processing challenges. | Required | 15 Years |
Knowledge and use of SQL for relational databases. | Required | 11 Years |
Experience with various data formats, including database-specific (Oracle, SQL Server, Postgres, DB2), text (CSV, XML), and binary (Parquet, Avro). | Required | 11 Years |
Contribution to enterprise data governance standards by ensuring accuracy, consistency, security, and reliability. | Required | 7 Years |
Strong experience with Microsoft Azure tools such as Azure Data Factory, Synapse, SQL Database, Data Lake Storage Gen2, and Blob Storage. | Required | 5 Years |
Experience with data integration and data pipeline tools such as Informatica PowerCenter, Apache NiFi, Apache Airflow, and FME. | Required | 5 Years |
Experience with visualization and reporting software such as MicroStrategy, Tableau, and Esri ArcGIS. | Highly desired | 3 Years |
Strong communication skills, both oral and written. | Required | 3 Years |
Ability to provide excellent customer service to external partners. | Required | 3 Years |
Ability to work independently or as part of a larger team. | Required | 3 Years |
Experience performing data functions with Databricks. | Highly desired | 3 Years |