Work Authorization: US Citizen, Green Card, H-1B, GC-EAD, L2-EAD, TN Visa, E-3 Visa, CPT
US citizens (USC) and permanent residents (GC) are encouraged to apply. We are unable to offer visa sponsorship for this position.
Local candidates preferred. Non-local candidates must be willing to cover their own interview travel and relocation costs.
Job Description:
Location: Reston, VA
Duration: 9 Months
Rate: $60/hr on C2C (max)
Interview: Face-to-face (required)
Required Skills: ETL, scripting (Python or Bash), experience in a Cloud/LAMP environment, Linux, APIs.
Apex Systems, on behalf of our client, is looking for an ETL Engineer to join their team in their Reston, VA office! The Engineering Analysis ETL (Extract, Transform, Load) and Data Harvest function implements technical ETL designs and follows best practices and standards to collect data from various external sources and load it into central data stores. This function enables data analysts to create holistic visualizations and forecasts and to perform ad-hoc analyses, giving business groups deeper insight into their products.
We are looking for experience using Python, Perl, Bash, or Ruby for ETL. Experience with a Cloud/LAMP environment is required.
The person in this technical role will be expected to work with technical leads to develop, maintain, and run the team's various ETL and data harvest jobs. This includes working with data architects to gather requirements and implement them as enterprise-grade ETL solutions. The role requires working with different types of data sources and databases, adjusting and translating data fields as needed, and loading data into relational databases using scripting languages for automation. Candidates must have a passion for working with data in various formats across different database and data store technologies and frameworks.
Core Responsibilities:
- Participate in projects related to ETL and data harvesting.
- Work with data architects to translate data harvesting designs into ETL processes and systems.
- Use SQL to perform computations and data extracts from large databases.
- Create automation in scripting languages to run ETL jobs on a schedule.
- Follow team best practices to validate that data is loaded correctly and accurately.
- Translate proof-of-concept and prototype ETL scripts into enterprise-grade ETL systems.
- Apply coding standards and best practices for version control, error logging, and code documentation.
- Some travel.
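As an illustration only (not a requirement of the posting), the extract-transform-load flow described in the responsibilities above can be sketched in Python. The feed format, field names, and SQLite target here are hypothetical stand-ins for the external sources and relational stores the role works with:

```python
import csv
import io
import sqlite3

# Hypothetical raw feed -- in practice this would come from an external
# API or scheduled file drop; the fields are illustrative only.
RAW_CSV = """product_id,units_sold,revenue
A100,12,240.00
A200,7,161.00
A100,3,60.00
"""

def extract(raw):
    """Extract: parse the raw CSV feed into dict rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: cast fields to native types and aggregate by product."""
    totals = {}
    for row in rows:
        pid = row["product_id"]
        units, revenue = totals.get(pid, (0, 0.0))
        totals[pid] = (units + int(row["units_sold"]),
                       revenue + float(row["revenue"]))
    return [(pid, u, r) for pid, (u, r) in sorted(totals.items())]

def load(records, conn):
    """Load: upsert the aggregated records into a relational store."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS product_sales "
        "(product_id TEXT PRIMARY KEY, units_sold INTEGER, revenue REAL)")
    conn.executemany(
        "INSERT OR REPLACE INTO product_sales VALUES (?, ?, ?)", records)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW_CSV)), conn)
    for row in conn.execute("SELECT * FROM product_sales ORDER BY product_id"):
        print(row)
```

In a production setting the same three stages would be scheduled (e.g. via cron) and pointed at enterprise databases rather than an in-memory SQLite store.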
Tasks:
- Create ETL processes from designs and specifications within engineering projects and initiatives.
- Use in-depth technical skills and knowledge to solve difficult development problems and achieve team goals.
- Establish and maintain best practices for best-in-class ETL and data harvesting.
- Exercise consistent independent judgment and discretion in matters of significance.
- Maintain regular, consistent, and punctual attendance.
- Be able to work nights, weekends, and variable schedules as necessary.
- Perform other duties and responsibilities as assigned.
Qualifications:
- Bachelor’s degree in engineering, mathematics, computer science, or data science; a graduate or post-graduate degree is a plus.
- A minimum of 5 years of experience with scripting languages such as Bash, Ruby, Python, or Node.js in an enterprise environment.
- A minimum of 5 years of experience with SQL and relational database systems such as MySQL, Vertica, Postgres, or Oracle in an enterprise environment.
- A minimum of 5 years of experience with data harvest and ETL systems in an enterprise environment.
- Expert in SQL queries and SQL scripting libraries.
- Experience defining and using ODBC/JDBC connection strings.
- Deep understanding of REST APIs and the JSON and XML data formats.
- Proficient with version control systems such as Git.
- Comfortable working in Linux.
- Familiarity with cloud technologies such as OpenStack, VMware, or AWS is desired.
- Ability to translate data formats from various data sources using scripting languages.
- Background in statistical analysis and/or engineering is a huge plus.
- Software Development Lifecycle or Agile/Scrum experience preferred.
- Must be a team player able to work closely with technical teams and data architects.
Equal Opportunity Employer
Cloud Big Data Technologies is an equal opportunity employer inclusive of female, minority, disability and veterans (M/F/D/V). Hiring, promotion, transfer, compensation, benefits, discipline, termination and all other employment decisions are made without regard to race, color, religion, sex, sexual orientation, gender identity, age, disability, national origin, citizenship/immigration status, veteran status or any other protected status. Cloud Big Data Technologies will not make any posting or employment decision that does not comply with applicable laws relating to labor and employment, equal opportunity, employment eligibility requirements or related matters. Nor will Cloud Big Data Technologies require in a posting or otherwise U.S. citizenship or lawful permanent residency in the U.S. as a condition of employment except as necessary to comply with law, regulation, executive order, or federal, state, or local government contract.