Company:
Tata Consultancy Services
Location: Atlanta
Closing Date: 28/10/2024
Hours: Full Time
Type: Permanent
Job Requirements / Description
Job Description: (Must have 8+ years of experience)
- At least 8 years of overall experience in building ETL/ELT, data warehousing and big data solutions.
- At least 5 years of experience in building data models and data pipelines to process different types of large datasets.
- At least 3 years of experience with Python, Spark, Hive, Hadoop, Kinesis, Kafka.
- Proven expertise in relational and dimensional data modeling.
- Understanding of PII standards, processes, and security protocols.
- Experience in building data warehouses using cloud technologies such as AWS or GCP services, and a cloud data warehouse, preferably Google BigQuery.
- In-depth knowledge of Big Data solutions and the Hadoop ecosystem.
- Strong SQL knowledge – able to translate complex scenarios into queries.
- Strong programming experience in Java or Python.
- Experience with Google Cloud Platform (especially BigQuery and Dataflow).
- Experience with Google Cloud SDK and API scripting.
- Experience with Hadoop (Hive, MapReduce, Spark).
- Experience in onsite-offshore coordination.
- Experience in Test-Driven Development.
- Experience with Agile processes and DevOps methodologies.
- Experience with NoSQL databases.
- Experience with data modeling and mapping.
- Experience in the Retail domain is an added advantage.
- Google Cloud Professional Data Engineer Certification is an advantage.
- Programming languages: Java / Python.
- Google Cloud: BigQuery, Pub/Sub, Dataflow, Composer DAGs, Cloud Storage.
- CI/CD: GitHub, Jenkins.
Required Skills:
- Good organizational and problem-solving skills
- Good team player who is self-motivated and well organized
- Strong oral and written communication skills
- Ability to work with remote teams
- Ability to manage project scope