Company:
Zenith Services Inc.
Location: Plano
Closing Date: 05/11/2024
Hours: Full Time
Type: Permanent
Job Requirements / Description
Duties and responsibilities
- Collaborate with the team to build out features for the data platform and consolidate data assets
- Build, maintain, and optimize data pipelines built with Spark
- Advise, consult, and coach other data professionals on standards and practices
- Work with the team to define company data assets
- Partner with business analysts and solutions architects to develop technical architectures for strategic enterprise projects and initiatives
- Build libraries to standardize how we process data
- Love to teach and learn, knowing that continuous learning is the cornerstone of every successful engineer
- Have a solid understanding of AWS tools such as EMR and Glue, including their pros and cons, and be able to intelligently convey that knowledge
- Implement automation on applicable processes
Mandatory Skills:
- 5+ years of experience in a data engineering position
- Proficiency in Python (or similar) and SQL
- Strong experience building data pipelines with Spark
- Strong verbal & written communication
- Strong analytical and problem-solving skills
- Experience with relational datastores, NoSQL datastores and cloud object stores
- Experience building data processing infrastructure in AWS
- Bonus: Experience with infrastructure as code solutions, preferably Terraform
- Bonus: Cloud certification
- Bonus: Production experience with ACID-compliant formats such as Hudi, Iceberg, or Delta Lake
- Bonus: Familiarity with data observability solutions and data governance frameworks
Requirements:
- Bachelor’s Degree in Computer Science, Programming, or a similar field is preferred
- Must have the legal right to work in the USA
Job responsibilities:
- Your experience in public cloud migrations of complex systems, anticipating problems, and finding ways to mitigate risk, will be key in leading numerous public cloud initiatives
- Executes creative software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Own end-to-end platform issues and help provide solutions to platform build and performance issues on the AWS Cloud, ensuring deliverables are bug-free
- Drive, support, and deliver on a strategy to build broad use of Amazon's utility computing web services (e.g., EC2, S3, RDS, CloudFront, EFS, DynamoDB, CloudWatch, EKS, ECS, MFTS, ALB, NLB)
- Measure and optimize system performance, with an eye toward pushing our capabilities forward, getting ahead of customer needs, and innovating to continually improve
- Provide primary operational support and engineering for the public cloud platform; debug and optimize systems and automate routine tasks
- Collaborate with a cross-functional team to develop real-world solutions and positive user experiences at every interaction
- Drive Game Days, resiliency tests, and chaos engineering exercises
- Utilize programming languages such as Java, Python, SQL, Node, Go, and Scala; open-source RDBMS and NoSQL databases; container orchestration services including Docker and Kubernetes; and a variety of AWS tools and services
Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts and 10+ years of applied experience
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Advanced proficiency in one or more programming languages (Java, Python, Go)
- A strong understanding of business technology drivers and their impact on architecture design, performance, monitoring, and best practices
- Experience designing and building web environments on AWS, including services like EC2, ALB, NLB, Aurora PostgreSQL, DynamoDB, EKS, ECS Fargate, MFTS, SQS/SNS, S3, and Route53
- Advanced in modern technologies such as Java 8+, Spring Boot, RESTful microservices, AWS or Cloud Foundry, and Kubernetes
- Experience using DevOps tools in a cloud environment, such as Ansible, Artifactory, Docker, GitHub, Jenkins, Kubernetes, Maven, and SonarQube
- Experience and knowledge of writing Infrastructure-as-Code (IaC) and Environment-as-Code (EaC), using tools like CloudFormation or Terraform
- Experience with high-volume, SLA-critical applications and building upon messaging and/or event-driven architectures
- Deep understanding of the financial industry and its IT systems
Preferred qualifications, capabilities, and skills:
- Expertise in one or more programming languages, preferably Java
- AWS Associate-level certification in Developer, Solutions Architect, or DevOps
- Experience building AWS infrastructure such as EKS, EC2, ECS, S3, DynamoDB, RDS, MFTS, Route53, ALB, and NLB
- Experience with high-volume, mission-critical applications and building upon messaging and/or event-driven architectures using Apache Kafka
- Experience with logging, observability, and monitoring tools, including Splunk, Datadog, Dynatrace, CloudWatch, or Grafana
- Experience with automation and continuous delivery methods using shell scripts, Gradle, Maven, Jenkins, and Spinnaker
- Experience with microservices architecture and high-volume, SLA-critical applications, including their interdependencies with other applications, microservices, and databases
- Experience developing processes, tooling, and methods to help improve operational maturity