Company: ApTask
Location: Plano
Closing Date: 04/11/2024
Hours: Full Time
Type: Permanent
Job Requirements / Description
About Client:
The client provides information technology (IT) services, including business outsourcing, infrastructure technology, and application services. The application service offered by the company includes application development, maintenance, and support. The markets served by the company are financial services and insurance, healthcare, manufacturing, government, transportation, communications, and consumer and retail industries.
Rate Range: $65-$72/hr C2C without benefits
Salary: $130k per annum + benefits
Job Description:
Collaborate with the team to build out features for the data platform and consolidate data assets
Build, maintain, and optimize data pipelines built with Spark
Advise, consult, and coach other data professionals on standards and practices
Work with the team to define company data assets
Migrate CMS’ data platform into Chase’s environment
Partner with business analysts and solutions architects to develop technical architectures for strategic enterprise projects and initiatives
Build libraries to standardize how we process data
Love to teach and learn, and know that continuous learning is the cornerstone of every successful engineer
Have a solid understanding of AWS tools such as EMR and Glue, including their pros and cons, and be able to convey that knowledge clearly
Implement automation on applicable processes
Mandatory Skills:
5+ years of experience in a data engineering position
Proficiency in Python (or similar) and SQL
Strong experience building data pipelines with Spark
Experience with relational datastores, NoSQL datastores, and cloud object stores
Experience building data processing infrastructure in AWS
Bonus: Experience with infrastructure-as-code solutions, preferably Terraform
Bonus: Cloud certification
Bonus: Production experience with ACID-compliant formats such as Hudi, Iceberg, or Delta Lake
Bonus: Familiarity with data observability solutions and data governance frameworks
Responsibilities:
Your experience in public cloud migrations of complex systems, anticipating problems, and finding ways to mitigate risk, will be key in leading numerous public cloud initiatives
Execute creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
Own end-to-end platform issues, help resolve platform build and performance issues on the AWS Cloud, and ensure deliverables are bug-free
Drive, support, and deliver on a strategy to build broad use of Amazon's utility computing web services (e.g., EC2, S3, RDS, CloudFront, EFS, DynamoDB, CloudWatch, EKS, ECS, MFTS, ALB, NLB)
Measure and optimize system performance, with an eye toward pushing our capabilities forward, getting ahead of customer needs, and innovating to continually improve
Utilize programming languages such as Java, Python, SQL, Node.js, Go, and Scala; open-source RDBMS and NoSQL databases; container orchestration services including Docker and Kubernetes; and a variety of AWS tools and services
Required qualifications, capabilities, and skills:
Formal training or certification in software engineering concepts and 10+ years of applied experience
Hands-on practical experience delivering system design, application development, testing, and operational stability
Advanced proficiency in one or more programming languages, such as Java, Python, or Go
A strong understanding of business technology drivers and their impact on architecture design, performance, monitoring, and best practices
Experience designing and building web environments on AWS, including services like EC2, ALB, NLB, Aurora PostgreSQL, DynamoDB, EKS, ECS Fargate, MFTS, SQS/SNS, S3, and Route 53
Advanced in modern technologies such as Java 8+, Spring Boot, RESTful microservices, AWS or Cloud Foundry, and Kubernetes
Experience using DevOps tools in a cloud environment, such as Ansible, Artifactory, Docker, GitHub, Jenkins, Kubernetes, Maven, and SonarQube
Experience and knowledge of writing Infrastructure-as-Code (IaC) and Environment-as-Code (EaC) using tools like CloudFormation or Terraform
Preferred qualifications, capabilities, and skills:
Expertise in one or more programming languages, preferably Java
AWS Associate-level certification in Developer, Solutions Architect, or DevOps
Experience building AWS infrastructure such as EKS, EC2, ECS, S3, DynamoDB, RDS, MFTS, Route 53, ALB, and NLB
Experience with high-volume, mission-critical applications and with messaging and/or event-driven architectures using Apache Kafka
Experience with logging, observability, and monitoring tools including Splunk, Datadog, Dynatrace, CloudWatch, or Grafana
Experience with automation and continuous delivery methods using shell scripts, Gradle, Maven, Jenkins, and Spinnaker
About ApTask:
ApTask is a leading global provider of workforce solutions and talent acquisition services, dedicated to shaping the future of work. As an African American-owned and Veteran-certified company, ApTask offers a comprehensive suite of services, including staffing and recruitment solutions, managed services, IT consulting, and project management. With a focus on excellence, collaboration, and innovation, ApTask provides unparalleled opportunities for professional growth and development. As a member of the ApTask team, you will have the chance to connect businesses with top-tier professionals, optimize workforce performance, and drive success across diverse industries. Join us at ApTask and be part of our mission to empower organizations to thrive while fostering a diverse and inclusive work environment.
Applicants may be required to attend interviews in person or by video conference. In addition, candidates may be required to present their current state or government issued ID during each interview.
Candidate Data Collection Disclaimer:
At ApTask, we prioritize safeguarding your privacy. As part of our recruitment process, certain Personally Identifiable Information (PII) may be requested by our clients for verification and application purposes. Rest assured, we strictly adhere to confidentiality standards and comply with all relevant data protection laws. Please note that we only collect the necessary information as specified by each client and do not request sensitive details during the initial stages of recruitment.
If you have any concerns or queries about your personal information, please feel free to contact our compliance team at .