Job Description:
- Design, develop, and maintain scalable data pipelines for batch and streaming data, handling both structured and unstructured formats effectively.
- Apply advanced analytics tools and methodologies to extend and enhance the analytics platform as the organization builds out its advanced analytical capabilities.
- Collaborate with business and technical teams to gather requirements and design comprehensive data solutions that align with organizational goals and DT strategies.
- Implement complex data operations using Python, Java, Spark, and SQL, optimizing ETL/ELT processes for performance and efficiency across diverse data types.
- Manage data across various platforms, leveraging cloud services and big data technologies.
Skills/Experience:
- Expert-level working experience with the ELK or EFK stack (Elasticsearch, Logstash or Fluentd, Kibana)
- Mid- to expert-level experience with AWS Managed Kafka (Amazon MSK) or Confluent Cloud
- Mid-level experience with the AWS cloud platform
- Strong background in at least one SaaS enterprise monitoring platform, such as AppDynamics or SignalFx, is preferred
- Advanced knowledge of enterprise-scale hybrid IT infrastructure, platforms, and applications
Works with other team leads to ensure that technical solutions are consistent with the enterprise architecture and strategy.
Must possess strong communication and interpersonal skills, work well in an integrated team environment, and be self-motivated.
This role requires working outside normal business hours on occasion.