Apex Systems is looking for candidates for a Sr Data Analyst/Data Engineer position for one of our biggest financial clients in Charlotte, NC. If you are interested, please send your resume to
Location: Addison, TX | Pennington, NJ | Charlotte, NC | Plano, TX - must be open to hybrid onsite work at least 3x/wk
Type: 4+ month contract with potential to extend
We are only able to work on a W2 basis; no Corp-to-Corp. Candidates must be able to work on our W2 without sponsorship.
Required Skills:
- 5+ years in a related Data occupation with Banking & Markets business acumen
- Strong big data engineering and ETL development background
- Working closely with business partners to help translate functional requirements into technical approach, design, and decisions
- Strong SQL - creating schema objects, complex attributes/metrics, and conditional and level metrics, and applying them within reports
- Developing Excel & Tableau dashboards and 3rd party application report integration
- ETL solution design
- Using Python, Hadoop/Hive, data warehouse concepts/architecture, and dimensional modeling
- Tuning and optimizing query performance for large datasets (cubes, caching, aggregate structures) within MicroStrategy, Tableau, and various RDBMS and Hadoop backend systems
Position Description:
This role is responsible for leading the transformation of the enterprise ETL platforms, which requires engagement across multiple lines of business. Key responsibilities include setting up automations, coordinating delivery, providing visibility into program health, and managing program risks. This role facilitates sync points between business and technology leaders across lines of business, as well as with Risk and Compliance partners. Individuals in the role also ensure delivery meets the client’s expectations in terms of target outcomes, timeline, and cost.
- Develop and deliver data ETL solutions to accomplish technology and business goals.
- Code solutions to ingest, curate, aggregate, integrate, clean, transform, and control data in operational and/or analytics data systems per the defined acceptance criteria.
- Assemble large, complex data sets to meet functional reporting requirements.
- Build processes supporting data transformation, data structures, metadata, data quality controls, dependency, and workload management.
- Define and build reporting applications that enable better data-informed decision-making.
- Contribute to existing test suites (integration, regression, performance), analyze test reports, identify any test issues/errors, and triage the underlying cause.
- Design, develop, and document ETL processes based on established standards and guidelines, using a CI/CD toolset such as JIRA, Confluence, Bitbucket, and Jenkins.
- Automate ETL processes leveraging enterprise scheduling tools such as Autosys.
- Provide operational and escalation support when required.
- Document and communicate required information for deployment, maintenance, support, and business functionality.
- Work closely with business partners to help translate functional requirements into technical approach, design, and decisions.
- Create SQL schema objects and complex attributes/metrics, including conditional and level metrics, and apply them within reports.
- Apply SQL, Python, Hadoop/Hive, data warehouse concepts/architecture, dimensional modeling, and ETL solution design.
- Tune and optimize query performance for large datasets (cubes, caching, aggregate structures) within MicroStrategy, Tableau, and various RDBMS and Hadoop backend systems.