Dice is the leading career destination for tech experts at every stage of their careers. Our client, Jobot, is seeking the following. Apply via Dice today!
100% REMOTE Senior Data Engineer / Lead Data Platform Engineer Needed for Growing Subsidiary of a Large Public Company!
This Jobot Job is hosted by: Reed Kellick
Are you a fit? Easy Apply now by clicking the "Apply Now" button and sending us your resume.
Salary: $145,000 - $235,000 per year
A bit about us:
We are a growing subsidiary of a large public company, and we are hiring multiple data engineers! If you are a Lead Data Engineer / Senior Data Infrastructure Engineer, please read on!
Why join us?
As a Lead Data Infrastructure Engineer / Principal Data Engineer at our company, you can expect:
- A competitive base salary between $95k and $235k, depending on seniority level!
- Stock grant of $12k to $40k, depending on experience!
- Bonus of 12-20%, depending on seniority!
- 100% remote / work from home!
- 401(k) with a dollar-for-dollar match up to 6% of eligible earnings (base and bonus), plus an additional company contribution!
- Comprehensive medical, dental, vision and life insurance!
- 17 paid holidays per year, including 3 floating holidays!
- Annual Paid Time Off (PTO), with separate sick days!
- 12 weeks of paid Parental Leave!
- Caregiver Leave!
- Adoption and Surrogacy Assistance Plan!
- Flexible workplace accommodation!
- Fun team and company events at sports games, concerts, and more!
- Tuition reimbursement!
- Ability to attend conferences!
- A MacBook Pro and accompanying hardware to do great work!
- A modern productivity toolset to get work done: Slack, Miro, Loom, Lucid, Google Docs, Atlassian and more!
- Generous company discounts!
- Eligibility for donation matching to over 1.5 million nonprofit organizations!
For a Principal Data Platform Engineer / Staff Data Engineer on our team, we are looking for:
- 2+ years of data engineering experience for a Data Engineer role, 5+ years for a Senior role, and 7+ years for a Lead role
- BS, MS, or PhD in Computer Science, Mathematics, Statistics, Engineering, Operations Research, or other quantitative field
- Experience developing and maintaining data pipelines, infrastructure and architecture
- Experience writing code to extract, process, and store data across different types of data stores (Snowflake, Postgres, DynamoDB, Kafka, graph databases)
- Strong Python skills
- Experience building batch and streaming pipelines using complex SQL, PySpark, Pandas, and similar frameworks
- Experience with some or most of the following would be ideal: Snowflake, Fivetran, dbt Cloud, Datadog, Atlan, Monte Carlo, Airflow (MWAA), EKS (Kubernetes)