Blue Origin LLC - Senior Data Engineer
Location: Seattle, Washington
At Blue Origin, we envision millions of people living and working in space for the benefit of Earth. We're working to develop reusable, safe, and low-cost space vehicles and systems within a culture of safety, collaboration, and inclusion. Join our diverse team of problem solvers as we add new chapters to the history of spaceflight!
We are looking for someone to apply their technical expertise, leadership skills, and commitment to quality to positively impact safe human spaceflight. Passion for our mission and vision is required!
This role is part of Enterprise Technology (ET), where we're developing the digital infrastructure needed to build the road to space, with an emphasis on the digital capabilities required to advance Blue Origin's mission. ET is the center of excellence for digital technology at Blue Origin, providing oversight and governance to align technology and business strategies.
We are seeking a highly skilled Senior Data Engineer with experience in data platforms such as Databricks, Snowflake, Palantir Foundry, or similar to lead our efforts in ingesting and curating data and creating an ontology/canonical model that drives operational outcomes in aerospace manufacturing. The successful candidate will play a critical role in establishing and maintaining data pipeline standards, data ingestion templates, and low-code ETL processes to ensure seamless data integration, federation, and self-service data consumption within the organization.
Key Responsibilities:
- Data Ingestion and Ontology/Canonical Model Creation: Design and develop data ingestion templates and ETL processes for operational & analytical data platforms. Create and maintain an ontology structure tailored to aerospace manufacturing data requirements.
- Metadata-Driven Data Pipelining: Build and promote reusable data ingestion and ETL templates to facilitate data publishing and federation across teams in the enterprise.
- Data Application Development and Support: Collaborate with domain consumers/leads to build domain-specific data applications for viewing knowledge graphs and for creating and orchestrating events on the ontology or canonical models.
- Data Reconciliation: Implement solutions for data reconciliation with various data sources to ensure data accuracy and consistency.
- Monitoring and Performance: Develop and maintain monitoring & observability solutions to track data pipeline statistics and ensure data quality. Monitor data platform usage and optimize as needed.
- Data Management Controls: Design and implement data management controls including data quality, role-based access, data classification & data archiving as part of data ingestion and building the ontology.
- DataOps: Implement DataOps best practices to streamline data pipeline deployment and management and enable proactive issue detection and resolution.
- Mentorship and Knowledge Sharing: Provide mentorship to junior data engineers and assist in their skill development.