At Flowhub, we're on a mission to make legal cannabis accessible to everyone. Founded in 2015, Flowhub pioneered the first Metrc API integration to help dispensaries stay compliant. Today, over 1,000 dispensaries trust Flowhub's point of sale, inventory management, business intelligence, and mobile solutions to process $3B+ in cannabis sales annually.
Flowhub creates user-friendly business management and compliance products that increase revenue in the highly regulated cannabis industry. Our Engineering department is highly creative, incredibly resourceful, and obsesses over the user experience. We’re currently looking to add to our engineering team by hiring a Senior Data/DevOps Engineer.
Data and DevOps Engineers at Flowhub use their expertise to make sure our data systems scale well and perform efficiently, and they take an interest in making our other engineering tools performant, secure, and easy to use. They look for opportunities to optimize everywhere: cost, security, performance, and efficiency. Flowhub engineers are pragmatic thinkers who choose the right technology for the problems we're facing. Our Data and DevOps engineers take pride in delivering a highly available, performant product to the businesses we serve.
Who you are:
- An innately curious person who loves asking questions to better understand how people, systems, and businesses work
- Experienced in system design and architecture decisions, and excited to work with the team around you and share your expertise
- Pragmatic and flexible, wanting to solve problems without over-engineering
- Data-driven in all your decision-making, and eager to test your hypotheses before over-investing
- Passionate about your customers and coworkers, working tirelessly to improve their lives
- Comfortable working in a remote environment
- Comfortable working with a team focused on product and value delivery
- Deeply experienced with Postgres database setup and infrastructure
- Experienced in managing resources in GCP
- Some experience with Clickhouse, Elasticsearch, Kubernetes (GKE is even better), and Gitlab
What you’ll do:
- Optimize our data offerings by reconfiguring database system settings, managing load, rewriting queries and indexes, and introducing archival and sharding strategies
- Understand the underlying ETL and setup of our Clickhouse and Elasticsearch data systems, and be ready to jump in and fix problems that arise unexpectedly
- Improve developer efficiency and reduce toil by building, supporting, and improving the tooling they use every day
- Help to identify and fill any critical gaps in our data application architecture or our security posture
- Become familiar with our CI/CD stack (Gitlab), pass on that understanding to other teams, and identify and implement improvements to our release process
- Work with the engineering team to reduce failures reaching production, improving velocity and stability and strengthening our database deployment strategy
- Use IaC tools (primarily Terraform) to maintain system state and make changes
- Implement security best practices across our infrastructure
Nice to have:
- Experience with data tools like Clickhouse and Elasticsearch
- Experience with authorization tools like Auth0
- Experience working in NodeJS and/or Golang
- Working knowledge of the rest of the Elastic Stack (Filebeat, Metricbeat, APM)
This role is open to anyone within the United States, except candidates in CA, with compensation that aligns with your location. Starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location. This role is eligible for a competitive benefits package that includes medical, dental, vision, life, and disability insurance, a 401(k) retirement plan, paid holidays, unlimited paid time off, and other benefits.
Base Salary: $140K – $165K + Equity