Data Engineer/Sr Data Engineer

Company:  Innovative Solutions
Location: Boston
Closing Date: 08/11/2024
Salary: £125 - £150 Per Annum
Hours: Full Time
Type: Permanent
Job Requirements / Description

As a Data Engineer on our Professional Services team, you will be responsible for working with customers, understanding their needs and requirements, and discussing with them the “art of the possible”. You will also design and implement solutions for data warehouses, data lakes, ETL jobs, and data pipelines using AWS services, and design and implement AI/ML solutions using AWS or IBM services such as Bedrock and WatsonX.


You will be responsible for consulting with customers to:

  1. Understand their data management strategy
  2. Provide insights into optimizing their data strategy
  3. Architect and design data management processes
  4. Excite customers about how AI/ML services can enhance their business
  5. Implement and deploy machine learning models and enable advanced analytics
  6. Document data architectures and data flows
  7. Drive innovation internally and for customers
  8. Contribute to R&D projects which may turn into new service offerings

How you will be successful:

  1. Living and breathing the “cloud-first” approach
  2. Thinking analytically to solve complex business problems
  3. Obsessively delivering amazing customer experiences
  4. Continually tracking new developments in one or more cloud platforms
  5. Building trusting relationships with all team members
  6. Being comfortable pushing boundaries and technical limits
  7. Keeping up to date on industry trends
  8. Always learning

We are hiring for two engineers. To qualify for the Mid-level range:

  1. Ability to modify and improve existing data sets and structures
  2. 5+ years of professional services experience, with customer-facing responsibilities
  3. 2+ years of professional AWS and/or WatsonX experience
  4. At least one AWS or Google certification
  5. Proficient in one or more of these languages: Python, R, Java, Scala
  6. Experience with SQL and NoSQL databases such as PostgreSQL, MySQL, MongoDB, and Cassandra to store and query large datasets
  7. Data modeling - ability to design conceptual, logical, and physical data models
  8. Experience with ETL (Extract, Transform, Load) tools such as Informatica, Talend, and Pentaho to integrate and move data between systems
  9. Experience with big data frameworks such as Hadoop, Spark, or Kafka for distributed data processing and building data lakes
  10. Experience with machine learning frameworks such as TensorFlow, PyTorch, Keras, and scikit-learn for building ML models
  11. Business Intelligence experience with Power BI and/or QuickSight is a great addition
  12. Understanding of dimensional modeling, star schemas, data warehouses
  13. Knowledge of data architecture patterns such as lambda and kappa architectures
  14. Ability to design scalable and flexible data pipelines
  15. Experience working within standard agile methodologies

To qualify for the Sr-level range:

  1. 5+ years of consulting/professional services experience, with customer-facing responsibilities
  2. 4+ years of professional AWS and/or WatsonX experience
  3. At least one AWS Professional-level certification
  4. Proficient in one or more of these languages: Python, R, Java, Scala
  5. Experience in designing and building data pipelines
  6. Experience creating machine learning models or using LLMs
  7. Experience with SQL and NoSQL databases such as PostgreSQL, MySQL, MongoDB, and Cassandra to store and query large datasets
  8. Data modeling - ability to design conceptual, logical, and physical data models
  9. Dimensional modeling, star schemas, data warehouses
  10. Experience with ETL (Extract, Transform, Load) tools such as Glue, Informatica, Talend, and Pentaho to integrate and move data between systems
  11. Experience with big data frameworks such as Hadoop, Spark, or Kafka for distributed data processing and building data lakes
  12. Experience with machine learning frameworks such as TensorFlow, PyTorch, Keras, and scikit-learn for building ML models
  13. Business Intelligence experience with Power BI and/or QuickSight is a great addition
  14. Knowledge of data architecture patterns such as lambda and kappa architectures
  15. Ability to design scalable and flexible data pipelines
  16. Experience working within standard agile methodologies

The salary range provided is a general guideline. When extending an offer, Innovative considers factors including, but not limited to, the responsibilities of the specific role, market conditions, geographic location, as well as the candidate’s professional experience, key skills, and education/training.
