Role Overview

Hashmap advises, architects, selects, and implements solutions for clients so that they can run their businesses efficiently, with a focus on data analytics in the cloud. Our service areas span IoT, solution architecture, data engineering, analytics, AI, ML, and DevOps.


We are looking for a savvy Cloud Data Engineer to join our growing Hashmap team of data professionals. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as improving data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Cloud Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure that an optimal data delivery architecture is applied consistently across ongoing projects. They must be quick learners, self-directed, and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.

Major Duties & Responsibilities:

  • Develop, construct, test and maintain ETL/ELT architectures

  • Work with Architects to ensure development work aligns with business requirements

  • Acquire and ingest data, both data at rest and streaming data

  • Develop data set processing workflows

  • Select and use appropriate programming languages and tools for each task

  • Identify ways to improve data reliability, efficiency, and quality

  • Conduct research for industry and business questions

  • Use large data sets to address business issues

  • Where appropriate, deploy sophisticated analytics programs, machine learning, and statistical methods

  • Prepare data for predictive and prescriptive modeling

  • Find hidden patterns using data

  • Use data to discover tasks that can be automated

  • Deliver updates to stakeholders based on analytics
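As a purely illustrative sketch of the ETL/ELT work described above (not a Hashmap project or API; the table name, column names, and data are hypothetical), a minimal extract-transform-load step in Python might look like this:

```python
import csv
import io
import sqlite3

# Hypothetical source data: a small CSV of orders (illustrative only).
raw_csv = "order_id,amount\n1,19.99\n2,5.00\n3,42.50\n"

def extract(text):
    """Extract: parse CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast fields to proper types and filter small orders."""
    out = []
    for row in rows:
        amount = float(row["amount"])
        if amount >= 10.0:
            out.append((int(row["order_id"]), amount))
    return out

def load(records, conn):
    """Load: write the transformed records into a warehouse-style table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(raw_csv)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # two rows survive the amount filter
```

In practice the same extract/transform/load shape is what tools like Azure Data Factory, Matillion, or dbt orchestrate against a cloud warehouse; the example above just shows the pattern end to end with the standard library.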

Other Hashmap Responsibilities: 

  • Communicate status clearly and promptly raise issues that might impact projects to project or account management

  • While not engaged on client work, take on internal projects and learning exercises to:

    • Extend your technical knowledge and skills by working with our standard stack and new tools

    • Work on internal technology aimed at helping build new technical practices

  • Participate in assigned training and obtain certifications as identified by the ITT department and/or the Hashmap Professional Growth Manager

Skill Sets for Role:

Soft Skills Required: 

  • High level of personal initiative and energy

  • Excellent verbal and written communication skills

  • Capable of working autonomously with minimal oversight

  • Willing to raise issues to Hashmap or client management as required, clearly defining the issue and potential solutions

  • Common-sense approach to problems

  • Capable of adapting written processes when required and capturing improvements to the process for later review


Technical Skill Sets (Mandatory):

  • Experience with implementing cloud data warehouses – Snowflake, AWS Redshift, Azure SQL Data Warehouse, or Google BigQuery

  • Experience building data pipelines using a variety of technologies: Azure Data Factory, Matillion, dbt, Python, and Spark

    • Candidates must have direct work experience developing and deploying at least one project in which they implemented a full cloud data pipeline, documented on their resume

  • Strong understanding of SQL (2-4 years)

  • Experience developing with Python (2-3 Years)

  • Good understanding of software engineering principles, including object-oriented programming and SOLID

  • Experience gathering and tracking requirements

  • Experience with DevOps practices (i.e., Git, CI/CD, and IaC)

  • Experience with Docker containers, including containerizing services and deploying containers in a cloud environment
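To make the object-oriented and SOLID expectations above concrete, here is a small, purely illustrative Python sketch; every class and name in it is hypothetical, not a Hashmap interface:

```python
from abc import ABC, abstractmethod

# Illustrative only: each stage has a single responsibility, new behavior
# is added by writing a new class (open/closed), and the pipeline depends
# only on the Step abstraction (dependency inversion).

class Step(ABC):
    """A pipeline stage with a single responsibility."""

    @abstractmethod
    def run(self, rows):
        """Take a list of dict rows and return a transformed list."""

class DropIncomplete(Step):
    """Filter out rows containing any missing (None) values."""

    def run(self, rows):
        return [r for r in rows if all(v is not None for v in r.values())]

class UppercaseField(Step):
    """Uppercase one configured field."""

    def __init__(self, field):
        self.field = field

    def run(self, rows):
        return [{**r, self.field: r[self.field].upper()} for r in rows]

class Pipeline:
    """Composes steps; knows nothing about what each step does."""

    def __init__(self, steps):
        self.steps = list(steps)

    def run(self, rows):
        for step in self.steps:
            rows = step.run(rows)
        return rows

rows = [{"id": 1, "name": "ada"}, {"id": 2, "name": None}]
result = Pipeline([DropIncomplete(), UppercaseField("name")]).run(rows)
print(result)  # [{'id': 1, 'name': 'ADA'}]
```

The same composition style carries over to testability: each `Step` can be unit-tested in isolation, which is part of why these principles matter in pipeline code.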

Knowledge of the following would be an asset: 

  • Open-source contributions of any sort 

  • Experience with BI platforms such as Spotfire, Tableau, Power BI, Zoomdata, Superset, etc.

  • Data cataloging and governance techniques

  • Strong Agile methodology knowledge

Travel Expectations:

  • Please note that due to COVID-19, work is remote for most of 2021. When work returns to normal (post-COVID), candidates must be available to travel up to two weeks per month on a regular basis, with occasional periods of two to three weeks per month based on specific client requirements

What We Offer

For the Cloud Data Engineer role, we’re looking to hire Hashmappers invested in realizing the goal of accelerating high-value business outcomes for our customers, partners, and the community. We offer the following:


  • Competitive salary

  • Incentive or bonus plan

  • Remote work from home from anywhere in the USA

  • 401(k) Plan - enrollment at the beginning of every quarter, upon completing at least 3 months of full-time service

  • Health Insurance - Medical, Dental, Vision (effective the first of the month following date of hire)

  • Paid Time Off and Holidays

  • Flexible Commute Plan

  • Birth Recovery and Paternal Leave

  • Military Leave

  • Bereavement Leave

  • Exposure to a fast-paced, yet fun culture

  • A chance to work with world-class experts on challenging projects

  • Opportunity to provide meaningful contributions to real solutions that solve problems and are used by our customers

  • High level of access to Hashmap leadership