Staff Software Engineer - Snowflake

Date: Feb 9, 2024

Location: Pune, MH, IN

Company: Houghton Mifflin Harcourt

HMH Software Engineering

HMH Software Engineering provides cutting-edge, individualized learning experiences to millions of students across the United States. We are as driven by this mission as we are by continuously improving ourselves and the way we work. Our offices are high-energy, collaborative beehives of activity where work is centered on small, autonomous teams that build great software. We trust each other, hold ourselves and our teammates accountable for results, and improve student outcomes with each release.

At HMH we constantly experiment with new approaches and novel ways of solving problems.  We often succeed and sometimes stumble — either way we learn and move forward with more confidence than we had the day before.  We are as passionate about new technologies and engineering craftsmanship as we are about transforming the EdTech industry itself.

If this sounds like you, let’s talk.

The Opportunity – Senior Snowflake and DBT Software Developer for the HMH Reporting Platform

Senior Software Engineers personify the notion of constant improvement as they work with their team to build software that delivers on our mission to improve student outcomes. You’re not afraid to try new things, even if they don’t work out as expected. You are independent, self-directed, high-energy, and as eager to contribute to your team as you are to progress on your own path to software craftsmanship. You’ll thrive working in a fast-paced, low-friction environment where you are exposed to a wide range of cutting-edge technologies.

Reporting Platform:

You will work on the Reporting Platform, part of the HMH Educational Online/Digital Learning Platform, using cutting-edge technologies. The Reporting team builds a highly scalable and highly available platform on a microservices architecture: Java backend microservices, a React JavaScript UI frontend, REST APIs, an AWS RDS Postgres database, AWS cloud technologies, AWS Kafka and AWS Kinesis, Spark with Scala, Kubernetes or Mesos orchestration, the Apache Airflow scheduler, DataDog for logging/monitoring/alerting, Concourse CI or Jenkins, Maven, and more.
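
To make the event-streaming side of that stack concrete, here is a minimal sketch of a consumer reading platform events, assuming the kafka-python client; the topic name, broker address, and consumer group below are hypothetical, not details from this posting:

    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "reporting.events",                    # hypothetical topic name
        bootstrap_servers=["broker:9092"],     # placeholder broker address
        group_id="reporting-platform",         # hypothetical consumer group
        auto_offset_reset="earliest",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # Hand each event to the reporting pipeline (details omitted).
        print(event)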

Responsibilities:

  • Lead the product team(s) in delivering solutions that give the products large-scale data processing, high fault tolerance, high availability, high scalability, low latency, and optimal performance, meeting the expectations of the business.
  • Collaborate with the product owners, the product managers, and the lead architects to decompose and reconstitute the architecture, components, modules, and interfaces of the assigned product(s).
  • Follow JIRA-based Scrum methods for agile iterations and release cadence.
  • Implement new data models and develop DBT modules that capture complex business logic to support REST APIs and batch rollups of report data for customer organizations.
  • Design, write, test, implement, and maintain Snowflake modules, covering data synchronization as well as development.
  • Resolve performance issues and carry out performance tuning.
  • Use Apache Airflow or another scheduler to set up DBT jobs to run automatically (see the sketch after this list).
  • Support streaming event processing.
  • Provide systems-architecture support for the Reporting Platform.
  • Set up monitoring dashboards and alerts in DataDog to catch issues proactively.
  • Diagnose and troubleshoot backend data-related errors.
  • Apply DevOps knowledge to automate deployments using Jenkins or Concourse.
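
As a minimal sketch of the scheduled-DBT responsibility above, assuming Airflow 2.x and the dbt CLI on the worker's PATH (the DAG name, schedule, and model selector below are hypothetical):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="reporting_dbt_rollups",      # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="0 2 * * *",       # e.g. a nightly rollup window
        catchup=False,
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --select reporting",   # hypothetical selector
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --select reporting",
        )
        dbt_run >> dbt_test  # build the models, then test them

Running dbt through a BashOperator keeps the scheduler thin: Airflow owns the cadence and retries, while dbt owns the SQL models and their tests.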

Skills & Experience

Successful candidates must demonstrate an appropriate combination of:

  • 9+ years of experience as a Snowflake developer, using DBT to create and support commercial data warehouses and data marts.
  • Experience in Java backend/microservices programming is an added advantage.
  • Strong hands-on experience managing large warehouse solutions on Snowflake (see the connector sketch after this list).
  • Cloud technologies such as AWS and Azure.
  • Data-center operating technologies such as Apache Mesos, Apache Aurora, and Terraform, and container services such as Docker and Kubernetes, are an added plus.
  • Advanced knowledge of database security and performance-monitoring standards.
  • Understanding of relational and dimensional data modeling.
  • Shell scripting skills.
  • Knowledge of DataDog for setting up monitoring and alerting dashboards.
  • Working knowledge of Jenkins or Concourse for CI/CD.
  • Ability to work independently and in a group to provide sound design and technology leadership.
  • Self-starter attitude with initiative and creativity.
  • Attention to detail and the ability to handle interruptions and changing timelines and priorities.
  • Ability to communicate and work effectively at all levels of the company.
  • Knowledge of AWS Database Migration Service and Lambda is a plus.
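
For a concrete flavor of the Snowflake work, here is a minimal sketch using the snowflake-connector-python package; the account, credentials, and table/column names are placeholders invented for illustration, not details from this posting:

    import snowflake.connector

    # Placeholder connection details; in practice these would come from a
    # secrets manager, never from source code.
    conn = snowflake.connector.connect(
        account="your_account",
        user="your_user",
        password="your_password",
        warehouse="REPORTING_WH",   # hypothetical warehouse
        database="REPORTING",       # hypothetical database
        schema="MARTS",             # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # A simple per-organization rollup, the kind of aggregate the
        # reporting REST APIs might serve (table and columns invented).
        cur.execute(
            "SELECT org_id, COUNT(*) AS events "
            "FROM report_events GROUP BY org_id"
        )
        for org_id, events in cur:
            print(org_id, events)
    finally:
        conn.close()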

Required Education:

  • A BS/MS in Computer Science, Computer Engineering, or another STEM field.
