WebstaurantStore

Senior Cloud Data Engineer / Modern Data Stack Focus


Job Summary

The Enterprise BI & Analytics Team serves as the enterprise data and analytics center of excellence at WebstaurantStore. We transform billions of data points across marketing, supply chain, operations, and customer experience into trusted data foundations and actionable insights. Leveraging a modern cloud data platform, scalable ELT architecture, and automated DataOps workflows, the team empowers smarter decisions, rapid growth, and new opportunities across the business. 

As a Senior Cloud Data Engineer, you will architect and own the next generation of our modern data ingestion and integration ecosystem. You will lead the development of a unified ingestion framework into Snowflake—building scalable, automated, and high-throughput data pipelines that power analytics and data products enterprise-wide. You will play a key role in our data warehouse modernization initiative, guide our transition from SQL Server to Snowflake, mentor engineers, and help define the future state of our cloud data architecture. 

Job Location

Remote: Legal residents of one of the following states: AK, AL, AR, AZ, CT, DE, FL, GA, IA, ID, IN, KS, KY, LA, MD, ME, MI, MN, MO, MS, NC, ND, NH, NM, NV, OH, OK, PA, SC, SD, TN, TX, UT, VA, VT, WI, WV, or WY

We only accept W-2 candidates; H-1B sponsorship is not available.

Responsibilities

Senior Cloud Data Engineers are hands-on technical leaders and trusted partners to BI developers, analytics engineers, and cross-functional teams. You will drive innovation across our ingestion layer and enable scalable, reliable, and governed data movement throughout the enterprise. 

Core Responsibilities 

Modern Data Engineering & ELT Architecture 

  • Lead the design and implementation of a standardized ELT-based ingestion framework into Snowflake. 
  • Design scalable ingestion pipelines for structured and semi-structured data from APIs, databases, cloud storage, flat files, and collaboration platforms. 
  • Build and maintain Fabric Pipelines to support cloud-native orchestration and pipeline automation. 
  • Develop reusable Python libraries and ingestion accelerators to standardize engineering practices. 
  • Implement robust monitoring, logging, and data reliability engineering practices to ensure SLA adherence. 
  • Refactor and optimize legacy ingestion processes to align with modern data stack best practices. 
  • Drive and support the migration from SQL Server to Snowflake. 

Technical Leadership 

  • Act as the lead architect for enterprise ingestion strategy, mentoring engineers and guiding technical design decisions. 
  • Champion best practices in CI/CD, version control, automated testing, and infrastructure-as-code principles within the data engineering function. 
  • Perform code reviews and enforce architectural standards. 
  • Partner with BI leadership to plan capacity, prioritize initiatives, and execute delivery roadmaps. 
  • Foster a culture of engineering ownership, accountability, and innovation. 

Strategy & Execution 

  • Collaborate with business and analytics stakeholders to translate requirements into scalable ingestion designs. 
  • Evaluate emerging tools and technologies within the modern data stack ecosystem. 
  • Contribute to long-term cloud data strategy and architectural evolution. 
  • Ensure enterprise-grade governance, security, compliance, and data quality controls. 

Physical Requirements

  • Work is performed while sitting/standing and interfacing with a personal computer.
  • Requires the ability to communicate effectively using speech, vision, and hearing.
  • Requires the regular use of hands for simple grasping and fine manipulations.
  • Requires occasional bending, squatting, crawling, climbing, and reaching.
  • Requires the ability to occasionally lift, carry, push, or pull medium weights, up to 50 lbs.

Qualifications

Experience

  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or related discipline; or equivalent experience. 
  • 7–10 years of experience in data engineering or cloud data development roles. 
  • 5+ years of hands-on Snowflake experience, including ingestion architecture, performance optimization, and workload management. 
  • 4–6 years building pipelines using Microsoft Fabric Pipelines or comparable orchestration frameworks. 
  • 4–6 years of Python development for data automation and pipeline engineering. 
  • 5+ years of advanced SQL experience, including deep expertise in SQL Server and CDC methodologies. 
  • 5+ years operating in Agile/Scrum delivery environments. 
  • 3+ years in a technical leadership capacity within data engineering initiatives. 
  • Experience with event-based ingestion, streaming pipelines, or data contract–driven architectures. 
  • Experience with Qlik Replicate and dbt Cloud. 
  • Microsoft Azure experience. 
  • Relevant cloud or data platform certifications preferred. 

Education

This role does not require a degree. Above all else, we value relevant skills, experience, and alignment with our core values.

Desired Traits & Skills

  • Expertise in cloud data warehouse architecture and scalable ELT frameworks. 
  • Strong command of Snowflake internals, optimization strategies, and ingestion design patterns. 
  • Advanced Python and SQL proficiency. 
  • Experience implementing data observability and pipeline reliability solutions. 
  • Deep understanding of dimensional modeling and enterprise data warehousing concepts. 
  • Proven ability to lead complex modernization initiatives. 
  • Strong stakeholder engagement and technical communication skills. 
  • Ability to thrive in a fast-paced, growth-oriented environment. 
  • Passion for building reliable, scalable, and future-ready data platforms. 
Job Posted: 04/27/2026