- Country
- Canada

Data Engineering Manager
A position at a stable SaaS company with a modern technology stack (Databricks Lakehouse) and clearly defined responsibilities. The high score reflects the leadership status of the role and its impact on the architecture of a world-class product.
Role complexity
The role requires deep technical expertise in the Databricks and Azure ecosystem, as well as experience managing distributed teams. The high complexity stems from the need to design multi-tenant systems and meet strict SLAs for customer data.
Salary analysis
This Data Engineering Manager role in Canada (Mississauga/Toronto) offers competitive compensation. Market estimates for the position are above average due to its narrow specialization in Databricks and Azure.
Cover letter
I am writing to express my strong interest in the Data Engineering Manager position at Upshop. With over 8 years of experience in data engineering and a proven track record of leading technical teams, I am excited about the opportunity to drive the evolution of your Databricks Lakehouse architecture. My expertise in PySpark, Unity Catalog, and multi-tenant Azure environments aligns perfectly with Upshop's mission to provide high-quality, actionable data for retail and supply chain analytics.
In my previous roles, I have successfully implemented scalable ingestion strategies and robust data observability frameworks, similar to those described in your requirements. I am particularly drawn to Upshop's commitment to innovation and excellence, and I am confident that my experience in optimizing FinOps and managing CI/CD pipelines for enterprise-scale platforms will contribute significantly to your team's success. I look forward to the possibility of discussing how my technical leadership can help Upshop continue to deliver mission-critical operations for your global retail partners.
Job description
About Upshop:
Upshop is the foremost provider of a SaaS platform designed to streamline forecasting, ordering, production, and inventory optimization for food retailers. Its unified platform simplifies and enhances associate tasks, promoting smarter and more interconnected operations across Fresh, Center, DSD, and eCommerce departments. With 450+ retailers and 50,000+ stores worldwide relying on its mission-critical operations platform, customers have seen substantial improvements in sales, shrinkage reduction, food safety, and sustainability throughout their stores.
At Upshop, we believe that great businesses are built by great people. Our People function is at the heart of our company’s growth, ensuring we attract, develop, and retain A Players who drive our mission forward.
Our Values:
- Extremely Accountable
- Customer Obsessed
- Always Innovating
- Demand Excellence
- Biased for Action
Data Engineering Manager
Role Overview
We are seeking an experienced Data Engineering Manager to lead the technical evolution of our Databricks Lakehouse. You will be responsible for the architecture, scaling, and reliability of our data products, ensuring that the platform provides high-quality, actionable data for retail and supply chain analytics. This is a leadership role that requires a balance of advanced systems design and hands-on operational rigor.
Key Responsibilities
🔹 Architecture & Platform Ownership
- Lakehouse Strategy: Own the end-to-end design and optimization of the Databricks Lakehouse architecture, leveraging Unity Catalog and Delta Lake for multi-tenant isolation.
- Scalable Ingestion: Drive multi-tenant ingestion strategies from Azure Table Storage, Cosmos DB, and APIs, focusing on watermark logic and idempotent pipelines.
- Framework Development: Lead the evolution of in-house PySpark-based ETL frameworks to support complex transformations and downstream consumption.
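The watermark logic and idempotent pipelines mentioned above can be sketched in plain Python; in production this role would typically be played by a Delta Lake MERGE in PySpark, and the function and field names here are illustrative:

```python
def incremental_load(source_rows, target, watermark):
    """Idempotent incremental load: pull only rows newer than the stored
    watermark, upsert them into the target keyed by primary key, then
    advance the watermark. Re-running with the same inputs and watermark
    leaves the target in the same state (no duplicates)."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    for row in new_rows:
        target[row["id"]] = row  # upsert: insert or overwrite by key
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return target, new_watermark
```

Because the upsert is keyed and the watermark only moves forward, a failed run can simply be retried without producing duplicate records downstream.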
🔹 Operational Excellence & Data Observability
- Data Quality Validation: Implement comprehensive data validation frameworks and automated checks to ensure accuracy, completeness, and consistency across the Lakehouse.
- Observability & Monitoring: Drive the strategy for end-to-end data observability, using monitoring and alerting to proactively identify pipeline drifts or failures before they impact stakeholders.
- FinOps & Performance: Improve cost transparency through DBU tracking and compute profiling (Serverless vs. Classic); drive performance tuning via clustering and partitioning strategies.
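A minimal sketch of the kind of automated data-quality gate described above, assuming illustrative field names; a production version would run as a pipeline step before a batch is published:

```python
def validate_batch(rows, required_fields, key_field="id"):
    """Minimal data-quality gate: checks completeness (no missing required
    fields) and consistency (no duplicate keys) across a batch.
    Returns a list of human-readable error strings; empty means the
    batch passes."""
    errors = []
    seen = set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                errors.append(f"row {i}: missing {field}")
        key = row.get(key_field)
        if key in seen:
            errors.append(f"row {i}: duplicate key {key}")
        seen.add(key)
    return errors
```

Wiring the returned errors into alerting is one simple way to surface pipeline drift before it reaches stakeholders.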
🔹 DevOps & Technical Governance
- Deployment Integrity: Oversee CI/CD pipelines and Databricks Asset Bundles (DABs) to maintain parity between UAT and Production environments.
- Governance Standards: Implement version control, rollback strategies, and deployment governance to ensure a stable and auditable production environment.
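UAT/Production parity of the kind described above can be checked with a simple drift report. A hypothetical sketch over plain dict configs (in practice these would come from Databricks Asset Bundle targets, and the keys here are illustrative):

```python
def config_drift(uat, prod, ignore=("environment", "tags")):
    """Report keys whose values differ between the UAT and Production job
    configs, skipping fields that are expected to differ per environment.
    Returns {key: (uat_value, prod_value)} for each drifted key."""
    keys = (set(uat) | set(prod)) - set(ignore)
    return {k: (uat.get(k), prod.get(k))
            for k in sorted(keys) if uat.get(k) != prod.get(k)}
```

Running such a check in CI makes unintended divergence between environments an auditable, blocking event rather than a silent drift.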
🔹 Cross-Functional Leadership
- Strategic Partner: Partner with BI, Data Science, and Product teams to bridge business requirements into scalable, production-grade data solutions.
- Team Growth: Mentor and lead a distributed team of engineers, focusing on high-velocity delivery and technical excellence.
- Tenant Logic: Support customer onboarding by designing tenant-specific transformation logic that scales without increasing architectural complexity.
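One common way to keep tenant-specific transformation logic from increasing architectural complexity, as the last bullet asks, is a registry of per-tenant overrides with a shared default; a minimal sketch with hypothetical tenant and field names:

```python
TRANSFORMS = {}

def tenant_transform(tenant_id):
    """Decorator that registers a tenant-specific transform. Tenants
    without an override fall back to the default pipeline, so onboarding
    a new tenant never forks the core code path."""
    def register(fn):
        TRANSFORMS[tenant_id] = fn
        return fn
    return register

def default_transform(row):
    # Shared normalization applied to every tenant.
    return {**row, "normalized": True}

@tenant_transform("acme")  # "acme" is an illustrative tenant id
def acme_transform(row):
    # Tenant-specific logic layered on top of the shared default.
    return {**default_transform(row), "currency": "CAD"}

def run(tenant_id, row):
    """Dispatch a row through the tenant's transform, or the default."""
    return TRANSFORMS.get(tenant_id, default_transform)(row)
```

The core pipeline stays a single code path; each onboarded customer adds at most one registered function rather than a new branch in shared logic.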
Qualifications
- 8+ years in Data Engineering; 3+ years in a leadership/management role.
- Expert in Databricks Workflows, Unity Catalog, and Delta Lake.
- Deep proficiency in PySpark, SQL, and Azure Data ecosystem (Storage, Cosmos DB).
- Proven experience managing multi-tenant or enterprise-scale platforms.
- Strong background in CI/CD, Infrastructure-as-Code, and Data Modeling.
Preferred Skills
- Experience with retail or supply chain data forecasting.
- Expertise in optimizing serverless compute environments.
- Experience maintaining production SLAs for customer-facing data dependencies.
Skills
- Azure
- SQL
- CI/CD
- Infrastructure as Code
- FinOps
- PySpark
- Delta Lake
- Databricks
- ETL
- Data Modeling
- Unity Catalog
- Azure Cosmos DB
Possible interview questions
Assessing experience designing complex Databricks architectures.
Describe your experience implementing Unity Catalog to enforce data isolation in a multi-tenant environment. What were the main challenges you faced?
Assessing cloud infrastructure cost management skills.
What FinOps strategies have you used to optimize DBU consumption and to choose between Serverless and Classic compute?
Testing understanding of data reliability.
How do you build a Data Observability strategy to proactively detect data drift or pipeline failures before they affect business users?
Assessing leadership qualities and development processes.
How do you organize CI/CD processes and the use of Databricks Asset Bundles (DABs) to keep UAT and Production environments identical across a distributed team?
Testing hands-on experience with specific Azure technologies.
Describe your experience implementing idempotent pipelines when ingesting data from Cosmos DB and Azure Table Storage.