- Country
- USA
- Salary
- $225,000 – $300,000

Senior Platform Engineer
An exceptional opening with a very high salary, a strong benefits package (including stock options), and the opportunity to work on a top open-source project in AI infrastructure. The company is in a phase of active growth and is backed by leading investors.
Role Complexity
The high complexity stems from the requirement for deep Python knowledge, experience with high-load distributed systems, and the need to participate in the open-source community. The role involves leading the development of a mission-critical platform component.
Salary Analysis
The offered salary ($225k – $300k) sits at, and even above, the upper end of the market for a Senior Platform Engineer in the Palo Alto area, reflecting the high bar of requirements and the importance of the role.
Cover Letter
I am writing to express my strong interest in the Senior Platform Engineer position at Acryl Data. With over 5 years of experience in building production-grade distributed systems and a deep expertise in Python, I have consistently focused on creating scalable and fault-tolerant architectures. My background in designing clean APIs and managing complex data integration frameworks aligns perfectly with your mission to solve the metadata crisis for enterprise-scale AI systems.
I am particularly drawn to DataHub's role at the intersection of infrastructure and AI. Having worked with high-scale data processing and event-driven architectures, I am excited by the challenge of leading the development of the ingestion framework and schema mapping for diverse data systems. I am eager to bring my technical leadership and passion for open-source innovation to a team that is setting the standard for AI and data context platforms.
Apply to acryldata now
Join the Acryl Data team and build the next generation of metadata infrastructure for the world's largest companies!
Job Description
DataHub is an AI & Data Context Platform adopted by over 3,000 enterprises, including Apple, CVS Health, Netflix, and Visa. Built jointly with a thriving open-source community of 13,000+ members, DataHub's metadata graph provides in-depth context of AI and data assets with best-in-class scalability and extensibility.
The company's enterprise SaaS offering, DataHub Cloud, delivers a fully managed solution with AI-powered discovery, observability, and governance capabilities. Organizations rely on DataHub solutions to accelerate time-to-value from their data investments, ensure AI system reliability, and implement unified governance, enabling AI & data to work together and bring order to data chaos.
The Challenge
As AI and data products become business-critical, enterprises face a metadata crisis:
- No unified way to track the complex data supply chain feeding AI systems
- Engineering teams struggling with data discovery, lineage, and governance
- Organizations needing machine-scale metadata management, not just human-browsable catalogs
Why This Matters
This is where infrastructure meets impact. The metadata layer you'll build will directly power the next generation of AI systems at massive scale. Your code will determine how safely and effectively thousands of organizations deploy AI, affecting millions of users worldwide.
The Role
We're looking for an exceptional Python engineer to lead development of DataHub's ingestion framework – the core that connects diverse data systems and powers our metadata collection capabilities.
You'll Build
- Scalable, fault-tolerant ingestion systems for enterprise-scale metadata
- Clean, intuitive APIs for our connector ecosystem
- Event-driven architectures for real-time metadata processing
- Schema mapping between diverse systems and DataHub's unified model
- Versioning systems for AI assets (training data, model weights, embeddings)
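To make the bullets above concrete, here is a very loose sketch of what a plugin-style connector interface for metadata ingestion might look like. All names here (`Connector`, `MetadataEvent`, `run_ingestion`, the URN format) are hypothetical illustrations, not DataHub's actual API.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import Iterator


@dataclass
class MetadataEvent:
    """A single unit of metadata emitted by a connector (hypothetical shape)."""
    entity_urn: str          # e.g. "urn:li:dataset:(demo,orders,PROD)"
    aspect: str              # e.g. "schemaMetadata"
    payload: dict = field(default_factory=dict)


class Connector(ABC):
    """Minimal plugin contract: each source system implements this interface."""

    @abstractmethod
    def emit(self) -> Iterator[MetadataEvent]:
        """Yield metadata events for everything the source exposes."""


class InMemoryConnector(Connector):
    """Toy connector used to exercise the pipeline without a real source."""

    def __init__(self, tables: list[str]):
        self.tables = tables

    def emit(self) -> Iterator[MetadataEvent]:
        for table in self.tables:
            yield MetadataEvent(
                entity_urn=f"urn:li:dataset:(demo,{table},PROD)",
                aspect="schemaMetadata",
                payload={"fields": []},
            )


def run_ingestion(connector: Connector) -> list[MetadataEvent]:
    """Drive a connector and collect its events. A real framework would batch,
    retry, and ship these to a metadata service instead of returning a list."""
    return list(connector.emit())


events = run_ingestion(InMemoryConnector(["orders", "customers"]))
print([e.entity_urn for e in events])
```

The point of the abstract base class is the "clean, intuitive API" requirement: new connectors only implement `emit`, while batching, retries, and delivery live in the shared framework.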
You Have
- 4+ years building production-grade distributed systems
- Advanced Python expertise with a focus on API design
- Experience with high-scale data processing or integration frameworks
- Strong systems knowledge and distributed architecture experience
- A track record of solving complex technical challenges
Bonus Points
- Experience with DataHub or similar metadata/ETL frameworks (Airflow, Airbyte, dbt)
- Open-source contributions
- Early-stage startup experience
Location and Compensation
Bay Area (hybrid, 3 days in Palo Alto office)
Salary Range: $225,000 to $300,000
DataHub is an equal opportunity employer committed to workplace diversity and inclusion. We provide equal employment opportunities to all employees and applicants without regard to race, religious creed, color, national origin, ancestry, disability, medical condition, genetic information, marital status, sex, gender, gender identity, gender expression, age, sexual orientation, military and veteran status, or any other characteristic protected by federal, state, or local law.
Benefits and Perks
We invest in people so they can do their best work and enjoy doing it. Our benefits reflect the way we build: practical, thoughtful, and designed to support long-term growth.
Competitive compensation
We offer salaries that reflect your skills, experience, and the impact you make. You bring value—we make sure you're recognized for it.
Equity for everyone
Every team member receives an ownership stake in the company. When we grow, you grow with us.
Remote Work
All roles are remote unless otherwise specified in the job description. Review the job description to confirm if the role you are interested in is remote or hybrid.
Location flexibility
Home office, coworking space, or something in between? We support your ideal setup. You’ll receive a monthly coworking stipend to use whenever you need a change of pace or in-person collaboration time.
Comprehensive health coverage
Your well-being matters. We cover 99% of medical, dental, and vision premiums for employees, and 65% for dependents.
Flexible savings accounts
We offer FSAs to help cover planned or unexpected healthcare costs. You can also opt into a Dependent Care FSA to support family needs.
Support for every path to parenthood
Through Carrot Fertility, we provide inclusive fertility benefits and family-forming support. All U.S. employees have access, regardless of age, gender identity, or family structure.
Time off that works for you
We trust you to take the time you need. Our unlimited PTO and sick leave policy is designed for flexibility, rest, and real life.
Why Join Us
DataHub is at a rare inflection point: we’ve achieved product-market fit, earned the trust of leading enterprises, and secured backing from top-tier investors like Bessemer Venture Partners and 8VC. The context platform market is expected to grow from $1B to $9B in the next five years—and we’re leading the way.
By joining our team, you’ll:
- Tackle high-impact challenges at the heart of enterprise AI infrastructure
- Ship production systems that power real-world use cases at global scale
- Collaborate with a high-caliber team of builders who’ve scaled some of the most influential data tools in the world
- Build the next generation of AI-native data systems, including conversational agents, intelligent classification, automated governance, and more
If you're passionate about technology, enjoy working with customers, and want to be part of a fast-growing company changing the industry, we want to hear from you!
Skills
- Python
- API Design
- Distributed Systems
- ETL
- Airflow
- Airbyte
- dbt
- Metadata Management
- Event-Driven Architecture
Possible Interview Questions
Tests depth of Python knowledge and the ability to design extensible systems.
Tell us about your experience designing a plugin architecture or a data ingestion framework in Python. What challenges did you face in maintaining backward compatibility?
Assesses distributed-systems skills, which are critical for DataHub.
How would you design a fault-tolerant metadata collection system that must process events from thousands of different sources in real time?
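One way to ground an answer to this question is a retry-with-dead-letter loop: transient failures are retried, while a persistently failing ("poison") event is quarantined so it cannot stall the stream. This is a minimal sketch with hypothetical names; a real system would sit on a durable broker such as Kafka, with persistent dead-letter queues.

```python
def process_with_retries(events, handler, max_attempts=3):
    """Process events, retrying failures up to max_attempts. Events that
    exhaust their retries go to a dead-letter list instead of blocking
    the rest of the stream (sketch, not production code)."""
    processed, dead_letter = [], []
    for event in events:
        for attempt in range(1, max_attempts + 1):
            try:
                processed.append(handler(event))
                break
            except Exception:
                if attempt == max_attempts:
                    dead_letter.append(event)
    return processed, dead_letter


# A handler that fails deterministically on one poison event:
def handler(event):
    if event == "bad":
        raise ValueError("unparseable metadata")
    return event.upper()


ok, dead = process_with_retries(["a", "bad", "b"], handler)
print(ok, dead)  # healthy events keep flowing; the poison event is quarantined
```

An interviewer would likely probe further: idempotent handlers, exponential backoff between attempts, and how to replay the dead-letter queue after a fix ships.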
Tests understanding of the product's domain (metadata and AI).
What do you see as the main challenges in mapping schemas from heterogeneous systems (for example, Snowflake and Kafka) into a unified metadata model?
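The core of this question is type unification: each source system has its own type vocabulary, and the unified model must absorb them without silently losing information. A toy sketch, with invented type tables (not DataHub's real model):

```python
# Hypothetical per-source type tables mapping into one unified vocabulary.
SNOWFLAKE_TO_UNIFIED = {
    "NUMBER": "numeric",
    "VARCHAR": "string",
    "TIMESTAMP_NTZ": "timestamp",
}
KAFKA_AVRO_TO_UNIFIED = {
    "long": "numeric",
    "string": "string",
    "timestamp-millis": "timestamp",
}


def map_schema(columns, type_map):
    """Translate (name, source_type) pairs into the unified model, keeping
    unknown source types explicit instead of silently dropping them."""
    unified = []
    for name, source_type in columns:
        unified.append((name, type_map.get(source_type, f"unknown:{source_type}")))
    return unified


print(map_schema([("id", "NUMBER"), ("geo", "GEOGRAPHY")], SNOWFLAKE_TO_UNIFIED))
# unmapped source types surface as "unknown:..." entries for human review
```

The interesting follow-ups are exactly where this sketch stops: nested and semi-structured types, schema evolution over time, and types with no lossless unified equivalent.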
Assesses experience with the tools listed as bonus points.
Have you worked with Airflow or Airbyte? How would you integrate these tools into the DataHub ecosystem to automate metadata management?
Tests debugging skills in complex environments.
Describe a case where you had to diagnose a hard-to-reproduce performance problem in a distributed system. What tools and techniques did you use?
Similar Jobs
Senior Software Engineer - Devx
Senior Software Engineer (DevOps)
Senior Build & Integration Technician
Staff Site Reliability Engineer, Streaming
Senior SRE - Data
Senior Software Engineer, Command Center