- Country
- USA
Senior Analytics Engineer
A strong opening at a successful, well-funded fintech unicorn. It offers fully remote work, a competitive salary, stock options, and a modern stack (Trino, dbt, GCP).
Role Difficulty
The high difficulty stems from the requirement for expert-level SQL/dbt skills and the need to guarantee cent-level data accuracy in the financial sector. The role assumes full autonomy and ownership of the entire data transformation layer.
Salary Analysis
A Senior-level position at a US fintech startup typically pays above the global market average. Given the company's recent Series D round, it may offer the upper end of the market range to attract talent.
Cover Letter
I am writing to express my strong interest in the Senior Analytics Engineer position at Alpaca. With over four years of experience in data engineering and a deep focus on the transformation layer, I have a proven track record of building scalable dbt models and managing complex data pipelines. My expertise in SQL, Python, and query optimization aligns perfectly with your mission to provide institutional-grade financial APIs.
In my previous roles, I have successfully owned data products end-to-end, collaborating with finance and operations teams to deliver high-precision reporting. I am particularly impressed by Alpaca's commitment to open-source and its robust GCP-based infrastructure. I am confident that my technical skills in dbt, Trino, and semantic layer modeling will allow me to contribute immediately to your data platform's evolution and maintain the cent-level precision your business requires.
Job Description
Who We Are:
Alpaca is a US-headquartered self-clearing broker-dealer and brokerage infrastructure for stocks, ETFs, options, crypto, fixed income, 24/5 trading, and more. Our recent Series D funding round brought our total investment to over $320 million, fueling our ambitious vision.
Amongst our subsidiaries, Alpaca is a licensed financial services company, serving hundreds of financial institutions across 40 countries with our institutional-grade APIs. This includes broker-dealers, investment advisors, wealth managers, hedge funds, and crypto exchanges, totalling over 9 million brokerage accounts.
Our global team is a diverse group of experienced engineers, traders, and brokerage professionals who are working to achieve our mission of opening financial services to everyone on the planet. We're deeply committed to open-source contributions and fostering a vibrant community, continuously enhancing our award-winning, developer-friendly API and the robust infrastructure behind it.
Alpaca is proudly backed by top-tier global investors, including Portage Ventures, Spark Capital, Tribe Capital, Social Leverage, Horizons Ventures, Unbound, SBI Group, Derayah Financial, Elefund, and Y Combinator.
Our Team Members:
We're a dynamic team of 230+ globally distributed members who thrive working from our favorite places around the world, with teammates spanning the USA, Canada, Japan, Hungary, Nigeria, Brazil, the UK, and beyond!
We're searching for passionate individuals eager to contribute to Alpaca's rapid growth. If you align with our core values—Stay Curious, Have Empathy, and Be Accountable—and are ready to make a significant impact, we encourage you to apply.
About the Role:
We are seeking an Analytics Engineer to own and execute the vision for our data transformation layer. You will be at the heart of our data platform, which processes hundreds of millions of events daily from a wide array of sources, including transactional databases, API logs, CRMs, payment systems, and marketing platforms.
You will join our 100% remote team and work closely with Data Engineers (who manage data ingestion) and Data Scientists and Business Users (who consume your data models). Your primary responsibility will be to use dbt and Trino on our GCP-based, open-source data infrastructure to build robust, scalable data models. These models are critical for stakeholders across the company—from finance and operations to the executive team—and are delivered via BI tools, reports, and reverse ETL systems.
What You'll Do:
- Own the Transformation Layer: Design, build, and maintain scalable data models using dbt and SQL to support diverse business needs, from monthly financial reporting to near-real-time operational metrics.
- Set Technical Standards: Establish and enforce best practices for data modelling, development, testing, and monitoring to ensure data quality, integrity (up to cent-level precision), and discoverability.
- Enable Stakeholders: Collaborate directly with finance, operations, customer success, and marketing teams to understand their requirements and deliver reliable data products.
- Integrate and Deliver: Create repeatable patterns for integrating our data models with BI tools and reverse ETL processes, enabling consistent metric reporting across the business.
- Ensure Quality: Champion high standards for development, including robust change management, source control, code reviews, and data monitoring as our products and data evolve.
What You Need (Must-Haves):
- 4+ years of experience in analytics engineering or data engineering with a strong focus on the "T" (transformation) in ELT.
- Proven track record of owning data products end-to-end, applying analytics and data engineering best practices to ensure data quality, scalability, and robust data models.
- Comfortable working with ambiguity and collaborating with stakeholders to define requirements; able to take ownership with minimal oversight in a fast-paced environment.
- Experience proactively identifying and implementing improvements to data warehouse performance and ETL efficiency.
- Technical Versatility:
- Expert-level SQL and dbt skills for complex queries and data transformations.
- Proficiency in Python for transformations that extend beyond SQL.
- Hands-on experience with query optimization across OLTP and OLAP systems (e.g., Postgres, Iceberg).
- Proficiency with Semantic Layer modelling (e.g. Cube, dbt Semantic Layer).
- Experience owning CI/CD workflows and establishing team-wide standards for version control and code review (e.g., Git).
- Familiarity with cloud environments (GCP or AWS).
Nice to Haves:
- Experience with data ingestion tools (e.g., Airbyte) and orchestration tools (e.g., Airflow).
- Domain experience in brokerage operations, or a passion for financial markets and modelling financial datasets.
How We Take Care of You:
- Competitive Salary & Stock Options
- Health Benefits
- New Hire Home-Office Setup: One-time USD $500
- Monthly Stipend: USD $150 per month via a Brex Card
Alpaca is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse workforce.
Skills
- Git
- Python
- SQL
- dbt
- CI/CD
- PostgreSQL
- Google Cloud Platform
- Airflow
- Apache Iceberg
- Airbyte
- Trino
- Cube
Possible Interview Questions
Tests depth of experience with the core transformation tool.
Tell us about your experience implementing the dbt Semantic Layer or similar tools (e.g., Cube). What core problems do they solve?
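The core problem a semantic layer solves is metric drift: if every dashboard writes its own aggregation, definitions diverge. A minimal sketch of the idea in Python (the metric names and tables here are invented for illustration; real tools like Cube or the dbt Semantic Layer use declarative YAML definitions, not code like this):

```python
# Toy semantic layer: metric definitions live in one place and are
# compiled to SQL, so every consumer gets the same metric logic.
METRICS = {
    # metric name -> (aggregation expression, base table) — hypothetical examples
    "total_deposits": ("SUM(amount_cents)", "fct_transfers"),
    "active_accounts": ("COUNT(DISTINCT account_id)", "fct_trades"),
}

def compile_metric(name, where=None):
    """Render the one canonical SQL query for a named metric."""
    expr, table = METRICS[name]
    sql = f"SELECT {expr} AS {name} FROM {table}"
    if where:
        sql += f" WHERE {where}"
    return sql

print(compile_metric("total_deposits", "settled_at >= DATE '2024-01-01'"))
```

An answer along these lines, plus a concrete story about retiring duplicated dashboard SQL, tends to land well.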
Important when working with financial data, where errors are unacceptable.
How do you ensure cent-level data accuracy when processing hundreds of millions of events? Describe your approach to testing and quality monitoring.
Tests query-optimization skills in the company's stack (Trino/Iceberg).
What query-optimization strategies do you apply when working with large data volumes in OLAP engines such as Trino querying Iceberg tables?
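A key lever in this stack is pruning: Iceberg stores min/max statistics per data file, so a predicate on a partition or sort column lets the engine skip files whose ranges cannot match, reading a fraction of the table. A toy simulation of that file-skipping logic (the file layout and column names are invented):

```python
from datetime import date

# Hypothetical Iceberg-style manifest: each data file carries
# min/max stats for the filter column.
files = [
    {"path": "data/part-01.parquet", "min_day": date(2024, 1, 1), "max_day": date(2024, 1, 31)},
    {"path": "data/part-02.parquet", "min_day": date(2024, 2, 1), "max_day": date(2024, 2, 29)},
    {"path": "data/part-03.parquet", "min_day": date(2024, 3, 1), "max_day": date(2024, 3, 31)},
]

def prune(files, lo, hi):
    """Keep only files whose [min, max] range can overlap the predicate."""
    return [f["path"] for f in files if f["max_day"] >= lo and f["min_day"] <= hi]

# WHERE day BETWEEN 2024-02-10 AND 2024-02-20 -> only part-02 is scanned.
print(prune(files, date(2024, 2, 10), date(2024, 2, 20)))
```

Good answers usually pair this with the other standard levers: partitioning and sorting to make pruning effective, avoiding `SELECT *`, pre-aggregating hot paths into dbt models, and reading `EXPLAIN` plans to spot full scans and skewed joins.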
Assesses the ability to work amid the ambiguity typical of a fast-growing startup.
Describe a time you had to gather requirements from stakeholders under high uncertainty. How did you turn them into a technical specification?
Tests engineering culture and DevOps skills in a data context.
How do you organize CI/CD for dbt projects to minimize risk when deploying new models?
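A common pattern to mention here is "slim CI": compare the branch against the production manifest and build only the modified models plus everything downstream of them (dbt's `state:modified+` selector), with tests gating the merge. A toy version of that selection logic over an invented model graph:

```python
from collections import deque

# Hypothetical dbt DAG: model -> models that depend on it.
downstream = {
    "stg_transfers": ["fct_transfers"],
    "fct_transfers": ["finance_report"],
    "finance_report": [],
    "stg_trades": ["fct_trades"],
    "fct_trades": [],
}

def select_modified_plus(modified):
    """Mimic `dbt build --select state:modified+`: changed models plus all descendants."""
    selected, queue = set(modified), deque(modified)
    while queue:
        node = queue.popleft()
        for child in downstream.get(node, []):
            if child not in selected:
                selected.add(child)
                queue.append(child)
    return sorted(selected)

# Changing stg_transfers rebuilds it plus fct_transfers and finance_report,
# while the trades models are left untouched.
print(select_modified_plus(["stg_transfers"]))
```

In practice this sits in a CI pipeline that builds the selected models into a disposable schema, runs their tests, and requires code review before anything reaches production.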