Job description
- Own and evolve the company data architecture, including warehouse structure, schema design, and data modeling standards
- Build and maintain a semantic data layer that translates raw data into consistent, business-ready metrics using dbt
- Design, build, and monitor reliable data pipelines and ETL processes that deliver data to downstream consumers
- Implement data quality checks, observability, and alerting so issues are caught before they reach business users
- Build and maintain a company-wide data catalog, driving data governance, lineage, and ownership practices
What success looks like
- A clear company data catalog exists and is actively used by internal teams
- Data pipelines are reliable and monitored, with failures caught before they reach stakeholders
- Business metrics are standardized and consistently defined across tools and teams
- Data quality issues are caught early through automated checks and alerting
Requirements
- 3+ years of experience in data engineering, analytics engineering, or data platform development
- Strong proficiency in SQL and Python
- Hands-on experience with dbt and a modern data warehouse (Snowflake, BigQuery, or Redshift)
- Experience with a pipeline orchestration tool (Airflow, Dagster, or Prefect)
- Familiarity with event pipelines or streaming data
- Strong communication skills in English