Senior Data Engineer – Airflow
Job Description
**We are:**
**Wizeline**, a global AI-native technology solutions provider, develops cutting-edge, AI-powered digital products and platforms. We partner with clients to leverage data and AI, accelerating market entry and driving business transformation. As a global community of innovators, we foster a culture of growth, collaboration, and impact. With the right people and the right ideas, there’s no limit to what we can achieve.
**Are you a fit?**
Sounds awesome, right? Now, let’s make sure you’re a good fit for the role:
We are seeking a **Senior Data Engineer** with strong expertise in **Apache Airflow** to play a key role in a critical data platform migration initiative. In this hands-on senior role, you will help drive the transition from **PySpark, Athena, and BigQuery** to a modern data stack centered on **Snowflake and dbt**, using **Airflow** for orchestration. You will contribute to defining scalable data architectures, ensuring data quality and governance, and collaborating closely with cross-functional teams to shape the future of our data platform.
**Location & Work Model**
- **Location:** Medellín or Bogotá, Colombia
- **Work model:** Hybrid – 3 days onsite and 2 days remote (home office)
**Key Responsibilities**
**Migration Strategy & Execution**
- Contribute to and execute the data migration strategy from **BigQuery, Athena, and PySpark** to **Snowflake**.
- Design, develop, and maintain scalable **ELT pipelines** using **Apache Airflow** for orchestration and **dbt** for data transformation.
- Implement robust data validation, reconciliation, and monitoring processes to ensure data integrity and accuracy during and after migration.
**Data Architecture & Engineering**
- Design and optimize data models and schemas in **Snowflake**, applying best practices for performance, cost efficiency, and maintainability.
- Develop reusable, well-tested, and well-documented data transformations and pipelines.
- Apply data engineering best practices, coding standards, and CI/CD pipelines in the dbt/Snowflake environment.
- Partner with data architects, product owners, and stakeholders to translate business requirements into reliable technical solutions.
**Collaboration & Technical Leadership**
- Provide technical guidance through code reviews, design discussions, and knowledge sharing.
- Support and mentor other data engineers, fostering a culture of technical excellence and continuous improvement.
- Collaborate with analytics, data science, and operations teams to ensure data solutions meet downstream needs.
**Data Governance & Quality**
- Ensure compliance with data security, privacy, and governance standards.
- Proactively identify and resolve data quality issues, performance bottlenecks, and pipeline failures.
**Stakeholder Communication**
- Communicate technical concepts, progress, and risks clearly to both technical and non-technical stakeholders.
- Contribute to planning, estimation, and delivery of data engineering initiatives in an agile environment.
**Must-have Skills**
- **Bachelor’s Degree** in Computer Science, Engineering, or a related quantitative field.
- **7+ years of experience** in data engineering or related roles.
- Strong hands-on experience with **Apache Airflow** for workflow orchestration and pipeline management.
- Extensive experience with **Snowflake**, including data modeling, performance tuning, and cost optimization.
- Strong command of **dbt (data build tool)** for data modeling, transformation, testing, and documentation.
- Expert-level **SQL** skills, including query optimization and performance tuning.
- Proficiency in **Python** for data processing, automation, and scripting.
- Strong understanding of data warehousing concepts, **ELT/ETL pipelines**, and dimensional modeling.
- Familiarity with **PySpark, Athena, and BigQuery** as source systems.
- Experience with **Git/GitHub** and CI/CD methodologies.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work independently and collaboratively in a fast-paced, agile environment.
- **English level:** Fluent (spoken and written).
**Nice-to-have:**
- **AI Tooling Proficiency:** Ability to leverage one or more AI tools to optimize and augment day-to-day work, including drafting, analysis, research, or process automation, and to recommend effective AI use and identify opportunities to streamline workflows.
- Previous experience participating in large-scale cloud data migrations.
- Exposure to data governance frameworks and data quality tooling.
- Experience working closely with analytics and data science teams.
**What we offer:**
- A High-Impact Environment
- Commitment to Professional Development
- Flexible and Collaborative Culture
- Global Opportunities
- Vibrant Community
- Total Rewards
*Specific benefits are determined by the employment type and location.
Find out more about our culture here.