
Data Engineer (Snowflake)

Bogotá, Colombia
On-site
11 days ago
Full-time
Semi-Senior (2-5 years)
Python
AWS
SQL

Job Description

**We are:**

Wizeline, a global AI-native technology solutions provider, develops cutting-edge, **AI-powered** digital products and platforms. We partner with clients to leverage data and AI, accelerating market entry and driving business transformation. As a global community of innovators, we foster a culture of **growth, collaboration,** and **impact.**

**With the right people and the right ideas, there's no limit to what we can achieve.**

**Are you a fit?**

Sounds awesome, right? Now, let's make sure you're a good fit for the role:

## **Must-have Skills**

- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.

- 3+ years of experience in data engineering, with a focus on building and maintaining scalable data pipelines.

- **Solid experience with data migration projects and working with large datasets.**

- **Strong hands-on experience with Snowflake, including data loading, querying, and performance optimization.**

- **Proficiency in dbt (data build tool) for data transformation and modeling.**

- **Proven experience with Apache Airflow for scheduling and orchestrating data workflows.**

- Expert-level SQL skills, including complex joins, window functions, and performance tuning.

- Proficiency in Python for data manipulation, scripting, and automating edge cases.

- Familiarity with PySpark, AWS Athena, and Google BigQuery (source systems).

- Understanding of data warehousing concepts, dimensional modeling, and ELT principles.

- Knowledge of building CI/CD pipelines for code deployment.

- Experience with version control systems (e.g., GitHub).

- Excellent problem-solving, analytical, and communication skills.

- Ability to work independently and as part of a collaborative team in an agile environment.

- Fluent spoken and written English; an effective communicator.
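
To give a flavor of the "expert-level SQL" bullet above, here is a minimal sketch of a window-function query of the kind the role calls for: ranking rows within a partition and keeping the top row per group. It runs against an in-memory SQLite database purely to be self-contained; the `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` pattern is standard SQL and works the same way in Snowflake. The `orders` table and its columns are illustrative, not taken from the posting.

```python
import sqlite3

# In-memory SQLite stands in for a warehouse; the table and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer_id INTEGER, order_id INTEGER, amount REAL);
INSERT INTO orders VALUES
  (1, 101, 50.0), (1, 102, 75.0),
  (2, 201, 20.0), (2, 202, 90.0), (2, 203, 10.0);
""")

# Rank each customer's orders by amount and keep only the largest one:
# a classic window-function pattern (ROW_NUMBER over a partition).
rows = conn.execute("""
SELECT customer_id, order_id, amount
FROM (
  SELECT customer_id, order_id, amount,
         ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS rn
  FROM orders
)
WHERE rn = 1
ORDER BY customer_id
""").fetchall()

print(rows)  # [(1, 102, 75.0), (2, 202, 90.0)]
```

The same shape (partition, order, filter on the rank) underpins deduplication and "latest record per key" steps that come up constantly in migration and ELT work.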

**What we offer:**

- A High-Impact Environment

- Commitment to Professional Development

- Flexible and Collaborative Culture

- Global Opportunities

- Vibrant Community

- Total Rewards

- *Specific benefits are determined by the employment type and location.*

Find out more about our culture here.
