Data Engineer SSr (GCP) CDMX
## **Data Engineer SSr – GCP**
At Globant, we are working to make the world a better place, one step at a time. We enhance business development and enterprise solutions to prepare them for a digital future. With a diverse and talented team present in more than 30 countries, we are strategic partners to leading global companies in their business process transformation.
We are looking for a **Data Engineer (SSr)** who shares our passion for innovation and change. This role is critical to designing and evolving scalable data solutions in a cloud environment for one of our key clients in the financial industry.
### **What will help you succeed:**
- Experience working with **cloud data platforms (GCP)**
- Knowledge of **Data Warehouse and Data Lake architectures**
- Hands-on experience with **ETL / ELT processes and data pipelines**
- Working knowledge of **SQL**
- Programming experience in **Python and/or Java**
- Experience with tools such as:
- BigQuery
- DataFlow
- Composer / Airflow
- Pub/Sub
- Cloud Storage
- Understanding of **data modeling and scalable data solutions**
- Experience working with **code repositories and best practices**
- Exposure to **infrastructure as code (Terraform is a plus)**
### **Nice to have:**
- GCP certifications (e.g., Professional Data Engineer)
- Basic English level (A2 or above)
### **Additional details:**
- Location: **Mexico City (Hybrid model)**
At Globant we believe that an inclusive culture and a diverse environment make us stronger. We encourage an inclusive spirit as our global footprint expands, and we seek to create a place of inspiration and growth for everyone: a safe space, grounded in equity, where every career can be promoted and developed equally. There is no innovation without diversity and no improvement without plurality.
**Are you ready?**