Sr Data Engineer
Ria Money Transfer
Posted: 2 days ago
City: Santiago, Región Metropolitana
Contract type: Full-time

Description
Data Engineer (Python Developer)
Ria Money Transfer, a business segment of Euronet Worldwide, Inc. (NASDAQ: EEFT), delivers innovative financial services, including fast, secure, and affordable global money transfers, along with currency exchange, mobile top-up, bill payment, and check cashing services to millions of customers, offering a reliable omnichannel experience. With over 600,000 locations in nearly 200 countries and territories, our purpose remains to open ways for a better everyday life. We believe we can create a world in which people are empowered to build the life they dream of, no matter who they are or where they are. One customer, one family, one community at a time.
About This Role
We are looking for an experienced Data Engineer to join the Data team. If you are excited by raw data and its boundless potential to be transformed into useful shapes and forms, this is your dream job. Couple that with a best-in-class cloud data architecture, and you are off to the races. You will support a data ecosystem fueled by 280 million annual visitors to our web and app properties, combined with an enterprise-grade, event-based stack, so the opportunities really are limitless. This role gives you the freedom to explore your data passions, connecting disparate data sources to create meaningful business insights and drive rapid product improvement and growth. If you are also an API guru who loves designing cloud-native services, you will feel right at home.
Responsibilities
- Design, develop, and maintain high-performance API services to address data needs
- Build and maintain ETL pipelines using Python, leveraging AWS ECS for real-time processing and AWS Glue and Lambda for batch processing (a minimal illustrative sketch follows this list)
- Manage raw and processed data in Amazon S3, supporting data lake architecture
- Query and analyze data using Amazon Athena for serverless analytics
- Oversee relational data stores with Amazon RDS (PostgreSQL, MySQL, etc.)
- Collaborate with data scientists, engineers, and architects to create data-driven solutions
- Optimize and monitor all components for cost efficiency, performance, reliability, and security
- Implement best practices in cloud-native development, CI/CD, and automation
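For illustration only, the sketch below shows the kind of batch ETL step referenced above: a Python Lambda handler that reads a raw CSV object from S3, applies a simple filter, and writes the result to a processed zone of the data lake. The bucket names, key layout, and the example `status` column are hypothetical assumptions, not details of Ria's actual pipeline.

```python
# Illustrative sketch only: a minimal AWS Lambda handler for one batch ETL step.
# Bucket names, key layout, and the 'status' column are hypothetical assumptions.
import csv
import io

import boto3

s3 = boto3.client("s3")

RAW_BUCKET = "example-raw-bucket"              # hypothetical raw zone of the data lake
PROCESSED_BUCKET = "example-processed-bucket"  # hypothetical processed zone


def handler(event, context):
    """Triggered by S3 'object created' events on the raw bucket."""
    for record in event.get("Records", []):
        key = record["s3"]["object"]["key"]

        # Read the raw CSV object from S3.
        body = s3.get_object(Bucket=RAW_BUCKET, Key=key)["Body"].read().decode("utf-8")
        rows = list(csv.DictReader(io.StringIO(body)))

        # Example transform: keep only rows marked as completed (hypothetical schema).
        processed = [r for r in rows if r.get("status") == "completed"]
        if not processed:
            continue

        # Write the transformed rows under a processed/ prefix derived from the source key.
        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=processed[0].keys())
        writer.writeheader()
        writer.writerows(processed)
        s3.put_object(
            Bucket=PROCESSED_BUCKET,
            Key=f"processed/{key}",
            Body=out.getvalue().encode("utf-8"),
        )

    return {"status": "ok", "records": len(event.get("Records", []))}
```

Equivalent batch steps can also run as AWS Glue jobs, and data written to the processed zone can then be queried with Amazon Athena, in line with the stack described above.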
Requirements
- Degree in Computer Science, Software Engineering, or a related field
- Proficiency in Python with experience in object-oriented and service-oriented development
- Strong knowledge of SQL, data warehousing, and managing structured and unstructured data
- Excellent communication and collaboration skills working with internal stakeholders
- Comfortable working in an Agile environment, using Jira for project management and sprint tracking
- Nice to have: Experience with AWS Glue, Lambda, S3, Athena, RDS, DynamoDB and ECS
- Nice to have: Familiarity with containerization technologies such as Docker and Docker Compose
- Nice to have: Knowledge of data lake concepts, cloud security best practices, and compliance standards
- Nice to have: A strong understanding of privacy, data security, and regulatory compliance
Benefits
- Annual salary increase review (for eligible employees)
- End-of-year bonus (for eligible employees)
- ESPP (Employee Stock Purchase Plan)
- Paid day off for your birthday
- 15 vacation days per year (counted as business days, Monday to Friday)
- Guaranteed insurance for employees (health, dental, and life)
- No fees when using Ria services/wire transfers