About R2
R2 Capital provides flexible working capital to SMBs in Latin America. We embed our technology into payment processors, POS systems, and marketplace apps, and collect repayment as a percentage of each business's sales.
SMBs make up over 90% of companies, yet they face a trillion-dollar credit gap. At R2, we believe that small and medium businesses are the productive engine of society.
Our mission is to unlock their potential by providing financial solutions tailored to their needs. We are reimagining the financial infrastructure of Latin America - one where SMBs' financial needs are met without ever having to go to a bank.
About the role
Skills: Python, SQL, Data Warehousing, ETL, Data Modeling, Amazon Web Services (AWS)
Who we are
We integrate with some of the largest technology platforms in Latin America and embed tailored financial services that SMBs can then leverage. We're a data-first company: data is the core of our product and the lifeblood of all our decision making. Data engineers sit at the helm of R2, building the systems that enable new financial opportunities for thousands of companies across Latin America.
What you’ll work on
- Design and maintain our data pipelines and data warehouse.
- Create robust ingestion pipelines across integrations with our partners.
- Build processes required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
- Implement data models in the data warehouse that standardize and denormalize data from multiple sources.
- Automate everything by implementing and maintaining a robust data orchestration system.
- Establish a secure data environment to comply with regulations across different countries.
- Build integration components that can be generalized across multiple platforms.
- Collaborate with our Data Scientists to evaluate and deploy frameworks for machine learning models at scale.
Who you are
- You have at least 3 years of experience with data engineering and/or data infrastructure/platform work.
- You have experience contributing to the architecture and design of new and existing data processing systems (architecture, reliability, monitoring, and scaling).
- You have strong software engineering skills and are very comfortable in Python.
- You are proficient in ANSI SQL to handle complex analysis, including large joins, window functions, and arrays.
- You have experience with ELT patterns and leverage SQL semantics to perform transformations within the data warehouse.
- You are excited to work with modern data orchestration systems (such as Airflow, dbt, or equivalents).
- You have worked on integration projects and are comfortable consuming data through APIs, SFTP, or plain CSV files.
- You have collaborated closely with analysts, data scientists, and machine learning engineers as your core "customers".
- You want to automate everything.
- You speak Spanish and/or Portuguese (and are ideally based somewhere in Latin America).
If interested, please reach out directly to Javier: js[at]r2capital.co
Technology
- Go
- React
- Python
- SQL
- AWS
- Hasura
- Docker / Kubernetes