15 changes: 15 additions & 0 deletions .env
@@ -0,0 +1,15 @@
# Environment variables declared in this file are NOT automatically loaded by Prisma.
# Please add `import "dotenv/config";` to your `prisma.config.ts` file, or use the Prisma CLI with Bun
# to load environment variables from .env files: https://pris.ly/prisma-config-env-vars.

# Prisma supports the native connection string format for PostgreSQL, MySQL, SQLite, SQL Server, MongoDB and CockroachDB.
# See the documentation for all the connection string options: https://pris.ly/d/connection-strings

# The following URL points at the local PostgreSQL instance started via Docker Compose.
# It contains only default local credentials and no sensitive information.

DATABASE_URL="postgresql://postgres:postgres@localhost:5432/postgres"
KAFKA_BROKERS="localhost:9092"
DELAY_MS="0"
PORT="3000"
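As an aside, a minimal sketch of how the API might read these variables at startup — the helper name `loadConfig` and the fallback defaults are assumptions for illustration, not the project's actual code:

```javascript
// Hypothetical helper: read the service configuration from process.env,
// falling back to the defaults shown in .env above.
function loadConfig(env = process.env) {
  return {
    databaseUrl: env.DATABASE_URL || "postgresql://postgres:postgres@localhost:5432/postgres",
    kafkaBrokers: (env.KAFKA_BROKERS || "localhost:9092").split(","), // comma-separated list
    delayMs: Number(env.DELAY_MS || "0"), // anti-fraud delay in milliseconds
    port: Number(env.PORT || "3000"),
  };
}
```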
7 changes: 6 additions & 1 deletion .gitignore
@@ -69,7 +69,6 @@ typings/
.yarn-integrity

# dotenv environment variables file
.env
.env.test

# parcel-bundler cache (https://parceljs.org/)
@@ -102,3 +101,9 @@ dist

# TernJS port file
.tern-port

prisma/migrations

/src/generated/prisma

/src/generated/prisma
294 changes: 241 additions & 53 deletions README.md
@@ -1,82 +1,270 @@
# Yape Code Challenge 🚀

A **financial transactions** microservice with anti-fraud validation, built on **Node.js, Prisma, PostgreSQL, and Kafka**.
Each transaction moves through a status flow: `pending`, then `approved` or `rejected`.

## 📑 Table of Contents

1. [Prerequisites](#prerequisites)
2. [Clone repository](#clone-repository)
3. [Running Services](#running-services)
4. [Local Installation](#local-installation)
5. [Running API with Docker](#running-api-with-docker)
6. [Endpoints](#endpoints)
7. [Testing](#testing)
8. [Pending Tasks and Delays](#pending-tasks-and-delays)
9. [Swagger Documentation](#swagger-documentation)
10. [Important Modifications and Commands](#important-modifications-and-commands)
11. [Additional Resources](#additional-resources)
12. [Optional: High Volume Scenarios](#optional-high-volume-scenarios)
13. [Testing Postman](#testing-postman)

## ❗ Problem

Every time a financial transaction is created, it must be validated by our anti-fraud microservice, which then sends a message back to update the transaction status.
There are three transaction statuses:

1. pending
2. approved
3. rejected

Every transaction with a value greater than 1000 is rejected.

```mermaid
flowchart LR
    Transaction -- Save Transaction with pending Status --> transactionDatabase[(Database)]
    Transaction -- Send transaction Created event --> Anti-Fraud
    Anti-Fraud -- Send transaction Status Approved event --> Transaction
    Anti-Fraud -- Send transaction Status Rejected event --> Transaction
    Transaction -- Update transaction Status event --> transactionDatabase[(Database)]
```

## 🛠 Prerequisites

* Node.js ≥ 20
* Docker + Docker Compose
* Git
* Basic knowledge of REST APIs

## 🔗 Clone repository

Clone the repository:

```
git clone https://github.com/yaperos/app-nodejs-codechallenge.git
cd app-nodejs-codechallenge
```

## 🐳 Running Services

Start the supporting services (PostgreSQL, Zookeeper, Kafka):

```
docker compose up -d postgres zookeeper kafka
```

## 📦 Local Installation

1. Install dependencies:

```
npm install
```

2. Initialize Prisma:

```
npx prisma generate
npx prisma migrate dev --name init
```

⚠️ If the database already contains data:

```
npx prisma migrate reset
npx prisma migrate dev --name init
```

## 🐳 Running API with Docker

Start the API:

```
docker compose up -d api
```

Check the API logs (wait for the startup message):

```
docker logs -f yape-api
```

It should print:

```
🚀 Server running on port 3100
```

Once started, the API is available at:

```
http://localhost:3100/transactions
http://localhost:3100/transactions/{{transactionId}}
http://localhost:3100/api-docs
```

## 📍 Running the API Locally

Normal server:

```
npm run dev
```

Server with a 5-second delay in the anti-fraud validation:

```
# Set DELAY_MS="5000" in .env, then:
npm run dev
```

Default local server: http://localhost:3000

## 📡 Endpoints

1. Create a transaction (POST /transactions):

```json
{
  "accountExternalIdDebit": "123-1345-1245-222",
  "accountExternalIdCredit": "123-1345-1245-111",
  "tranferTypeId": 1,
  "value": 500
}
```

Responses:

```
201 Created
{
  "transactionExternalId": "d6d95759...",
  "transactionType": { "name": "" },
  "transactionStatus": { "name": "" },
  "value": 500,
  "createdAt": "2026-01-03T09:04:41.455Z"
}

400 Bad Request
{
  "message": "Missing required fields"
}
```

2. Check the status of a transaction (GET /transactions/:transactionId):

```
200 OK
{
"transactionExternalId": "d6d95759...",
"transactionType": { "name": "TRANSFER" },
"transactionStatus": { "name": "approved" },
"value": 500,
"createdAt": "2026-01-03T09:04:41.455Z"
}

404 Not Found
{
"message": "Transaction not found"
}
```

Note: `transactionStatus.name` can be `pending`, `approved`, or `rejected`.
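For reference, the status rule can be sketched as a pair of pure functions. This is an illustrative sketch using the challenge's rejection rule (value greater than 1000 is rejected), not the service's actual code; the function names are hypothetical:

```javascript
// Illustrative sketch of the anti-fraud decision (not the service's actual code).
// Rule from the challenge: any transaction with value > 1000 is rejected.
const FRAUD_THRESHOLD = 1000;

// A new transaction always starts as "pending".
function initialStatus() {
  return "pending";
}

// The anti-fraud service later resolves "pending" to "approved" or "rejected".
function evaluateTransaction(value) {
  return value > FRAUD_THRESHOLD ? "rejected" : "approved";
}
```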

## 🧪 Testing

Run the tests:

```
npm test
```
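As an example of the kind of unit these tests might cover, here is a hypothetical sketch of the required-field validation behind the API's 400 "Missing required fields" response (the helper name and return shape are assumptions, not the project's actual code):

```javascript
// Hypothetical request validation producing the 400 "Missing required fields" error.
const REQUIRED_FIELDS = [
  "accountExternalIdDebit",
  "accountExternalIdCredit",
  "tranferTypeId", // field name as spelled in this API's contract
  "value",
];

function validateTransactionBody(body) {
  const missing = REQUIRED_FIELDS.filter((f) => body[f] === undefined);
  if (missing.length > 0) {
    return { status: 400, error: { message: "Missing required fields" } };
  }
  return { status: 201 };
}
```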

## ⏳ Pending Tasks and Delays

Simulate pending transactions with a delay:

```
# Set DELAY_MS to a non-zero value in .env, e.g. DELAY_MS="5000", then:
npm run dev
```
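One way `DELAY_MS` might be applied is to sleep before the anti-fraud evaluation, so transactions stay `pending` for the configured time. A minimal sketch under that assumption (`parseDelay`, `sleep`, and `evaluateWithDelay` are hypothetical names, not the project's actual code):

```javascript
// Hypothetical sketch of how DELAY_MS could gate the anti-fraud evaluation.
function parseDelay(raw) {
  const ms = Number(raw);
  return Number.isFinite(ms) && ms >= 0 ? ms : 0; // fall back to no delay
}

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// While the sleep is pending, the transaction stays in "pending" status.
async function evaluateWithDelay(value, delayMs) {
  await sleep(delayMs);
  return value > 1000 ? "rejected" : "approved";
}
```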

## 📖 Swagger Documentation

Configuration file:

```
src/swagger-docs/transactions.js
```

Access the documentation:

```
Docker: http://localhost:3100/api-docs
Local:  http://localhost:3000/api-docs
```

## ⚡ Important Modifications and Commands

Every time `schema.prisma` is updated:

```
npx prisma migrate dev
npx prisma generate
```

Scripts in `package.json`:

```
"dev": "nodemon src/server.js",
"test": "jest"
```

## 🔗 Additional Resources

Postman collection:

[yape-api.postman_collection.json](./backup/yape-api.postman_collection.json)

- **Database model**
  ![Database](./backup/database.png)

- **API architecture**
  ![Architecture](./backup/architecture.png)

Note: Make sure PostgreSQL and Kafka are running before using the collection.

## 💡 Optional: High Volume Scenarios

For **high transaction volume** scenarios, the following strategies could be considered:

1. **Databases optimized for concurrent writes**:
   - Use partitioning to distribute the load.
   - Index only the necessary fields to reduce overhead on bulk writes.

2. **Caching for frequent reads**:
   - Use Redis for fast lookups of recent transaction statuses.

3. **Event-driven architecture**:
   - Kafka already decouples the anti-fraud validation service and absorbs message spikes without blocking the main API.
   - Separate **topics per transaction type** can be used to balance the load.

4. **Batch processing and queues**:
   - Process transactions in batches when volume is very high, reducing database overhead.
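The batch-processing idea above can be sketched with a simple helper. This is illustrative only; the `chunk` helper and the batch size are assumptions, not part of the project:

```javascript
// Illustrative batching helper: split pending transactions into fixed-size
// batches so the database receives fewer, larger writes
// (each batch could then be persisted with a single multi-row insert).
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}
```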


---
## 🔗 Testing Postman

- **LOCAL**

![Local-create-transaction-250](./backup/Pruebas/Local/Local-create-transaction-250.png)

![Local-get-transaction-250-pending](./backup/Pruebas/Local/Local-get-transaction-250-pending.png)

![Local-get-transaction-250-approved](./backup/Pruebas/Local/Local-get-transaction-250-approved.png)


![Local-create-transaction-500](./backup/Pruebas/Local/Local-create-transaction-500.png)

![Local-get-transaction-500-approved](./backup/Pruebas/Local/Local-get-transaction-500-approved.png)


![Local-create-transaction-1500](./backup/Pruebas/Local/Local-create-transaction-500.png)

![Local-get-transaction-1500-rejected](./backup/Pruebas/Local/Local-get-transaction-1500-rejected.png)

- **DOCKER**

![Docker-create-transaction-850](./backup/Pruebas/Docker/Docker-create-transaction-850.png)

![Docker-get-transaction-850-approved](./backup/Pruebas/Docker/Docker-get-transaction-850-approved.png)
Binary file added backup/architecture.png
Binary file added backup/database.png
25 changes: 25 additions & 0 deletions backup/yape-api.postman_collection.json
@@ -0,0 +1,25 @@
{
  "info": {
    "_postman_id": "f8d70d3f-1899-4f9e-81bd-10f083f0b15d",
    "name": "yape-api",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json",
    "_exporter_id": "10949300",
    "_collection_link": "https://interstellar-sunset-7218.postman.co/workspace/YAPE~520c7c7a-3777-4ff8-8fcb-b7e84ce0afb9/collection/10949300-f8d70d3f-1899-4f9e-81bd-10f083f0b15d?action=share&source=collection_link&creator=10949300"
  },
  "item": [
    { "name": "Local", "item": [] },
    { "name": "Docker", "item": [] }
  ],
  "variable": [
    { "key": "transactionId", "value": "" }
  ]
}