I am a software engineer with backend development experience in Python and Java.
My first exposure to programming was during my Master's studies, where I learned the basics of Java and numerical methods using MATLAB. I also used JavaScript to publish my Master's thesis project online. Eager to explore data-related fields, I began diving deeper into Python.
Since early 2025, I have been working on backend development focused on Artificial Intelligence (AI), primarily using Python and Java.
I strongly believe in continuous learning as a way of life and am currently expanding my skills by studying Golang in my free time.
Below are the tools and technologies I have been using in my work and personal projects:
Beyond technical expertise, I have experience in team management and customer-facing roles, which have strengthened my problem-solving, analytical thinking, collaboration, and communication skills. I am a fast learner, highly organized, and thrive on constructive feedback.
This project implements a payment gateway and background processing system for online photo and portrait orders. Customer photos are initially cached in Redis, so no data is permanently stored until payment is confirmed. Once payment is received, Celery handles storage and owner notification asynchronously in the background, keeping the user experience responsive. This architecture guarantees that only paid orders are ever processed; a sketch of the flow follows the stack list below.
Backend built with FastAPI, exposing asynchronous routes.
Frontend built with React, TypeScript, and Vite.
PostgreSQL as the SQL database, using the asyncpg driver.
Tests with Pytest and Testcontainers.
Pre-commit with Ruff for linting.
CI (Continuous Integration) based on GitHub Actions.
JWT (JSON Web Token) authentication.
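
To make the flow concrete, here is a minimal sketch of the cache-then-confirm pipeline on top of the stack above. The route paths, helper functions (`save_to_storage`, `notify_owner`), and connection settings are illustrative assumptions, not the project's actual code:

```python
import os

import redis
from celery import Celery
from fastapi import FastAPI, HTTPException, UploadFile

app = FastAPI()
cache = redis.Redis()  # temporary photo store; nothing here is permanent
worker = Celery("orders", broker="redis://localhost:6379/0")

PHOTO_TTL = 60 * 60  # unpaid uploads expire after one hour


def save_to_storage(order_id: str, photo: bytes) -> None:
    """Hypothetical permanent store, standing in for S3/disk/etc."""
    os.makedirs("/tmp/orders", exist_ok=True)
    with open(f"/tmp/orders/{order_id}.jpg", "wb") as fh:
        fh.write(photo)


def notify_owner(order_id: str) -> None:
    """Hypothetical notification, standing in for e-mail/webhook."""
    print(f"order {order_id} paid and stored")


@worker.task
def finalize_order(order_id: str) -> None:
    # Background task: runs only after payment has been confirmed.
    photo = cache.get(f"order:{order_id}:photo")
    if photo is not None:
        save_to_storage(order_id, photo)
        cache.delete(f"order:{order_id}:photo")
        notify_owner(order_id)


@app.post("/orders/{order_id}/photo")
async def upload_photo(order_id: str, photo: UploadFile):
    # The photo is only cached; it is never persisted before payment.
    cache.set(f"order:{order_id}:photo", await photo.read(), ex=PHOTO_TTL)
    return {"status": "cached"}


@app.post("/orders/{order_id}/payment-confirmed")
async def payment_confirmed(order_id: str):
    # Payment webhook: hand storage and notification off to Celery.
    if not cache.exists(f"order:{order_id}:photo"):
        raise HTTPException(status_code=404, detail="no cached photo for this order")
    finalize_order.delay(order_id)
    return {"status": "processing"}
```

The Redis expiry (`ex=PHOTO_TTL`) is what keeps unpaid uploads from ever becoming permanent: if no payment confirmation arrives, the cached photo is simply dropped.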
This project orchestrates an ETL (Extract, Transform, Load) pipeline with Airflow, extracting CSV files from a Google Drive folder, transforming the values, and loading them into a PostgreSQL database.
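
As a rough illustration, a TaskFlow-style DAG for this kind of pipeline could look like the following; the sync directory, connection string, table name, and transformations are placeholder assumptions:

```python
from datetime import datetime
from pathlib import Path

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def drive_to_postgres():
    @task
    def extract() -> list[str]:
        # The real DAG pulls CSVs from a Google Drive folder; a local
        # sync directory stands in for it here.
        return [str(p) for p in Path("/data/drive_sync").glob("*.csv")]

    @task
    def transform(paths: list[str]) -> str:
        # Example transformation: normalise column names, drop empty rows.
        df = pd.concat((pd.read_csv(p) for p in paths), ignore_index=True)
        df = df.dropna(how="all")
        df.columns = [c.strip().lower() for c in df.columns]
        out = "/tmp/clean.csv"
        df.to_csv(out, index=False)
        return out

    @task
    def load(path: str) -> None:
        from sqlalchemy import create_engine

        # Placeholder connection string and table name.
        engine = create_engine("postgresql+psycopg2://user:pass@localhost/etl")
        pd.read_csv(path).to_sql("records", engine, if_exists="append", index=False)

    load(transform(extract()))


drive_to_postgres()
```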
A dashboard built on data scraped from Imovirtual, a Portuguese real estate website offering homes, apartments, and other properties for sale and rent. Using MongoDB as the database, it crawls the raw data, cleans it, and makes it ready for use in the dashboard.
Both the dashboard and the crawling scripts are implemented in Python. The dashboard uses the Dash and Dash Bootstrap Components frameworks; the scraper uses Requests, asynchronous requests with HTTPX, and BeautifulSoup.
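
A minimal sketch of the asynchronous crawl step might look like this; the entry URL, CSS selector, and database/collection names are assumptions rather than Imovirtual's real structure:

```python
import asyncio

import httpx
from bs4 import BeautifulSoup
from pymongo import MongoClient

# Placeholder database/collection names and entry URL.
raw_listings = MongoClient()["imovirtual"]["raw_listings"]
START_URLS = ["https://www.imovirtual.com/"]


async def fetch(client: httpx.AsyncClient, url: str) -> str:
    resp = await client.get(url, follow_redirects=True)
    resp.raise_for_status()
    return resp.text


async def crawl(urls: list[str]) -> None:
    # Fetch listing pages concurrently, then parse and store them.
    async with httpx.AsyncClient(timeout=30) as client:
        pages = await asyncio.gather(*(fetch(client, u) for u in urls))
    for url, html in zip(urls, pages):
        soup = BeautifulSoup(html, "html.parser")
        # Hypothetical markup: one <article> element per property card.
        for card in soup.select("article"):
            raw_listings.insert_one({"source": url, "html": str(card)})


if __name__ == "__main__":
    asyncio.run(crawl(START_URLS))
```

Storing the raw HTML first and cleaning it in a separate pass keeps the crawler simple and lets the cleaning logic be re-run without re-scraping.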
A simple task manager app built with Django. It allows users to create, update, and delete tasks.
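
For illustration, the core of such an app can be a single model; the field names below are assumptions:

```python
from django.db import models


class Task(models.Model):
    """A single to-do item; field names here are illustrative."""

    title = models.CharField(max_length=200)
    description = models.TextField(blank=True)
    completed = models.BooleanField(default=False)
    created_at = models.DateTimeField(auto_now_add=True)

    def __str__(self) -> str:
        return self.title
```

Create, update, and delete operations then map naturally onto Django's generic CreateView, UpdateView, and DeleteView.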