Available for new projects

Data pipelines &
AI automation
that actually ship.

I help research teams and small businesses automate their data workflows and reporting using Python and AI. For the past 6 years I've managed the full data lifecycle of an EU-funded marine conservation project in the Azores — from field surveys through to reporting for the European Commission. I now apply that end-to-end experience to build practical tools that save 5–20 hours of manual work per week.

What I build

Two core services, both focused on eliminating manual work and making data useful.

⚙️

Data Pipeline & Reporting Automation

End-to-end data workflows: collection, cleaning, validation, and automated reporting. Built for teams that spend too much time wrangling spreadsheets.

  • ETL pipeline design & implementation
  • Automated PDF/Excel report generation
  • Data quality audits & validation
  • GIS & spatial data processing
🤖

AI Agents & Document Automation

Custom AI-powered workflows that handle repetitive document and data tasks — invoices, reports, email triage — so your team doesn't have to.

  • Invoice & document extraction agents
  • Report generation from raw data
  • API integrations (Google Workspace, etc.)
  • Self-hosted or cloud deployment

Scientist who builds.
Builder who ships.

I'm a marine biologist based in the Azores, Portugal. For the past 6 years I've coordinated an EU-funded conservation project — managing field teams, overseeing data collection across multiple monitoring stations, and delivering data products and reports for the European Commission.

That work taught me something most developers never learn: real-world data is messy, deadlines don't move, and clients need outcomes they can act on — not elegant code no one can maintain.

I now take the automation skills I built for that project and apply them to research teams and SMEs who have the same problem: too much manual work, not enough time.

Python · Pandas · FastAPI · Docker · AI APIs · GIS / Folium · n8n · Google Workspace
6+ years managing EU-funded conservation data
1,200+ field observation records processed & validated
5–20h of manual work saved per week on average

Recent work

Real projects, real outcomes.

Marine Conservation Data Pipeline

Data Pipeline

An EU-funded marine megafauna monitoring project had data scattered across three land-based observation stations with inconsistent formats, missing records, and no standardised reporting process. Each quarterly report to the European Commission required days of manual consolidation.

I designed and built a Python ETL pipeline that ingested raw station logs, validated and cross-checked records, merged them into a master observation table, and generated eight standardised visualisation figures and a GIS map automatically.

Outcome: Quarterly report preparation reduced from 3 days to under 2 hours. 1,296+ validated records across 29 standardised fields. Zero manual merging.

Invoice Processing Automation Agent

AI Agent

A small agricultural and forestry operation was manually processing supplier invoices: downloading PDFs, extracting line items, entering data into spreadsheets, and cross-referencing with Google Sheets records. The process took 2–3 hours per week and was prone to transcription errors.

I built a FastAPI-based AI agent that monitors a Google Drive folder, extracts structured data from invoice PDFs using an AI model, validates the output, and writes records directly to Google Sheets — with a Telegram notification on completion.
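The validation step is the part that prevents transcription errors from simply moving from human hands into the AI's output. A stripped-down, stdlib-only sketch of that step (the production agent ran under FastAPI; the field names and sample values here are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class InvoiceRecord:
    supplier: str
    invoice_number: str
    total_eur: float

def validate(raw: dict) -> InvoiceRecord:
    """Coerce and sanity-check AI-extracted fields before writing to Sheets."""
    total = float(raw["total_eur"])            # raises if the model returned junk
    if total <= 0:
        raise ValueError("invoice total must be positive")
    if not raw.get("invoice_number"):
        raise ValueError("missing invoice number")
    return InvoiceRecord(
        supplier=raw["supplier"].strip(),
        invoice_number=raw["invoice_number"],
        total_eur=total,
    )

# Typical raw output from the extraction model (illustrative)
record = validate({
    "supplier": " AgroFarm Lda ",
    "invoice_number": "2024-117",
    "total_eur": "149.90",
})
print(record.total_eur)  # 149.9
```

Records that fail validation are rejected before anything touches the spreadsheet, so a bad extraction surfaces as a notification rather than a silent data error.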

Outcome: Invoice processing time reduced from 2–3 hours/week to under 5 minutes. Deployed as a self-hosted Docker container with zero ongoing maintenance.

Let's talk about your project

Tell me about the manual work you want to eliminate. I'll tell you if I can help and roughly what it would cost.

Typical first response within 24 hours. Azores timezone (UTC−1, UTC+0 in summer).

Send me an email