About
I am a Machine Learning Engineer with experience in designing systems that collect, clean, and process data to feed machine learning pipelines, and in deploying and monitoring the resulting models. I help my data team adopt clean-code practices, design fine-grained APIs, and monitor microservices.
I employ tools such as Spark, Celery, and Airflow to clean, prepare, store, and process data, and MLflow to manage ML models. To visualize data and metrics, I design straightforward dashboards with Streamlit, Dash, Kibana, and Prometheus. For APIs and ORM-backed services, I love the killer combo of FastAPI and Postgres. To distribute messages to multiple consumers, I regularly work with Kafka. And to package my work, I naturally use Docker.