👋 Great to have you here! I'm Eric, a Full Stack Analyst and Author of the book Getting Started with Taipy.
As a full-stack analyst, I take care of every stage of the data cycle with both technical rigor and a focus on stakeholders and business needs. I design and deploy scalable pipelines through batch processing or real-time data ingestion. I also build intuitive, interactive data applications using frameworks such as Taipy.
Beyond engineering and visualization, I emphasize best practices in version control and CI/CD to ensure that workflows are collaborative, maintainable, and production-ready. I also place strong focus on documentation and technical writing, making complex processes and methodologies accessible to both technical and non-technical stakeholders.
🍷 I spent 8 years working in the wine industry, where my journey into tech began. While learning QGIS to manage vineyards, I discovered Python and quickly started developing practical solutions to automate my work. That experience sparked my passion for programming, and to this day I remain deeply interested in GIS.
🎓 I hold an MSc in Viticulture and Wine-Making and a BSc in Biology. While working in the wine industry, I completed the first year of a computer science degree, followed by numerous online courses on programming, data analysis, data science, and data warehousing through platforms like Coursera, edX, and France Université Numérique. You can find more details on my LinkedIn page.
💼 With over 3 years of experience in the data warehousing division of a large insurance corporation, I have also worked with small businesses and agricultural non-profits, where I focused on building lean but highly practical solutions. Alongside this, I’ve cultivated an independent practice as a technical writer and content creator. This mix of large-scale corporate exposure, hands-on problem solving in resource-constrained environments, and communication-focused work has made me highly versatile, and ultimately shaped me into a full-stack analyst who can adapt to different contexts and deliver end-to-end solutions.
🖊️ You can check some of my writing on Medium. I write about data analysis, Python, Data Ops, career shifting, and upcoming technologies like AI and LLMs.
🎥 You can also check out my YouTube channel. I hope to create more content there, so feel free to take a look (and like, and subscribe!)
💻 I'm a contributor to Taipy, an open-source Python application builder. I wrote a book about it and have contributed to 20+ issues.
⛰️ I'm a remote worker, and I live in rural France. I've always wanted to live in rural areas; that's why I studied wine-making in the first place. I believe remote work is one part of the solution to bringing life back to rural Europe. Obviously, the problem is complex and will require lots of human action; remote workers can't be THE solution, just another brick in the wall. If this is something that resonates with you, feel free to connect with me!
📸 Besides technology, I love photography and exploring new places, especially rural areas (though I also enjoy exploring cities, as long as I don't spend too much time in them!). Those are my two main hobbies, and I combine them by taking pictures of the places I visit.
This is a list of some of my technical skills. As a tech worker, I'm constantly learning and adapting.
- Main Language: Python for data analysis, visualization, machine learning, and building AI tools
- SQL: Proficient in writing complex analytical queries, including window functions
- R: Familiarity with the R programming language for statistical analysis and data visualization
- Web Development: Familiarity with the "classic" web stack (PHP + JS + HTML + CSS + SQL)
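To give a flavor of the analytical SQL mentioned above, here's a minimal window-function example, run through Python's built-in `sqlite3` module (the table and data are made up for the demo):

```python
import sqlite3

# In-memory database with a small, made-up sales table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, month INTEGER, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", 1, 100.0), ("North", 2, 150.0),
     ("South", 1, 80.0), ("South", 2, 120.0)],
)

# Window function: running total of sales per region, ordered by month.
rows = con.execute(
    """
    SELECT region, month, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
    ORDER BY region, month
    """
).fetchall()

for row in rows:
    print(row)
# → ('North', 1, 100.0, 100.0)
#   ('North', 2, 150.0, 250.0)
#   ('South', 1, 80.0, 80.0)
#   ('South', 2, 120.0, 200.0)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern carries over to PostgreSQL, Netezza, and DuckDB.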
I enjoy learning new programming languages and tools, and the more I learn the more I want to keep learning. Right now, I’m especially focused on sharpening my coding practices and deepening my understanding of the stack I use every day.
The following non-exhaustive list shows some libraries I've used for some projects:
| Tasks | Libraries |
|---|---|
| Data Manipulation & Transformation | |
| Data Visualization | |
| Web Scraping | |
| Geospatial Analysis | |
| ML, Deep Learning, LLM apps | |
| Application builders | |
| Others | |
I have a strong background working with several types of databases, including:
- Enterprise data warehouse databases such as IBM Netezza and PostgreSQL. I'm proficient in data modelling and working with OLAP systems.
- Analytical Workflows with DuckDB. I use DuckDB for fast, in-memory analysis of tabular datasets, often leveraging Parquet files and columnar storage formats to achieve both speed and efficiency in data exploration.
- Familiar with MySQL and PostgreSQL in OLTP contexts, though my primary focus remains on OLAP systems and analytical use cases.
As a data analyst, I have experience using a variety of tools for data analysis and visualization. Some of my key skills include:
- Custom Applications: I create custom applications using Taipy or other Python-based tools.
- Data Visualization Tools: I have used Power BI and Tableau, and I'm also familiar with TIBCO Spotfire.
- Spatial Analysis: I love using QGIS for spatial analysis. In fact, my interest in spatial analysis is what led me to programming in the first place!
- Excel: I am proficient in using Excel for data analysis, and I have experience developing VBA macros. I am also skilled in using Power Query to import and transform data from various sources, and I can use DAX to create calculated columns and measures.
I have experience with cloud, DevOps, and infrastructure tools that enable me to work efficiently and collaboratively. Here are some of my key skills in this area:
- Cloud Services: I have experience using cloud services such as AWS, IBM Cloud and Google Cloud Platform. I am familiar with cloud computing concepts, such as virtual machines and containers.
- Linux and Bash: I use Linux and Bash for command-line operations and to manage infrastructure and servers. I also use desktop Linux, because I like it better than Windows.
- Docker: I use Docker to containerize applications, making it easier to deploy and manage them in different environments.
- Version Control: I use Git for version control and collaboration on code projects. I am familiar with Git workflows.
- UV Environments: I'm a former Conda user (well, I still use it at work). Then I discovered uv, and it's just great.
I enjoy experimenting with large language models (LLMs) and building LLM-based applications. Here are some of the technologies I've used in my LLM projects:
- Access closed models like OpenAI's GPT-3.5 and GPT-4, or Gemini, via API (using LangChain).
- Access open models from Hugging Face with Transformers and LangChain. I've also tested models running locally.
- I've used Hugging Face's transformers library for fine-tuning LLMs on specific tasks.
- I've built RAG pipelines using vector databases such as Qdrant, Chroma, or using pgvector.
If you want to contact me, here are all my social accounts:
If you want to support my work, you can follow me on Medium or on YouTube:
You can also visit my Ko-fi: