I guess last Sunday was the last ice swim of this season. During the week before, temperatures were really cold, but on the weekend they were already above freezing and the ice was brittle. We reopened the path we had cut the weekend before to take a dip.


In the afternoon it turned above 14ºC and we could see that one of our two beehives seems to have survived the winter. The bees were coming out and cleaning their hive.

Local LLM setup
Made progress with my local LLM setup. Finally got the RAG properly working with my documents. I’m using Open WebUI as the interface for the Ollama installation on my machine. For the RAG setup I use the Ollama model bge-m3:latest to generate the embeddings of the documents. I also tweaked the Top K value to 10.
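Under the hood, Top K retrieval just means ranking the stored document embeddings by similarity to the query embedding and keeping the K best. As a minimal sketch (the tiny 3-dimensional vectors here are made-up stand-ins for real bge-m3 output, which has 1024 dimensions):

```python
import math

def cosine(a, b):
    # cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, doc_vecs, k=10):
    # rank documents by similarity to the query, keep the k best
    ranked = sorted(doc_vecs.items(),
                    key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# toy "embeddings" standing in for vectors from bge-m3 (hypothetical)
docs = {
    "notes.md":  [0.9, 0.1, 0.0],
    "recipe.md": [0.0, 0.8, 0.2],
    "todo.md":   [0.7, 0.3, 0.1],
}

print(top_k([1.0, 0.0, 0.0], docs, k=2))  # → ['notes.md', 'todo.md']
```

Open WebUI does this ranking for you against its vector store; the sketch is only to show what the Top K knob controls.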
The biggest difference seems to come from the system prompt and the query prompt. I’ve created custom models by combining an existing model with my knowledge objects (the documents I’ve uploaded, with embeddings generated as described above) and a custom system prompt.
The system prompt explains to the model what it is and, especially, the structure of the knowledge objects. This way the model has a better “idea” of what data it’s dealing with and can respond more accurately to queries later on.
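As a rough sketch, a system prompt of that shape could look something like this (the wording and document structure below are made up for illustration, not my actual prompt):

```text
You are an assistant that answers questions from the attached knowledge
objects. Each knowledge object is a personal note with this structure:
a title line, a date, and free-form Markdown text. When you answer,
name the note(s) you used, and say so explicitly when the notes do not
contain the answer instead of guessing.
```

Spelling out the document structure like this is what gives the model the “idea” of the data it’s working with.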
Dinner
Wednesday I was at a friend’s place for dinner. It was meant to be a bigger round at first, but eventually most of the others had to bail out for being sick. So it was just my friend, her husband, and me. Still a great evening with delicious food and lots of conversation. Learned a lot about onions 😄
I’m walking
Trying to walk to the office more often now. It takes me roughly one hour to get there and is nice exercise. According to my fitness tracking, that burns almost as many calories as going to the gym. So I figured I can just do that instead, have a nice brisk walk twice a day, and also take pictures around the city.



On the weekend I took a longer bike ride through the countryside to the forest with our little pond. Also started a rewrite of my Pixelfed Bulk Uploader in Rust. I figured that trying to write the whole thing with tools like ChatGPT or Claude doesn’t give me the clean code I like: I spent too much time explaining to the LLM what I wanted to achieve, and an equal amount debugging its output.