Part 120: Keep track of Ollama's performance with Zabbix
First, apologies for not posting in a while; I have been terribly busy at work and at home now that we have a puppy.
When playing around with different LLMs in Ollama, you might want to see how fast a given model runs. For those who don't know, Ollama is cross-platform software for running all kinds of LLMs locally. Ollama won't hand you any performance statistics in server mode (or does it? Let me know!), so to get the statistics into Zabbix I had to get creative.
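To give you an idea of the general shape of this, here is a minimal sketch of one possible route. It assumes Ollama's HTTP API is listening on its default port 11434, that the final (non-streaming) response of the generate endpoint includes the timing fields `eval_count` and `eval_duration`, and that you have created a Zabbix trapper item with the hypothetical key `ollama.tokens_per_sec` on the monitored host; the model name, server name, and host name below are placeholders for your own setup.

```python
#!/usr/bin/env python3
# A minimal sketch: ask Ollama for a short completion, compute tokens/sec
# from the timing fields in the API response, and push the value to Zabbix
# with zabbix_sender. The item key "ollama.tokens_per_sec" is hypothetical
# and must exist as a trapper item on the monitored host.
import json
import subprocess
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama port
ZABBIX_SERVER = "zabbix.example.com"                # assumption: your Zabbix server
MONITORED_HOST = "ollama-box"                       # assumption: host name in Zabbix

payload = json.dumps({
    "model": "llama3",                  # assumption: any model you have pulled
    "prompt": "Why is the sky blue?",
    "stream": False,                    # one final JSON object with timings
}).encode()

req = urllib.request.Request(OLLAMA_URL, data=payload,
                             headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    stats = json.load(resp)

# eval_count = generated tokens, eval_duration = generation time in nanoseconds
tokens_per_sec = stats["eval_count"] / (stats["eval_duration"] / 1e9)

# Hand the value over to the Zabbix trapper item
subprocess.run([
    "zabbix_sender",
    "-z", ZABBIX_SERVER,
    "-s", MONITORED_HOST,
    "-k", "ollama.tokens_per_sec",
    "-o", f"{tokens_per_sec:.2f}",
], check=True)
```

Run it from cron or a Zabbix user parameter and you get a tokens-per-second time series you can graph and alert on; the rest of this post goes through the details.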