

Ollama as a general LLM server, with LLaVA as the model.
Matrix.
Finally it’ll be possible to ACTUALLY have your own server then too.
The AI support doesn’t hurt you if you don’t use it - and they’ve done the right thing by making sure you can do things locally instead of cloud.
Here’s what AI does for me (self-hosted, my own scripts) on NC 9:
When our phones sync photos to Nextcloud, a local LLM generates an image description for every photo, plus five tags for each.
It is absolutely awesome.
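A setup like that can be sketched roughly as below. This is my own illustration, not the commenter's actual scripts: it assumes Ollama running locally on its default REST endpoint with the `llava` model pulled, and the prompt wording and "Tags:" output convention are assumptions for the sake of the example.

```python
# Sketch: ask a local LLaVA model (via Ollama) for a one-sentence
# description and five tags for a photo. Assumes `ollama serve` is
# running and `ollama pull llava` has been done.
import base64
import json
import re
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

PROMPT = (
    "Describe this photo in one sentence, then on a final line starting "
    "with 'Tags:' list exactly five short comma-separated tags."
)

def extract_tags(text: str, count: int = 5) -> list[str]:
    """Pull up to `count` comma-separated tags from a 'Tags:' line."""
    match = re.search(r"Tags:\s*(.+)", text, re.IGNORECASE)
    if not match:
        return []
    tags = [t.strip().lower() for t in match.group(1).split(",") if t.strip()]
    return tags[:count]

def describe_image(path: str) -> tuple[str, list[str]]:
    """Send one image to the local model; return (description, tags)."""
    with open(path, "rb") as f:
        img_b64 = base64.b64encode(f.read()).decode("ascii")
    payload = json.dumps({
        "model": "llava",
        "prompt": PROMPT,
        "images": [img_b64],  # Ollama accepts base64-encoded images
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())["response"]
    return reply, extract_tags(reply)
```

A cron job or Nextcloud file-hook could then call `describe_image()` on each newly synced photo and write the description and tags back as metadata.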
You misspelled “deliver for Putin”. A lot of countries have laws against foreign influence attacks on their elections.