Unsure if this program is actually connecting to the internet or not....

5 replies [Last post]
GNUser
Offline
Joined: 07/17/2013

Hey,

So, this is going to be a dumb request for help, but please don't judge, haha.

So, I installed ollama (from https://ollama.com/) to run a "ChatGPT"-like assistant locally on my machine, instead of relying on SaaSS.
However, I noticed that it was replying to my queries so fast that I thought "no way this is being done locally, my machine is not powerful enough".

So, I checked with lsof -i, and the only established connection was:

TCP localhost:53034->localhost:11434 (ESTABLISHED)

So it seems to connect only locally, not to an external source for its replies.
Sure enough, I used the physical button on my laptop to disable networking, and it kept working.
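For anyone who wants to repeat this check, here is a sketch (it assumes ollama's default bind address of 127.0.0.1:11434; the filter pattern is my own):

```shell
# List ollama's sockets with numeric addresses and ports (-n -P),
# so loopback is printed as 127.0.0.1 rather than "localhost":
lsof -i -P -n | grep ollama

# Show only non-loopback entries; "loopback only" here means every
# connection stays on the local machine:
lsof -i -P -n | grep ollama | grep -vE '127\.0\.0\.1|\[::1\]' \
  || echo "loopback only"
```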

However, I did one last test. I ran "firejail --noprofile --net=none" and tried running ollama inside it. NOW it complained: "Error: could not connect to ollama app, is it running?"

I could be wrong, but firejail's --net=none option shouldn't disable localhost connections...

So, what is anyone else's reading on this?

Thanks for your patience! :)

GNUser
Offline
Joined: 07/17/2013

Well, upon further inspection I am convinced that no connection to the outside is happening.
On the other hand, the models that I can run (without memory issues) are either slow or dumb as a rock! So yeah.... -.-''
If anyone has any experience with this and wants to share feel free to do so.

Thanks anyway.

opal
Offline
Joined: 09/29/2024

I think you were right to test with --net=none. According to the man page, loopback interfaces should be available. I am guessing ollama isn't able to operate in a namespace where only the loopback interface is available. Maybe it expects the full network stack even if it doesn't use it. Who knows. That might make for a good bug report for the ollama project, though.
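One way to check the man page's claim directly is to list the interfaces inside the sandbox; as I understand it, only lo should show up (a sketch, assuming firejail and iproute2 are installed):

```shell
# Show network interfaces inside a --net=none sandbox.
# Only the loopback device (lo) should be listed:
firejail --noprofile --net=none ip addr show

# Compare with the host, where eth0/wlan0/etc. appear as well:
ip addr show
```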

GNUser
Offline
Joined: 07/17/2013

Actually, I saw some similar reports of people trying to use firejail this way without success. Firejail also has a net=lo(cal) option (not sure if that's the full word or not), and it doesn't seem to work either.
It's more like firejail creates its own network, and that causes issues with some programs. I don't think that's a problem with ollama itself; if people (like me) wish to use firejail with it, it's on us to make it work. The ollama developers never promised or advised that using it this way would work.

Still, ollama doesn't require internet (except for pulling the models).
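So the workflow I would expect (the model name here is just an example) is: pull once while online, then run offline from then on:

```shell
# Needs the network once, to download the model weights:
ollama pull qwen2:0.5b

# After that, inference only talks to the local server on 127.0.0.1:11434,
# so it keeps working with networking disabled:
ollama run qwen2:0.5b
```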

Jacob K
Offline
Joined: 01/13/2022

Something I learned: if you run `./ollama-linux-amd64 serve &` and then run `./ollama-linux-amd64 run qwen2:0.5b` in the same firejail, it will work (assuming you already have qwen2:0.5b installed). If you open another firejail with --net=none and run `./ollama-linux-amd64 run qwen2:0.5b` from there, it will not work. I think that's because each firejail with --net=none gets its own separate network, so only programs within the same sandbox can talk to each other.
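In other words, a sketch of what works and what doesn't (assuming qwen2:0.5b is already pulled; the firejail options are the same ones used above):

```shell
# Works: server and client share one sandbox, hence one loopback interface.
firejail --noprofile --net=none bash -c \
  './ollama-linux-amd64 serve & sleep 2; ./ollama-linux-amd64 run qwen2:0.5b'

# Fails: a second --net=none sandbox gets its own private loopback,
# so it cannot reach the server running in the first sandbox.
firejail --noprofile --net=none ./ollama-linux-amd64 run qwen2:0.5b
```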

GNUser
Offline
Joined: 07/17/2013

Yes, I think that's the issue here.
But in any case I have pretty much given up; the best models that run on my machine are not only slow but also not good enough for what I needed.
It seems GPUs do a better job than CPUs. I wonder if in the future there will be FLOSS alternatives that work well on Trisquel (given better machines, of course).