Wyre@lemmy.world to Asklemmy@lemmy.ml • 8 months ago
Re: "I'm increasingly unhappy with the limits on AI text generation and I have heard that it's not that hard to do it on a laptop oneself. What is the best path forward?"
I’ve been playing a bit with llama2 in Ollama, and it doesn’t have any restrictions. Perhaps using Ollama to run models locally would solve some of those problems for you?
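For anyone who wants to try it, getting started is roughly this (a sketch assuming Linux/macOS; check ollama.com for your platform):

```shell
# Install Ollama using the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Download the llama2 model weights (several GB on first run)
ollama pull llama2

# Start an interactive chat session with the model in your terminal
ollama run llama2
```

After that you just type prompts at the terminal, no API keys or accounts needed.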