mystic-macaroni@lemmy.ml to Privacy@lemmy.ml · 11 months ago
Can you trust locally run LLMs?
I’ve been playing around with ollama. Given that you download the model yourself, can you trust it isn’t sending telemetry?
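One way to check rather than taking it on trust is to watch which network connections the ollama process actually opens while you run a prompt. Below is a minimal sketch using Python's psutil; the process name "ollama" is an assumption (adjust it if your daemon is named differently), and on some systems inspecting other processes' sockets requires elevated privileges. Anything beyond loopback addresses would be worth a closer look, and the stronger guarantee is simply running the daemon with no outbound network access at all (e.g. in an isolated container or behind a deny-all firewall rule).

```python
import psutil

# Find PIDs of any running process whose name contains "ollama".
# (Assumes the daemon's process name is "ollama"; adjust if yours differs.)
ollama_pids = {
    p.pid for p in psutil.process_iter(["name"])
    if p.info["name"] and "ollama" in p.info["name"].lower()
}

# Enumerate every inet socket on the system and keep the ones owned by those PIDs.
for conn in psutil.net_connections(kind="inet"):
    if conn.pid in ollama_pids and conn.raddr:
        # raddr is the remote endpoint; a purely local daemon should show
        # nothing here except loopback (127.0.0.1 / ::1) traffic.
        print(f"pid={conn.pid} -> {conn.raddr.ip}:{conn.raddr.port} ({conn.status})")
```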
fruitycoder@sh.itjust.works · 11 months ago
I think you can point to a file instead too