Ollama recently became a Flatpak extension for Alpaca, but it's a one-click install from the Alpaca software management entry. All storage locations stay the same, so there's no need to re-DL any open models or remake tweaked models from the previous setup.
It’s me, Smee
Or if using Flatpak, it's an add-on for Alpaca. One-click install, GUI management.
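If you'd rather use the terminal than the software-centre toggle, something along these lines should do it (Alpaca's Flathub ID is com.jeffser.Alpaca; the exact extension ID below is from memory, so double-check it first):

$ flatpak install flathub com.jeffser.Alpaca
$ flatpak search alpaca            # lists Alpaca plus its Ollama add-on/extension
$ flatpak install flathub com.jeffser.Alpaca.Plugins.Ollama   # ID may differ, see search output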
Windows users? By the time you understand how to locally install AI, you're probably knowledgeable enough to migrate to Linux. What the heck is the point of using local AI for privacy while running Windows?
I've been on Lineage for ages and recently tried out /e/, and was pleasantly surprised. It reminded me of a reskinned Lineage with some FOSS/F-Droid apps integrated into the system and some extra privacy stuff.
I particularly like the fake location and app tracker features.
When it comes to standardisation, there's a minimal-defaults system called GSI (Generic System Image) where the same build works across a lot of devices. But minimal defaults leave a lot of device-specific features dead in the water. It's more for development than distribution.
How much privacy respect do you feel you need? Self-hosting that stuff is the holy grail.
Fossify Calendar is a standard Android calendar app that pulls from the built-in Android calendar storage. You can use any standard calendar app, like Fossify or Etar to mention two open-source ones, much in the same way that you sync your Nextcloud contacts to the built-in Android contact storage. Any standard contacts app can interact with your Nextcloud-stored'n'synced contacts.
Running Nextcloud for tasks, calendar and contacts on a self-hosted server is one of the more private solutions.
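For anyone wondering what the self-hosted part can look like, a minimal sketch using the official Nextcloud Docker image (the port, volume name and missing HTTPS/reverse-proxy bits are placeholders, not a production setup):

$ docker run -d --name nextcloud -p 8080:80 -v nextcloud_data:/var/www/html nextcloud
# then point your CalDAV/CardDAV sync client at http://<server>:8080/remote.php/dav/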
So you’re disappoint it wasn’t a rickroll of the original image?
Finally my moment to shine with incredibly niche knowledge!
Joplin, while it has the ability to encrypt the sync target (even if it's a local folder synced with Syncthing), does keep the content decrypted in the app data folder. The notes sit in an unencrypted database while all attachments just hang out in the attachments folder.
This leaves the content vulnerable if the computer is compromised. But then again, apps that keep stuff encrypted at rest still have to decrypt it to memory - leaving the content vulnerable if the computer is compromised. 🤷♂️
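You can see it for yourself on a Linux desktop install, assuming the default profile location (it's configurable, so yours may differ):

$ ls ~/.config/joplin-desktop/
# database.sqlite -> note text, readable in plain text
# resources/      -> attachments, stored as-is
$ sqlite3 ~/.config/joplin-desktop/database.sqlite 'SELECT title FROM notes LIMIT 5;'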
All in all, Joplin is definitely one of the better, more secure note-taking apps.
I always wondered why there weren’t any nuke batteries available. We have had the technology for decades.
.deb
$ /opt/camelchat/camelchat
/opt/camelchat/camelchat: symbol lookup error: /opt/camelchat/camelchat: undefined symbol: g_once_init_enter_pointer
Appimage
Set as executable.
$ ./Camel-Chat-0.2.0-x86_64.AppImage
/tmp/.mount_Camel-7OCAAq/camelchat: symbol lookup error: /tmp/.mount_Camel-7OCAAq/camelchat: undefined symbol: g_once_init_enter_pointer
From a similar issue for a different app, it seems to be a GLib issue: that symbol requires GLib 2.80+, while Debian 12 ships 2.74.6-2+deb12u5.
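If anyone wants to check their own box, the installed GLib version is quick to confirm (package name as on Debian 12; the pkg-config route needs libglib2.0-dev installed):

$ dpkg -s libglib2.0-0 | grep '^Version'
Version: 2.74.6-2+deb12u5
$ pkg-config --modversion glib-2.0
2.74.6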
Android
Works perfectly!
I did run the AppImage in a terminal and got an error about something missing, and the AppImage comes with everything bundled, right? Might be an issue on my end, I suppose.
I’ll redo it and paste the error message tomorrow.
Looks interesting, but I can't get it to run on Debian 12, neither the .deb nor the AppImage.
I think it’s a rather human thing to do, explaining oneself when refusing to lend a hand. I also think warning friends, family and those I care for about what I consider dangerous to be quite nice.
I suppose we just have to agree to disagree about how we interact and care for our fellow humans.
Sure, many jobs require people to use unethical but legal tools and methods. I used to work somewhere like that - and whatever OS they had installed was irrelevant because it was their equipment, their systems, and I didn't use work stuff for personal things.
But really, are you describing someone dependent on a job or someone dependent on certain Windows-exclusive applications? Are they forced to use their personal equipment for work stuff?
I'd ask them to consider their dependence on Windows-only apps. If they want help with installing Linux I'm happy to help, but I'm not an enabler there to perpetuate their negative dependence.
I refuse to help people with installing any version of Windows for ethical reasons, just like some tattooists refuse to tattoo people's faces or give them similarly stupid tattoos.
Yes, I’m arrogant enough to think I know better than those who haven’t considered the issue at all.
It's possible to run local AI on a Raspberry Pi; it's all just a matter of speed and complexity. I run Ollama just fine on the two P-cores of my older i3 laptop. Granted, running it on the CUDA accelerator (graphics card) in my main rig is beyond faster.
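On that kind of low-end hardware it mostly comes down to picking a small enough model; a couple of small ones from the Ollama library that should be tolerable on a few CPU cores (sizes roughly from memory):

$ ollama run llama3.2:1b     # ~1B parameters, fine on a handful of CPU cores
$ ollama run qwen2.5:0.5b    # even smaller, if you're patient enough for a Pi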