![](https://beehaw.org/pictrs/image/c0e83ceb-b7e5-41b4-9b76-bfd152dd8d00.png)
fair enough, I had not tested any other distros, my bad.
https://forums.linuxmint.com/viewtopic.php?t=343833
You can search DuckDuckGo for "Nvidia MOK secure boot Mint" and you'll see what I'm talking about.
https://unix.stackexchange.com/questions/535434/what-exactly-is-mok-in-linux-for#535440
“works for me”
That’s not a protip. A protip would be how you do that :D
RTX 3060; I suspect this is the case for newer laptops, yes.
Me installing Linux Mint on a 2022 laptop with an Nvidia GPU (it had Windows 11 preinstalled; this was a dual-boot install alongside it). I disabled Secure Boot at first, but still had to go all the way back, set up my MOK keys, and re-enable Secure Boot properly, with another password, to unlock the GPU.
You can't disable Secure Boot if you want to use your Nvidia GPU, though :( [edit 2: turns out this is a Linux Mint thing, not the case on Debian or Fedora]
Edit: fine, there may be workarounds, and on other distros everything is awesome, but on Mint (and possibly Ubuntu and Debian) with a 2022 RTX 3060 laptop you need to set up your MOK keys with Secure Boot enabled to be able to install the Nvidia drivers; outside Secure Boot the GPU is simply locked. I wasn't even complaining: there is a way to get it working, so that's fine by me. No need to tell me I was imagining things.
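For anyone who lands here wanting the actual steps: the enrollment flow I went through looks roughly like this. This is a sketch, not exact output; the key path is the shim-signed default on Mint/Ubuntu, and yours may differ.

```shell
# Enroll a Machine Owner Key (MOK) so shim will trust the signed Nvidia modules.
# Run with Secure Boot enabled; needs the shim-signed and mokutil packages.
sudo mokutil --sb-state                                 # confirm Secure Boot is on
sudo mokutil --import /var/lib/shim-signed/mok/MOK.der  # default key path on Mint/Ubuntu
# mokutil asks you to set a one-time password. On reboot, the blue MOK Manager
# screen appears before the OS boots; choose "Enroll MOK" and enter that password.
sudo reboot
```

After the reboot the Nvidia driver packages should load their signed kernel modules normally.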
Lemmy try: !internetisbeautiful@lemmy.ml
PS: works 😁
I visit some subs occasionally on old reddit, but have never logged in again. Fuck u/spez.
Considering that training extracts the main features of a dataset, some data is always discarded as "noise" in the process; then, when data is generated, that discarded information is filled back in with actual random noise to partially replicate the original data.
Iterate and you end up with progressively less meaningful features. I just didn't expect it to take only 5 iterations; that's a lot of feature loss in training, even with so many parameters.
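A toy sketch of that iterate-and-lose idea (my own illustration, not the paper's setup): repeatedly fit a single-Gaussian "model" to data, then train the next generation only on samples from the fit. The function name and sizes are made up.

```python
import numpy as np

def generational_std(n_samples=500, n_gens=5, seed=0):
    """Fit a Gaussian to data, resample from the fit, repeat.

    Returns the sample std of each generation. The ML variance
    estimate is biased low, so the spread tends to drift downward:
    whatever the model treated as noise is never recovered.
    """
    rng = np.random.default_rng(seed)
    data = rng.normal(loc=0.0, scale=1.0, size=n_samples)
    stds = [float(data.std())]  # ddof=0: the "trained model's" scale
    for _ in range(n_gens):
        # the next generation sees only what the previous model can generate
        data = rng.normal(data.mean(), data.std(), size=n_samples)
        stds.append(float(data.std()))
    return stds

print(generational_std())
```

A real multi-billion-parameter model discards far subtler detail per round, but the one-way direction of the loss is the same.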
Meta: this post needs to be a community.
Well, that makes a lot of sense now :) thanks.
It's a paid service: you give it a premium-host link or a torrent link and it generates a direct download link. This is very useful for file hosts like Mega and RapidGator, which enforce limits on users without a premium account, such as:
- Slow download speed (e.g. max 1 MB per second while downloading)
- Maximum number of downloads per hour (e.g. 1 file per 5 hours)
- No resume support
- Unable to download files larger than a certain size (e.g. no more than 5 GB per file for non-premium users)
more on the old site: https://old.reddit.com/r/Piracy/comments/q3vqgv/introduction_to_debrid_services/
Is this topic-specific, or are there bots other than the one on UnderNet? I've never found a book on IRC that wasn't in libgen.
Don’t inspire fear or disgust, that’s the basics.
Thanks, I was just testing the bot, but something failed. Thanks for the tldr; yeah, I guess electrolysis only pays off for aluminium, because there is no other way to get it :/ maybe hydrogen too.
Maybe the best option for plastic that ensures it won't end up in nature, and isn't too expensive, is incineration power plants.
@AutoTLDR@programming.dev
Karma still exists even when you don't see the number, because it is used to sort posts in some way (by score or upvote percentage). In principle, upvotes are important information for the community.
It's not so much a dark pattern as an emergent property of the upvote system: the first commenters usually have an advantage, and good late comments never get enough exposure to float to the top.
Karma farmers would just sit at "new", spam comments, and get visibility for joke and outrage comments.
The solution may be to randomly order comments below a certain threshold and/or within an upvote range.
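A hypothetical sketch of that ordering rule (the function name, field names, and threshold are all made up for illustration): comments at or above a score threshold are ranked by score, and everything below it is shuffled so late arrivals still get an exposure window.

```python
import random

def order_comments(comments, score_threshold=5, seed=None):
    """Rank established comments by score; shuffle the rest randomly."""
    rng = random.Random(seed)
    established = [c for c in comments if c["score"] >= score_threshold]
    fresh = [c for c in comments if c["score"] < score_threshold]
    established.sort(key=lambda c: c["score"], reverse=True)
    rng.shuffle(fresh)  # random exposure for new/low-score comments
    return established + fresh
```

Re-running the sort on each page load would give every low-score comment a fair chance at a top slot instead of rewarding whoever commented first.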
As much as I think the Reddit and Twitter soap operas are nothing but a time sink over what is a relatively simple cautionary tale, I can't look away from this train wreck. "Y, elon?" is the new "fuck u/spez".