• 0 Posts
  • 6 Comments
Joined 1 year ago
Cake day: June 11th, 2023

  • Yes and no. I’d prefer user choice/curating your own list of instances you interact with.

    However, each community also adds further burden on moderation. The communities you allow affect the culture, and some are very clearly more trouble than others.

    My current solution would be to have multiple accounts for different sections of the fediverse. Currently I only have a generic Kbin and a Lemmy account, but if you find a Lemmy instance that’s federated with the broader free-speech spectrum without just veering into insane territory itself, I’d be interested.


  • Kbin user here. It does not federate downvotes from Lemmy. So far, I have a total of two (2) downvotes, and every single interaction, including the one I got downvoted for, was quite positive.

    No toxicity in normal interactions so far. The only (slightly) toxic comment sections were regarding meta topics of users complaining about toxicity elsewhere and/or wanting to defederate more communities. Even those discussions were nearly entirely polite and productive.

    The only somewhat toxic topic I participated in was when a car enthusiast complained about the fuckcars community and got called out throughout the comment section. Piling on like that was probably not the best approach, and they deleted their post some time later.




  • Direct link to the (short) report this article refers to:

    https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf

    https://purl.stanford.edu/vb515nd6874


    After reading it, I’m still unsure exactly what they consider to be CSAM and how much of each category they found. Here is what they count as CSAM categories, as far as I can tell. No idea how much the categories overlap, and therefore no idea how many images beyond the 112 PhotoDNA matches are of actual children.

    1. 112 instances of known CSAM of actual children (identified by PhotoDNA)
    2. 713 instances of assumed CSAM, based on hashtags.
    3. 1,217 text posts talking about stuff related to grooming/trading. Includes no actual CSAM or CSAM trading/selling on Mastodon, but some links to other sites?
    4. Drawn and Computer-Generated images. (No quantity given, possibly not counted? Part of the 713 posts above?)
    5. Self-Generated CSAM. (Example is someone literally selling pics of their dick for Robux.) (No quantity given here either.)

    Personally, I’m not sure what the takeaway is supposed to be from this. It’s impossible to moderate all the user-generated content quickly. This is not a Fediverse-specific issue. The same is true for Mastodon, Twitter, Reddit and all the other big content-generating sites. It’s a hard problem to solve. Known CSAM being deleted within hours is already pretty good, imho.

    Meta-discussion especially is hard to police. Based on the report, it seems that most CSAM by volume is traded through other services (chat rooms).

    For me, there’s a huge difference between actual children being directly exploited and virtual depictions of fictional children. Personally, I consider the latter the same as any other fetish images that would be illegal with actual humans (guro/vore/bestiality/rape, etc.).


  • Unsure, and it depends on what counts as “playing”. My brother got an old computer for cheap a long time ago. There was a floppy disc with a game on it. I don’t remember much more of it than walking around in some weird geometry on the black-and-white monitor. Some sort of chessboard floor, I think.

    Truly gaming would probably have started with one of those small Tetris handhelds. I still have mine. Used to play for hours on car rides. Either that or the Game Boy Pocket.