Signal is bad then?
So in which direction do you want it to go? More private or more moderated?
Did Signal roll their own encryption? I'm unaware of this if so. Even if that's the case, it has been audited heavily, something Telegram has repeatedly either failed to do or moved the goalposts on every time it has been audited. Telegram is not a secure messenger.
I think I was explaining why people could see it as bad, not that I particularly want more global moderation. Having said that, there should probably be a way to remove people, on any platform, who actually do material harm to other individuals, such as by distributing CSAM.
Yes. Even an entirely new algorithm: the Double Ratchet.
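For anyone unfamiliar, the core idea of the Double Ratchet's symmetric half is just a one-way KDF chain: each message key is derived from a chain key, and the chain key is immediately advanced, so a compromised device can't recover earlier message keys. A toy sketch (illustration only, with made-up constants and keys, not Signal's actual implementation):

```python
# Toy sketch of the symmetric-key ratchet idea behind the Double Ratchet.
# Each step derives a fresh message key and advances the chain key one way,
# so old message keys cannot be recomputed later (forward secrecy).
# Illustration only; constants and the starting key are hypothetical.
import hmac
import hashlib

def kdf_chain_step(chain_key: bytes) -> tuple[bytes, bytes]:
    """Advance the chain: return (next_chain_key, message_key)."""
    next_chain_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    message_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return next_chain_key, message_key

chain = b"\x00" * 32  # hypothetical shared secret from a prior key agreement
keys = []
for _ in range(3):
    chain, mk = kdf_chain_step(chain)
    keys.append(mk)

# Every message gets a distinct key, and knowing the current chain key
# tells you nothing about keys that were already used and deleted.
assert len(set(keys)) == 3
```

The real protocol also mixes in fresh Diffie-Hellman outputs (the second "ratchet"), which is what additionally heals the session after a compromise.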
Okay, interesting. I still think my points about it stand though.
Yeah. Why use X3DH when there are algorithms that already exist and that we know are secure?
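To be fair, X3DH isn't a new primitive so much as a way of combining several ordinary Diffie-Hellman exchanges (identity, ephemeral, and signed-prekey pairings) into one shared secret. A toy sketch of that combining step, using a deliberately small modular group and made-up private keys for readability (real X3DH uses X25519 and a proper KDF, and includes an optional one-time prekey, omitted here):

```python
# Toy illustration of the X3DH combining idea: both parties compute the
# same three Diffie-Hellman values and hash them together into one secret.
# The group parameters and private keys below are hypothetical and NOT
# cryptographically sound; this only shows the structure.
import hashlib

P = 2**127 - 1  # toy prime modulus
G = 5           # toy generator

def pub(private: int) -> int:
    return pow(G, private, P)

def dh(private: int, public: int) -> int:
    return pow(public, private, P)

# Alice holds an identity key and an ephemeral key;
# Bob holds an identity key and a signed prekey.
a_id, a_eph = 1234, 5678   # hypothetical private keys
b_id, b_spk = 4321, 8765

def derive(dh1: int, dh2: int, dh3: int) -> bytes:
    """Stand-in for X3DH's KDF over DH1 || DH2 || DH3."""
    material = b"".join(x.to_bytes(16, "big") for x in (dh1, dh2, dh3))
    return hashlib.sha256(material).digest()

alice_secret = derive(dh(a_id, pub(b_spk)),   # DH(IK_A, SPK_B)
                      dh(a_eph, pub(b_id)),   # DH(EK_A, IK_B)
                      dh(a_eph, pub(b_spk)))  # DH(EK_A, SPK_B)

bob_secret = derive(dh(b_spk, pub(a_id)),
                    dh(b_id, pub(a_eph)),
                    dh(b_spk, pub(a_eph)))

assert alice_secret == bob_secret
```

The point of the three pairings is deniability plus authentication: the ephemeral key gives freshness while the identity-key pairings bind the secret to both parties without a signature over the session itself.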
Privacy is good, but when public chatrooms are distributing child porn, you can’t use encryption as an excuse for not moderating. Failure to moderate illegal content is a crime.
Let the pedos run their own Matrix server or something. You can’t be knowingly providing comms and distribution to child pornographers.
Are you saying Signal uses bad encryption? I honestly can’t tell if this is sarcasm or genuine.
But where do you draw the line between catching these people and not invading the privacy of every single user of the software? Because so far no one has found a solution despite a decade or more of attempts.