• 0 Posts
  • 56 Comments
Joined 2 years ago
Cake day: March 24th, 2022

  • This is actually a good take. Kids aren’t miniature adults; they’re kids. They’re not helpless or useless, but neither are they fully morally and emotionally developed. They need guidance. Plenty of adults can’t responsibly handle internet access. I survived early online porn and gore and social media, but it’s not like any of it benefited me in a meaningful way.

    Some folks have an attitude that’s like “I touched hot stoves and I learned better”, but that’s far from ideal.

  • And a transparent price up front.

    It’s annoying enough to get in a vehicle and not know how much it’ll cost by the end of the trip (Would you do this on a bus? Would you let an airline change the price of a ticket mid-flight?), but there’s something viscerally galling about watching some asshole take a longer route just to pad out the fare. Last I checked, when Lyft or Uber gives you a price, that’s the price.

  • “the AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well… You need to feed it CSAM.”

    First of all, not every image of a naked child is CSAM. This has actually been kind of a problem with automated CSAM detection systems triggering false positives on non-sexual images and getting innocent people into trouble.

    But also, AI systems can blend multiple elements together. They don’t need CSAM training material to create CSAM, just a prompt that combines the individual elements while avoiding any safeguards.