Just a heads up to Friedman and the other site admins: Congress recently passed a law that requires sites to pull down AI-generated explicit deepfakes. https://www.congress.gov/bill/119th-congress/senate-bill/146/text/ih https://www.cbsnews.com/news/house-take-it-down-act-vote-deepfake-pornography-victims/ There are kind of a lot of them on this site, including this forum, so you'll wanna get out in front of this.
I am sure that Friedman would do that without any legislation. But chances are that there will be around zero such requests, so there is nothing to worry about or discuss.
I'm sure he would, but it makes posting those deepfakes a federal crime, and you don't want to be providing a forum for that. Consider the headaches he had just from legal stories.
As far as I see it, it's kinda like the DMCA, so the site needs to implement a mechanism (within 1 year) that allows for the takedown on request. This might be a bigger issue for people who create and post such content.
This law wasn't created to hunt people who make bad deepfakes of Emma Watson, even if it does make them even more illegal than before.
Friedman had to hide a bunch of story categories, add a word filter, and add an account-age filter just to satisfy payment processors about content that was legal. This law makes deepfakes criminal. The stakes are higher.
Your own link says what's illegal for websites: failing to delete deepfakes within 48 hours of a request. Before that point, there's nothing illegal about hosting them; the users who post them are the ones responsible. And again, while this law technically applies to celebrities, it isn't about them. You're creating a problem out of nothing. It's like claiming that Friedman must delete all fan fiction because it's illegal and breaks copyright or personality rights or both, and that otherwise payment processors might dislike it.
That's literally what happened to the NonCon, Incest, and Mind Control categories. Friedman had to modify the site in multiple ways to keep the payment processors from yanking their funds. Considering there are forums on this site filled with deepfakes, and threads specifically geared towards posting them, I think it's a risk the site admins should take seriously.
Probably pretty relevant that the site doesn't actually host most of the images in question. Just thumbnails, basically.
You keep repeating the same reworded thing without addressing responses in any way. I get your initial argument; I don't think it's actually dangerous, and I'm trying to explain why. I'll try one more time, then I'll stop responding to the same statement.

- The categories you listed got hit because company policies can differ from legal requirements. Also, those categories are outright illegal in a number of jurisdictions, and payment processors operate globally.
- Deepfakes, i.e. using a person's likeness in porn, were already illegal. At the very least, they violate personality rights. Just like now, Emma Watson or her representatives had full legal standing to demand removal of deepfakes of her. Payment processors didn't care.
- No one really cares about deepfakes featuring celebrities (unless mainstream media start covering this)... It is generally accepted that being sexualized like that is part of their job. What this new deepfake law mainly fights is stuff like posting fake porn of people you dislike to ruin their reputation.
This change may make payment processors care, even if they aren't strictly required to. And if they do start caring, they may make demands that have nothing to do with the actual content of the law. IANAL, but I wonder how this will get past the First Amendment.
I'm repeating it because you're acting like the thing that has already happened twice is somehow unlikely to happen a third time. The first thing one sees when landing on CHYOA's homepage is "Patreon pulled our funding, please sign up on SubscribeStar." SubscribeStar's Terms of Service (https://subscribestar.adult/prohibited_content) explicitly ban deepfakes:

"OBSCENITY, ABUSE, HARMS 3) Synthetic media and deepfakes, whether created with or without the use of Artificial Intelligence (aka, “AI”)."

The US is naturally the largest market for an English-language adult site, so being illegal in the US is a bigger deal. Deepfakes were previously a civil tort as opposed to a criminal charge. That's an extremely significant difference. And notably, payment processors *do* care about illegal activity, as you noted exactly one bullet higher. The law says what it says.

I'll note specifically that Friedman or an admin has to manually approve every new story that appears on the site. This is not YouTube or Reddit or Meta receiving terabytes of new content every day. The CDA Section 230 exceptions may not apply here.