AI as a tool in erotic writing

Discussion in 'Authors' Hangout' started by Hvast, Sep 12, 2023.

  1. Sthaana

    Sthaana Really Experienced

    I'm really not sure this is a good comparison.

    Digital art supplements traditional artistic skills by taking away some of the non-imagination-related drudgery (such as making symmetry, colour filling, sketching, canvas flipping etc. easier) and adding a greater element of security to the process (by allowing you to undo mistakes and keep previous or multiple versions of the same picture). A better writing equivalent to digital art would be the advent of typewriters, word processors and writing software.
    The basic process of having an image or idea in your head and using a hard-earned set of imperfect skills at manipulating flawed tools to give it shape remains the same. The thing you outsource is not the imaginative act of creation, but rather those parts of giving shape to your ideas which you find to be busywork or too costly (after all, what does one's ability to afford high-quality paints, or to write a perfect first draft instead of spending hours rewriting the whole thing on a new sheet of parchment, really have to do with the quality of what's in your head?).

    I admit, I'm not familiar with the actual process of using GenAI, but my impression is that it simply recombines other people's "borrowed" work into something that may approximate what you had in your head, in the same way that a movie adaptation may cast actors who look like how you imagined a character looked in the book, or an order at Subway may resemble the sandwich you would have made at home. That's fine for people who mostly just consoom media, but creatives, even (or perhaps especially, I'm still not sure) amateur creatives should be really wary of this imo...
     
    Last edited: Jan 20, 2025
    Gatsha likes this.
  2. Hvast

    Hvast Really Really Experienced

    You are describing all art here. It is a recombination of other people's work plus some of your own life experience. Your brain needs to take input from somewhere to produce output. You don't create something from nothing; there is no creation, only transmutation and recombination. Generative AI crudely imitates what our brains do. It is worse in almost every regard except two: it produces output FAST, and it can train on an amount of data that our brains can't process.


    Let me ask a rhetorical question. Is a film director an artist? After all, directors do nothing. They only instruct actors, cameramen, and prop makers, right?

    Now imagine a team of two who write a novel.

    Person number one plans the plot, chooses a style, does all of the worldbuilding, and instructs the other person what to write, paragraph by paragraph. Person number one also makes person number two rewrite whatever doesn't match his vision. Person number one is also not shy about editing what person number two has written. Will you agree that those two people are coauthors? What if person number two is a soulless piece of software? Will you agree that person number one is the only author of the novel?

    And yes, it partially outsources the creative writing part. But what if an amateur, do-it-for-fun writer likes the planning and worldbuilding parts but struggles through the actual typing of the boring parts?
     
  3. Sthaana

    Sthaana Really Experienced

    I would not agree.
    AI is trained on other artists' work, usually without their consent and often in direct opposition to their wishes.
    The only way to ethically use AI to create something for the public imo is if you are constantly open about using AI, which parts of the text are AI, what prompts you used and whose data was used to train the AI. A big, red, flashing sign that says "NOT MINE". There is no real way to enforce those rules and most people will simply not bother. Of course, they will still happily hoover up any likes, bookmarks, clout and maybe even money they may manage to get from such work...
    The "all art is collaborative/recombinatory" argument is already a massive cope. Sure, most artists begin by relying on other people's work (as evidenced by the amount of fanfiction written by teenagers), but after a while they develop their own style, their own creative ambitions and their own artistic path, or else go home and become a family man. With AI, such a development becomes much, much less likely, while the outward quality of the work and the clout remains the same. It's much easier for people to delude themselves into thinking people like their work, when really they just like the rough approximation of other artists' styles, like when Milli Vanilli thought that they could actually sing or when pro wrestlers try their hand at MMA.
    I get that the actual process of writing, drawing etc. can be frustrating, arduous and unrewarding and that sometimes you just want to outsource it to someone else. That's what commissioning real, human artists who put in the work to hone their craft and put themselves out there is for.
    I would compare AI to the use of steroids in bodybuilding, but even that comparison is weak, since with steroids the person using them inevitably pays the price (not to mention, you still have to actually lift weights like someone who doesn't juice). With AI, the people who foot the bill are the people who spent years and years improving, working on themselves, going hungry, learning from their masters and trying to make something with their own two hands, only to have some rando mulch their art into a digital paste, splort out more pictures than they could make in a year and say "Yeah but my elves ride on giant floating jellyfish and have an Aztec aesthetic!" as if that puts them on an even remotely similar level.

    I know this sounds really accusatory, but I really do think that the bar for using GenAI in a remotely ethical way is so stratospherically high compared to the low bar of entry and boundless human capacity for rationalization, delusion and self-justification that it's just better to steer clear...
     
    Last edited: Jan 22, 2025
  4. Hvast

    Hvast Really Really Experienced

    The human brain is trained on other people's work, usually without their consent...

    I will die on the hill that if something is legal/moral to do with our own body, then it is legal/moral to do the same with an invented external tool, and vice versa.

    I can't agree with this logic either. An experienced artist's "unique" style is still a recombination, just a selective, more skillful one. The majority of it still comes from the accumulated experience of humanity, not from the person's own life experience, which is very limited. We still use the same tropes, the same techniques, the same elements, and the same genres as the people behind the books we read, the films we watched, and the stories we heard. Most of that borrowing may be subconscious, but that doesn't mean we create ex nihilo.


    Note that AI can be tweaked to produce output in a more unique style with specific prompting, LoRAs, fine-tuning, and other techniques and tools. Also, if an AI output is processed by a human, then they imbue it with their own style.
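    Since the thread keeps circling back to what a LoRA actually is, here is a minimal, illustrative sketch of the low-rank idea behind one. All numbers and names below are made up for illustration; a real LoRA patches attention layers of a diffusion or language model, not a toy NumPy matrix:

```python
import numpy as np

# Illustrative sketch of the LoRA idea: instead of retraining a large
# frozen weight matrix W, you train two small matrices A and B whose
# product acts as a low-rank "patch" added on top of W.
d, rank = 1024, 8                           # hypothetical layer size and LoRA rank
rng = np.random.default_rng(0)

W = rng.standard_normal((d, d))             # frozen base-model weight (never trained)
A = rng.standard_normal((rank, d)) * 0.01   # trainable down-projection
B = np.zeros((d, rank))                     # trainable up-projection (starts at zero)

def forward(x):
    # Base output plus the low-rank adjustment; with B all zeros the
    # LoRA initially changes nothing, which is how real LoRAs start out.
    return W @ x + B @ (A @ x)

full_params = W.size            # parameters a full fine-tune would have to touch
lora_params = A.size + B.size   # parameters the LoRA actually trains
print(f"LoRA trains {lora_params / full_params:.2%} of the layer's parameters")
```

    The only point of the sketch is that the adapter is tiny compared to the base weight it rides on, which is why a LoRA on its own generates nothing.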
     
  5. Sthaana

    Sthaana Really Experienced

    I'm sorry, but that's an absurd take.
    The human brain is "trained on other people's work" by reading/looking at it, which is exactly what every artist wants people to do. I am thrilled by every like or comment on my work and if anyone were to tell me that my story got someone to start writing that would keep me warm for weeks. Publication for free is like the most enthusiastic form of consent imaginable!
    Even in the case of piracy, authors don't object to people reading the work and being inspired by it, they just object to not getting paid.
    I've yet to see a single artist on Twitter who is as hyped about people using their art to train AI as they would be about fanart, praise or tributes (as in, I haven't seen a single artist who wasn't opposed to the idea of their art getting scraped).
     
    Last edited: Jan 24, 2025
    TheLowKing likes this.
  6. Hvast

    Hvast Really Really Experienced

    Artists often complain that someone "stole" their style, technique, ideas, etc. So no, it is wrong to say that artists are fine with human brains being trained on their work. They are fine with it only if the gained knowledge is not used to compete with them.

    Also, you picked one tiny part of my message, taking it out of context. The argument was not about levels of artists' approval but that training a human brain and training an AI are largely the same thing, morally/legally speaking.
     
    Last edited: Jan 24, 2025
  7. TheLowKing

    TheLowKing Really Really Experienced

    The methods by which LLMs and humans learn are wholly different, and the impacts and effects are too, so treating them as if they are the same is inappropriate, even if you do think both should be legal and/or are morally acceptable.
     
    Sthaana likes this.
  8. Hvast

    Hvast Really Really Experienced

    Why would the method matter? The ways in which bows and nuclear bombs kill are also quite different, but that is irrelevant to the discussion of whether murder is OK.
     
  9. TheLowKing

    TheLowKing Really Really Experienced

    You're saying human learning and LLM learning are the same, for legal/ethical reasons. I'm saying they're different, because the impact and method are different. Since those aspects are (some but not all of the) major reasons for the backlash against LLMs, they seem pretty pertinent to me. You're going to have to do better than saying "those don't matter" and calling it a day.

    Now, admittedly I've also avoided taking your central argument head on, so let me do so now. It seems to me it largely hinges on these two words in the post that I quoted: "legally" and "morally".

    Morally, you are of course free to have your own opinion on LLMs. But something that you think is right and proper and self-evident may be utterly repellent to someone else, so we must look beyond the surface-level "I like this" vs. "I don't like this" and inspect the underlying assumptions and values of those beliefs. And, I'd say, how this technology will affect the world. Will it make things better? Worse? Odds are that it'll do a bit of both (like any technology!) and so it is up to us, as a society, to shape it in a way that benefits everyone the most. And that means coming up with social norms, setting up regulations, and instituting laws. Which brings us neatly to:

    Legally, I'd argue that we haven't quite settled on a consensus yet on whether LLM learning and human learning are the same, legally speaking (hence also this discussion). But either way, laws are prescriptive as much as they are descriptive. Famously, slavery was legal, too. Now to be clear, I'm not saying slavery and LLMs are in any way equivalent; I'm just making the point that laws are human constructs, and they can be made to serve different interests. And this fairly basic fact of political theory is very clearly recognized by the "AI" industry as well; why do you think it's been cozying up to Trump so much of late? They're seeing an opportunity to gain power (or at least avoid losing power, which is essentially the same, because power is always relative), and they're seizing it with both hands.
     
    Last edited: Jan 25, 2025
    Sthaana likes this.
  10. Hvast

    Hvast Really Really Experienced

    Of course, I express my own opinions, not some stone-cold moral facts. Let me explain my stance in detail with a practical example.


    My current AI project is to train a LoRA (a small model that is attached to a larger one) capable of drawing mermaids. It is a lengthy, multistep process, but what matters here is the first step: I need training data. I need images of mermaids. Guess my source? Googling for images.

    Many artists believe that I am morally and legally obliged to check the copyright status of every image of a mermaid I use, contact the copyright owner, and pay (or get permission in another way) for the privilege of using it in training. And I think that is utter and complete bullcrap.

    Let's start with the legal part...

    1) Do I try to copy their works, i.e. break copyright? No. My goal is the opposite: I want to generate unique mermaids.
    2) Do I try to copy their style? Actually, the question is irrelevant because copyright doesn't protect styles, genres, techniques, etc. Still, my goal is the opposite again. I am deliberately choosing images in many different styles: movie screenshots, cosplay photos, pencil, digital, watercolour. I don't want my LoRA to influence the style of the generated images.
    3) Do I try to teach my model a copyrighted concept? No, mermaids aren't copyrighted. It would be different if I taught my model how to draw a specific mermaid (say, Ariel) or a copyrighted race like a Dalek, a Klingon, or a Zerg. Then again, that is not much different from fan art. I am quite sure that training and using such a LoRA would be copyright infringement, but as long as I don't try to make money on it, I don't think the big corporations will really care. Note that I said "training and using"; merely training a model that can output copyrighted characters/races is unlikely to be infringement. Why do I think so? If it were, we would have tons of Disney/Nintendo lawyers milking billions by suing every AI company in existence. There is precedent that making something that can be used to break copyright is not a copyright violation by itself.


    And now the moral part of the question...

    Do I "steal" someone else's effort for my benefit? Well, if the answer is yes... then all artists are guilty of the very same thing. They trained their brain, their natural tool for producing images, to do exactly the same. When they want to draw a mermaid, they don't magically conjure it from nothing. They use data from their brains, and a huge majority of that data came from the efforts of generations of humanity. They took it all for free and now say, "No! It is mine. I won't give knowledge back to humanity unless it pays me!" when it is not even theirs, apart from the tiny part that comes from their own lived experience, which they should give back in return for the knowledge of past generations they have consumed. What IS theirs is the effort of making a specific creative work for others to enjoy, and we have copyright laws for paying them for that.
     
  11. TheLowKing

    TheLowKing Really Really Experienced

    Your conception of what copyright encompasses is too narrow; it goes far beyond the surface-level reading of the "right to make a copy":
    Your example is useful, so let's look at it some more, using these definitions. You're clearly not copying pictures 1:1 and selling them as your own, or displaying them on an ad-infested site or anything like that. But you are definitely producing derivative works, and when/if you publish them, then in my view you do in fact infringe on copyright. Just because you're mixing and matching thousands of images (rather than 1-2) doesn't make them any less derivative. You have a set of inputs and a set of outputs, but there's no magic "remove copyright" step in between. So if your inputs are covered by copyright, then your outputs infringe on that copyright. Simple, right?

    Well... actually, no, it's more complicated than that: even if you painstakingly ensure that all the images you use are copyright-free, your outputs are still tainted. Because what even are your inputs? As you know, a LoRA is just fine-tuning, so it's only a thin layer on top of an existing model. The LoRA alone is useless, it generates nothing, it builds on top of a model's vast network. That underlying model was (almost certainly) trained on illegitimately acquired data, and since the model evidently has been/is being published/distributed (you got your hands on it, after all), its authors did/do infringe on copyright.

    Now we're only one small step away from what (to me) is the crux of the issue. Because even this fourth-rate, freely available model that you use to create smut for fun frankly... doesn't concern me. I just don't care.

    What I care about are the top-of-the-line models. And now we're no longer talking about hobbyists generating AI porn in their basements, nor crowd-funded operations to upgrade that fourth-rate model you use to a third-rate model. Now we're talking about the OpenAIs, xAIs, and Meta AIs of this world (as well as a whole slew of lesser known mostly Chinese companies). Companies slated to make literal trillions of dollars by robbing countless artists of their hard work, not to mention basically every person on the planet who used a web forum. The issue, to me, is that this happens on an industrial scale, and it concentrates power in the hands of a very small number of already very rich people, because training a top-of-the-line model from scratch is expensive. We're talking hundreds of millions of dollars, if not billions. Only people who are already very rich can afford it, and so only people who are already rich can benefit from it, and that shit doesn't trickle down. And even worse than that: they also have the ability to decide what kinds of results the models put out, and they've already amply demonstrated their willingness to do so.

    This is what I mean when I talk about "impacts" and "effects": such concentration of power is and always has been extremely dangerous to our liberal democratic political system, and thus to our society; in fact, I'd argue it's the most dangerous thing, full stop. Cautionary tales abound: from 19th-century robber barons to the Nazi German industrial oligarchy, from the worldwide banking sector (we literally called it "too big to fail", holy shit!) to current-day Silicon Valley techno-fascists. These people need to be fought, tooth and nail.

    You'll note these problems simply don't occur when ordinary human beings with fingers and eyes and brains take inspiration from a Monet painting or a Bach piece or a Tolkien novel to make something new, nor even when they painstakingly 100% exactly reproduce it, nor even if they then go on to sell it for millions of dollars. That's why they should be treated differently.
     
    Last edited: Jan 26, 2025
    Sthaana likes this.
  12. Hvast

    Hvast Really Really Experienced

    Distributing, adapting, displaying, and performing are all forms of copying or partial copying. Sure, unlawfully displaying a copy is not exactly copying, but it is close enough. Nothing in this list goes beyond the right to copy. For example, the list doesn't include analyzing, disassembling, learning from, etc.



    Quoting Wikipedia: "In copyright law, a derivative work is an expressive creation that includes major copyrightable elements of a first, previously created original work (the underlying work)."

    What major copyrightable elements are we talking about? You won't even be able to tell which images I used in the LoRA that generated the image. No, whatever I generate is not derivative. As I said, my goal is the opposite: make something that generates unique mermaids.

    1) That is not what a model does. It doesn't store parts of thousands of images. It stores knowledge/information gained from training on hundreds of millions of images. The model can't mix and match images because it doesn't contain them. Stable Diffusion models are under 10 GB in size; you can't pack millions of images into them.

    2) Actually, it does. For something to be derivative, you need major copyrightable elements of a creative work, not just any elements of a creative work. Those are not the same. Just because your mermaid has big tits and my mermaid has big tits, it doesn't mean that my mermaid is (partially) derivative of yours, even if I got the idea of making my mermaid well-endowed after looking at yours. And even if that 1/1000th of your image is somehow copyrightable... it is not major by definition. But again, that is not what models do.
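    The capacity claim in point 1 can be checked with back-of-the-envelope arithmetic. The checkpoint and dataset sizes below are rough, illustrative ballpark figures, not exact numbers for any specific model:

```python
# Rough arithmetic behind the "the model can't contain the images" claim.
# Both numbers are illustrative ballpark figures, not exact measurements.
model_size_bytes = 4 * 1024**3      # a ~4 GB Stable Diffusion checkpoint
training_images = 2_000_000_000     # ~2 billion training images

bytes_per_image = model_size_bytes / training_images
print(f"~{bytes_per_image:.1f} bytes of model capacity per training image")
# A couple of bytes per image is far too little to store even a thumbnail,
# so the weights must encode generalized statistics rather than the images.
```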

    Well, if they did infringe copyright... tell me why Disney and the other copyright-owning corporations haven't destroyed the creators of those models in court? You are forgetting that most of the copyrighted stuff in the world belongs not to individuals but to corporations.
    Also, I see nothing illegitimate in looking at publicly available data and analyzing it.

    Can you explain to me why exactly a company should pay me for this very post on the forum after I voluntarily placed it here for everyone to read? If I have a right to demand payment from a company, don't I also have a right to demand payment from you?

    To fix this issue, you want to grant immense power to the companies that own copyrighted content. You know what will happen if we pass a law saying that everyone must ask the copyright owner's permission to train a model on copyrighted data? You will hand out sweet monopolies. Why would Disney ever allow anyone to train a model on their stuff if they can train their own model and get a huge competitive advantage? I think Disney owns so much material they could train models on it alone, without even touching the public domain.
     
  13. TheLowKing

    TheLowKing Really Really Experienced

    You have still not provided a shred of evidence to support your claim that LLMs and human brains learn in the same way, despite my numerous requests for you to produce some.

    You are only barely nibbling at the edges of the second half of my argument, which is that we should also treat these systems differently because their effects are different; in particular: the concentration of power in the hands of the few to the detriment of the many.

    Until you do the former, or address the core of the latter argument, I see no point in making another long post.
     
  14. Hvast

    Hvast Really Really Experienced

    My claim? Perhaps I worded something in a crude way. Why would I say such an absurdity? They both learn. I don't claim that they learn in the same way.


    I don't think it is very relevant. Nor do I think the effects are very different.

    Only the scale of the effect is different. Just because the new technology will multiply the amount of content produced, and some people will benefit financially from it a lot, it doesn't mean that producing A LOT of content is somehow illegal when producing smaller amounts of content is legal. Morality is another matter; it is subjective. You may advocate for new laws to limit the extent to which one can learn from others, or to exclude non-human-brain ways of learning, or whatever.

    But you claim robbery (copyright infringement that you incorrectly label as robbery). And you had better provide evidence that training models and the subsequent generation of content breaks either existing copyright laws or the spirit of what copyright is. After all, we are several years into the generative AI era; that is plenty of time for courts to come up with something if it is illegal. It hasn't happened even in countries with harsh copyright laws.

    Even if we assume that it will be a (near) monopoly, the few will produce something quite useful for humanity. It is not to the detriment of the many. I am not a fan of huge companies and their political power either, but I prefer Microsoft and Apple existing to not having Windows and smartphones. I also prefer a monopoly power company to not having electricity at all.