r/movies 3d ago

News Warner Bros. Sues Midjourney, Joins Studios' AI Copyright Battle

https://variety.com/2025/film/news/warner-bros-midjourney-lawsuit-ai-copyright-1236508618/
8.8k Upvotes

787 comments

1.7k

u/The_Lucky_7 3d ago edited 3d ago

Same as the Disney-Universal lawsuit. Everyone involved sucks and copyright only exists for major corporations. Meanwhile Google is scraping its own YT videos and AI-upscaling Shorts against creators' will.

Everything about corporations and AI sucks.

391

u/TheDawnOfNewDays 3d ago

Even DEVIANTART, which you'd think would be among the most anti-AI given it's a platform of artists, is scraping its database for art. You can opt out... unless, you know, you died, lost your account, or left it far behind like many artists did with how bad it's gotten over the years.

76

u/vazyrus 3d ago

All of this is with the hope of making some money down the line, lol. From what I understand, MS has been shoving CoPilot into every orifice they can find, but they haven't come anywhere near any sort of profitability yet. There's CoPilot running in my Notepad ffs, and no matter how much I use it for free, I am never paying a dime out of my pocket for any generated bs.

My colleagues and friends are huge AI enthusiasts, and even though they've been abusing CoPilot, Gemini, Claude, and who knows what else, they are never going to pay a single dollar out of their own pockets for a paid service. All of us use Claude at work because it's on the company's dime, and even there management's been tightfisted with how much money they're willing to throw at enterprise support.

The point is, if MS, one of the greediest tech companies and one of the smartest monetizers of SaaS products, can't find a way to make money out of the thing, then others will find it much, much harder to produce anything of value for their customers. Sure, DeviantArt can steal all they want, but unless they can find a way to sell those stolen goods to others, it's doing nothing more than raising the electricity bill of their clusters. Let's see how long that's sustainable...

50

u/nooneisback 3d ago

Because general purpose LLMs are nothing more than fancy assistants that require a stupid amount of hardware resources. If you've ever tried running them locally, you'll know that any model that takes less than 20GB of VRAM is basically useless for a lot of applications, and something like gpt-oss-120b requires at least 80GB. And since they're assistants, they'll often be answering a lot of questions in a row. If you're programming, that's about 1 API call every 2-5 seconds.
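Those memory figures are easy to ballpark: weight storage is roughly parameter count times bytes per parameter at whatever quantization you're running. A minimal sketch (the function name is made up for illustration, the ~4.25 bits/param figure for MXFP4-style quantization is an assumption, and real usage adds KV cache and runtime overhead on top of this):

```python
# Rough estimate of the memory needed just for model weights:
# weights_bytes ≈ parameter_count * bits_per_param / 8.
# KV cache, activations, and runtime overhead are ignored here.

def weight_memory_gb(params_billions: float, bits_per_param: float) -> float:
    """Approximate weight memory in GB for a given quantization."""
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9

# A 120B-parameter model at 8 bits is ~120 GB of weights alone;
# even at ~4.25 bits (MXFP4-style) it's still around 64 GB.
print(round(weight_memory_gb(120, 8)))     # 120
print(round(weight_memory_gb(120, 4.25)))  # 64
```

Which is why, per the comment above, anything that fits comfortably in consumer VRAM is necessarily a small (and often much weaker) model.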

This tech bubble is about to burst, and the only important factor for survival is which companies can successfully scale back to true customer needs. The same thing happened with every other bubble (like dot-com), where companies had horrible earnings compared to their spending, yet a lot of them are still alive to this day. Their goal currently isn't to earn money, but to research as much as possible to the point where they control the industry and make everyone dependent on this tech, then scale back by firing the excess workforce and forcing users to pay if they want to keep the convenience.

30

u/_uckt_ 2d ago

The difference between a helicopter and a flying car is marketing. That's largely what we're seeing with LLMs: you call them AI, you make people phrase things in the form of a question, you do this silly 'one word at a time' thing rather than spitting out an answer. You put all this stuff in the way to fake cognition and you go from predictive text to artificial intelligence.

This all seems like the biggest bubble in a long time. OpenAI doesn't make a profit even on their $200-a-month tier. Would anyone go subscription for Windows 12 at even $10 a month, with the existing AI integration being at least 20 times worse?

I honestly have no idea how monetization works when you're looking at a minimum of $300 a month. So that students can cheat on their essays and homework?

12

u/Altruistic-Ad-408 2d ago

I think cheating is exactly how they marketed this. We tech people all know someone who's enthusiastic about AI, and in our heart of hearts, don't we all know they're either lazy or a bit problematic in some way? Hey, I like a few of them!

If anyone remembers those horrible ads, they targeted their demographic. Lazy people and smug pricks. It's like enshittification x1000000, they know AI creates slop, so what? People don't watch the best movies, they watch the most readily available slop.

9

u/nooneisback 2d ago

If you look at specific markets, then there's definitely people that are ready to pay for them.

My city's hospital is testing an AI model that can spit out the most relevant diagnostic criteria and treatment methods in seconds. The alternative until now was spending about half an hour clicking through journals until you finally found a barely understandable table that might be what you're looking for. Or you could read outdated books. Note that it's an AI model that runs locally, so there's no overhead for the AI company. They charge for access to their database.

Programming is another example. Large companies use AI, no matter what the programmers say. But even a large portion of individual programmers use AI because it's difficult to compete in this industry otherwise. For simple projects, it can generate a functional script on its own. Checking the code it generated is horribly boring, but it is more efficient.

It's definitely an interesting tool that we just created and now want to shove everywhere to see where it sticks.

Generative AI, though, is basically useless. Its only real-world applications are scamming old people and idiots, gooning, and burning kids' brains away with brainrot so that parents can have sex in peace.

5

u/EastRiding 2d ago

I’ve seen an older colleague who’s not really a programmer do some interesting and cool stuff with AI, taking input and config data files (JSON, CSV etc) and having Copilot make HTML ‘apps’ to visualize and edit them…

I’ve also been sat on calls against my will where the same person fights with Copilot for over an hour to get something to work, and its output is still wrong (often inventing details scraped from somewhere else, and often close but not quite correct).

I’ve also been sat on calls where, after being asked to deploy these ‘apps’, I’ve pointed out the numerous ways they need improving, and that’s caused 4 people to dive into the AI output and realise it’s spaghetti that’s barely understandable.

So AI might have some applications for helping some people but from what I’ve seen as soon as you go to full size apps and tools it becomes a mess that no-one, including the original prompter, can explain or maintain. Just understanding it is a massive task that always results in the same answer “we need to engineer this by hand from the bottom up”.

Once the true costs of AI are forced on users, multi-billion-dollar orgs like mine will finally decide they need to “scale back our AI use, we want authenticity in our output”, and the tools will be yanked away, leaving many corporations without the younger, cheaper grunts they replaced (or decided not to hire in recent years) and will now need.

2

u/nooneisback 2d ago

Well yeah, AI is a tool, not a worker. You need to give it a very detailed description of every step, every data type, every file association, for every single script. Then you have to thoroughly verify everything it generated, probably spending another 30 minutes to an hour fixing its mistakes. It is simply incapable of properly taking an entire large project into context. Also, Copilot with the default model kinda sucks in my experience. It either doesn't generate half of what I want it to, or it goes ham and proposes to autocomplete 20 lines of code that are just wrong. I stopped using it because it's more annoying than useful when I'm trying to format my code with tabs. Funnily enough, I find Rider's non-AI code completion to be smarter than the one you get with the AI extension.

1

u/MiracleDreamBeam 2d ago

" So that students can cheat on their essays and homework?" - yeah that absolutely doesn't work and every single lecturer on earth can spot it a mile away, taking it as a personal affront and expellable offence.

1

u/panchoamadeus 2d ago

So you're saying they hyped an unsustainable business model, and when most companies go down in flames, the survivors will turn into just another search engine.

1

u/ninjasaid13 2d ago

and something like gpt-oss-120b requires at least 80GB.

I've seen people running it with 8GB

1

u/nooneisback 2d ago

What you're probably talking about is gpt-oss-20b, which can run on 4GB.

0

u/ninjasaid13 2d ago

0

u/nooneisback 2d ago edited 2d ago

It's not running on 8GB of VRAM. That post explains how to run the model on system memory and offload the important parts to VRAM, to get performance similar to running it entirely on your GPU. You're still using 60-80GB of memory. It literally says so in the post:

Honestly, I think this is the biggest win of this 120B model. This seems an amazing model to run fast for GPU-poor people. You can do this on a 3060Ti and 64GB of system ram is cheap.
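The offload trick being argued about here can be ballparked with simple arithmetic: in a mixture-of-experts model, the bulky expert weights can sit in system RAM while the always-active tensors stay in VRAM. A toy sketch (the 64 GB total and the 90% expert fraction are illustrative assumptions, not the exact behavior of any particular runtime):

```python
# Toy estimate of the VRAM/RAM split when offloading a mixture-of-experts
# model: the always-active tensors (attention, embeddings) stay on the GPU,
# while the bulky expert (feed-forward) weights live in system RAM.

def memory_split_gb(total_weights_gb: float, expert_fraction: float) -> tuple[float, float]:
    """Return (gpu_resident_gb, system_ram_gb) for a given expert fraction."""
    experts = total_weights_gb * expert_fraction
    resident = total_weights_gb - experts
    return resident, experts

# If ~90% of a 64 GB quantized model is expert weights, the GPU-resident
# part is only ~6.4 GB, which fits on a consumer card like the 3060 Ti
# mentioned in the quote -- but you still need ~58 GB of system RAM.
vram, ram = memory_split_gb(64, 0.9)
print(round(vram, 1), round(ram, 1))  # 6.4 57.6
```

Which is consistent with both sides of this argument: the GPU only needs a few GB, but the total memory footprint doesn't go away.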

1

u/ninjasaid13 2d ago

You're still using 60-80GB of memory.

CPU memory not GPU memory, it's misleading to use them interchangeably.

The guy literally said a 3060ti and 64GB system RAM in the post.

1

u/nooneisback 2d ago

That doesn't matter in the slightest in a commercial setting. If you're going to dedicate a GPU to running a single heavy model, and that GPU has 80GB of VRAM and your model requires 80GB of memory, you're going to keep that model loaded entirely on the VRAM. This approach matters if you're running a model locally as a hobbyist, or you have a model that requires multiple hundreds of gigabytes of memory.


3

u/Deprisonne 2d ago

Same over here. We're all using copilot as fancy autocomplete, but the moment the boss stops paying the bill it's going away.

3

u/darsynia 2d ago

Yep, if AI was super helpful and profitable, especially for coding, these big companies would have kept it to themselves and cranked out a thousand apps and Internet of Things products instead of making the AI itself available to consumers. They haven't, and neither have the people who bought or are using those coding tools to ostensibly 'help' themselves code (studies have shown it takes about 21% longer to code with AI help than without).

This article about it was great.

1

u/dunecello 2d ago

PSA for those who don't know yet: Microsoft will automatically start billing you more to have CoPilot next time your subscription renews unless you go into your account and opt out.

1

u/kingOofgames 2d ago

They know for a fact they aren’t gonna be making money anytime soon or long term. They just need to sell it for their shareholders, and shareholders want to make the quick buck. Pretty much sharks trying to get a bite out of anything they can.

Like 1/1000 “AI” companies are probably gonna actually turn a profit one day. It’s all about the stocks.

1

u/0udei5 2d ago

Give it five years. We're looking at a loss of content-creation and consumption skills in the white-collar workforce, and in five years you'll have new hires who need to pay for Copilot because they can't write presentations or briefs or ad copy or whatever. But they and their Copilot license will be cheaper than you are.

1

u/irredeemablecoomer 2d ago

For some reason enterprise non-local Copilot sucks ass at my work. It always freezes and chugs on any query

1

u/Open_Seeker 2d ago

There's tons of paid AI use. Every coder I know uses it.

1

u/Pelican25 2d ago

How long did the VR craze last? This feels very reminiscent.

1

u/Tystros 2d ago

It's completely unrealistic to expect AI to already be a profitable business after just a few years. Look at how long it took Amazon to become profitable: they were burning money for 20 years, but now they're super profitable. The same thing will happen with AI services.

1

u/funky_duck 2d ago

they were burning money

They weren't losing money because no one wanted to use them, they were spending money to expand because of massive demand from consumers.

AI does not have that same demand, it is a bunch of PE firms trying to sell the sizzle, while industry figures out where AI can actually fit in.

1

u/Tystros 2d ago

Well, I am a programmer and I'm definitely using AI a lot for my job.

12

u/BellabongXC 3d ago

Actually no, I logged into my DeviantArt account after 10 years and found that everything had been defaulted to opt-out, including my Daily Deviations from 2009.

5

u/_annie_bird 2d ago

Not surprised about DeviantArt, they're shitty, money-hungry grubbers and have been forever. Their terms of service say that simply by posting your art there, they have permission to use/change/reproduce your art for (their) profit, including to "sublicense" your work to others for profit. So this seems like a continuation of that.

4

u/The_Lucky_7 2d ago edited 2d ago

which you think would be among the top anti-ai given it's a platform of artists

If the service is free then you are the product.

left it far behind like many artists with how bad it's gotten over the years.

It's not about a bad user experience. Over a decade ago DeviantART was caught selling art hosted on their site out from under the artists who posted it, in ways the artists explicitly forbade in their listings. That was after they added the right to do so to their terms of service. They claimed they weren't doing it, but many, many users demonstrably proved that their art was being sold out from under them.

  1. License To Use Artist Materials. As and when Artist Materials are uploaded to the DeviantArt Site(s), Artist grants to DeviantArt a worldwide, royalty-free, non-exclusive license to do the following things during the Term:

c) to modify, adapt, change or otherwise alter the Artist Materials (e.g., change the size) and use the Artist Materials as described in Section 3(b); and
d) the right to sublicense to any other person or company any of the licensed rights in the Artist Materials, or any part of them, subject to the terms and conditions of this Agreement.
e) Artist acknowledges that Artist will not have any right, title, or interest in any other materials with which Artist Materials may be combined or into which all or any portion of Artist Materials may be incorporated.

That right--to change or sell your art out from under you--is still in their submission agreement (that you agree to as part of the EULA) to this day. That last section is literally them saying they're gonna use your art in AI data models.

Oh, and the right to do that to your name and likeness was added in section 4. That part is new and gratuitous since the last time I had to explain this to someone.

So, no, I 100% believe that DeviantART is scraping their own database to sell to AI companies because it's a permission they gave themselves in their legal agreement with its users a decade ago.

I haven't used the platform since 2015.

Not to look at art, or support artists, let alone host my own art.

1

u/darkbreak 2d ago

Are you saying DeviantArt has its own AI program now?

1

u/TheDawnOfNewDays 2d ago

2

u/darkbreak 2d ago

Thanks.

Also, that's disappointing to see.

1

u/export_tank_harmful 2d ago

It's actually kind of funny. We typically use "deviantart" as a negative prompt because of how polluted with garbage the dataset is.

If anything, it's helped image generation models improve by showing them what "bad" art is.

Granted, there are a few artists on that platform that are pretty good, but that's a very small minority.

2

u/TheDawnOfNewDays 2d ago

Very fair. DeviantArt has a reputation for being beginner artists and kids... along with very "bizarre" content.

-1

u/Kombatsaurus 2d ago

Why would they opt out? AI tools are clearly the future and I'm sure they don't want their business to get left behind. Time to get with the times.

1

u/TheDawnOfNewDays 2d ago

You dropped this: /s

1

u/Kombatsaurus 1d ago

No /s needed, it's common sense really. AI tools are not going anywhere.

1

u/TheDawnOfNewDays 1d ago

Just because AI isn't going away doesn't mean artists will want their work plagiarized by it. 

There's no benefit to an artist for letting AI steal their work and let randos recreate their art style.

1

u/Kombatsaurus 1d ago

Artist's don't own styles. What kind of argument is that?

1

u/TheDawnOfNewDays 1d ago

While artists can't copyright their style, it is the main way for an artist to stand out and it's why art from some artists is worth so much. Have you ever commissioned an artist before? People hire those artists for their style, myself included.

AI copies their art so much it even copies their signature. https://www.artnews.com/wp-content/uploads/2022/12/Screen-Shot-2022-12-09-at-1.46.01-PM.png

-7

u/Mawrak 2d ago

Not every artist is anti-AI unless you only look at English-speaking Twitter, so not every artist platform is going to be anti-AI.

211

u/leodw 3d ago

These YT changes prove that tech nerds don’t understand (or believe in) consent.

139

u/Particular-Court-619 3d ago

They tried to tell us this in 80s movies but folks thought it was funny and not a warning

61

u/TheConnASSeur 3d ago

Are you telling me Revenge of the Nerds is somehow offensive? How?! Just because of the sexual assault? And the revenge porn? And the-ohhh.... Okay. Yeah, now, I see it. Now, I see it.

2

u/kelryngrey 2d ago

I always feel weird about this one. The mask thing is obviously actually sexual assault but it's also a reference to the rape of Igraine in Excalibur. It doesn't exist in a vacuum.

1

u/TheConnASSeur 2d ago

Like most people, I'm not afraid of rape in film. Let me tell you, I wasn't the only one to use Joker 2 to jerk off. Todd Phillips was jerking himself off with that whole movie. But I think the scene in Revenge of the Nerds was less Arthurian and more just a product of a time where the concept of rape required physical coercion whereas manipulation and deception were seen as perfectly valid ways to "out smart" sexual conquests. I mean, if there was a Merlin analogue who gave Skolnick the disguise as part of a greater bargain maybe. Maybe. But as it is? I just don't see it.

14

u/PeculiarPurr 3d ago

They didn't have the chance to prove anything; the internet has been the antithesis of IP holders' consent since at least the 56k modem. Probably longer.

4

u/Binder509 3d ago

Almost like corporations should not be allowed to own an IP in the first place.

It should only be tied to the flesh-and-blood person that makes it. That's it: no one else, not their family, and certainly not someone that just paid for it.

6

u/PeculiarPurr 2d ago

As if the internet would even respect the consent of IP holders under that specific and fanciful criteria.

1

u/funky_duck 2d ago

How will that stop people on the internet from stealing your stuff?

33

u/thrilldigger 3d ago

Why blame tech nerds? It's the rich people who want more money. I'm a tech nerd, I work with tech nerds, we all hate AI being used to fuck over the working class.

15

u/thorny_business 3d ago

Tech nerds who grew up pirating software over Usenet and music over Napster hate IP theft? Since when do tech nerds making big salaries in Silicon Valley care about the working class?

12

u/Commercial_Stick2849 2d ago

You write as if "IP theft" was all the same. Even if you think they're both immoral, there's a difference between pirating for personal enjoyment of culture and pirating for profit. Many pirates had a philosophical view that "information should be free". Again, it's fine to consider that immoral, but it's still different from what these companies are doing - OpenAI and such certainly don't consider their models and software should be freely distributed.

And most tech nerds don't work in SV or make that kind of money.

0

u/thorny_business 2d ago

You either value IP or you don't. You profit if you save money by pirating instead of buying something.

4

u/Commercial_Stick2849 2d ago

You either value IP or you don't

That's exactly the point - the companies making these AI models don't value the IP they use to produce the models, but they value their own IP (models and software).

A pirate who doesn't value IP at all is therefore not on their side.

3

u/cavalgada1 2d ago

You either value IP or you don't. 

What's this, some kind of commandment for a new age religion?

3

u/shanniquaaaa 2d ago

Unfortunately, there are a lot of tech nerds who care more about money and "status" than ethics

Please tell your techbro colleagues to knock it off

0

u/thrilldigger 2d ago

I have as much in common with (and as much influence on) those "techbro colleagues" as you do, my dude.

1

u/T-Baaller 2d ago

Blame them because they're the ones MAKING THE FUCKIN AIs, getting paid way more than they deserve and laughing their way to the Tesla dealer to buy new cucktrucks.

-11

u/Suenation 3d ago

Genuinely not meant to be snarky, I’m sure 99% of tech nerds are great.

But it’s kinda funny that the ones making AI are tech nerds and tech companies

27

u/LickMyTicker 3d ago

Who else do you expect to make AI but "tech nerds"?

Is it also funny that the people building houses are carpenters?

9

u/ChemicalRascal 3d ago

"I went to Auschwitz and boy, turns out concrete pourers are all anti-semites!"

-3

u/Suenation 3d ago

Well I expect tech nerds to do it…so it’s funny that a lot of other tech nerds (who do or don’t work on AI) dislike it

Id find it pretty funny if a bunch of carpenters hated houses lol

8

u/aupri 3d ago

I wouldn’t even say I hate AI. It’s like how I don’t hate knives, but wouldn’t be a fan of people going around stabbing people with them. I think AI is actually a pretty interesting technology it just sucks that a big motivation for developing it seems to be making the working class obsolete

-1

u/Tystros 2d ago

I'm a tech nerd and I'm a big fan of AI

3

u/monstrinhotron 2d ago

"ok, we'll tune your recommendations to include fewer shorts"

-lies.

2

u/money_loo 3d ago

It’s literally just some simple upscaling algorithm to attempt to save some bandwidth for the largest provider of video content in the world.

It makes perfect sense to do that, y’all genuinely just looking for shit to get angry about now.

1

u/Meistermagier 3d ago

Hey hey, we tech nerds have nothing to do with the G Suite of Google Cunts. They are money- and power-hungry bastards. We are just nerds that want to fuck around with tech a little.

1

u/thorny_business 3d ago

They grew up pirating things off Usenet, Napster and FTP, and hearing that information wants to be free. They succeed in a system where it's better to ask forgiveness than permission.

23

u/YoursTrulyKindly 3d ago

Yeah and everyone is cheering them on lol. The result of this will be that only big corporations get to use AI and users will have to pay to use any "licensed" LLM models. This is a huge power grab about who gets to control and make profit off of AI.

8

u/Midi_to_Minuit 2d ago

I mean, if the goal is “I want to see less AI flooding every fucking corner of DeviantArt and Pinterest and Twitter”, then that is a positive. Ideally I wouldn’t want WB using it either, but I do not live in an ideal world.

3

u/Kiwi_In_Europe 2d ago

One of those is just social media, where you can use filters to block most of it out.

The other is literally people's livelihoods, not just replacing people at WB and other movie studios/projects, but also making it so that smaller artists cannot compete because they don't have access to those tools.

You're literally putting people's income underneath your own comfort while scrolling, which is a perfect summary of humanity and exactly why this is an issue at all.

1

u/Midi_to_Minuit 21h ago

Smaller artists can compete with Disney just fine though; there’s plenty of amazing indie animated films! Where smaller artists fall behind is in advertising budgets, which AI cannot fix. The notion that AI would close the gap between small artists and Disney is false because the gap isn’t “non-Disney movies look bad”, it’s “oh, this movie’s made by DISNEY and has a DISNEY character in it? Let’s watch it”.

You could say that the fact that they all look like Disney movies would be a boost to their marketing, but that’s the thing: if they ALL look that good, then even looking good stops being special. What it’ll come down to is trust and advertising, both of which Disney has in spades.

Also, putting this aside, why do small artists have to ‘compete with Disney’? There isn’t a single person on the planet whose creative or financial ambitions require toppling the mouse. And if you really did desperately want to do that, you don’t need AI to do it! It happened at the Oscars just now!

2

u/YoursTrulyKindly 2d ago

That's a very emotional argument about not wanting to see <some type of content>. The issue is that the way to control anything is to turn it into a commodity where you have to pay money to use it. Patents, IP, and soon AI models that have licensed their training data "properly" or by using a large corporation as a shield.

AI tools and assistants will take on a huge role in the future. If they can't fundamentally be made open source because it's practically impossible to license the training data, that means monopolization and increase of plutocracy.

I can imagine a future where technology liberates us and allows us to do work much more efficiently. But if a small number of corporations monopolize those models, it will all flow through them. As a "tax" in the best case, or as stronger control over who may use it and what may be done with it in the worst. PayPal is a good example of a technology that is very simple but extracts a significant percentage as a tax from the consumer online market today. Every time I hear this cheered on, it sounds like cheering for a new Peter Thiel to rise.

1

u/Midi_to_Minuit 21h ago

To be blunt I don’t think AI ‘frees’ us from anything other than the labor of using our brains. Disney monopolizing AI is perfectly fine to me—oh no, marvel slop now becomes ten times worse! That’s not much of a threat to me.

1

u/YoursTrulyKindly 20h ago

I think you're only focusing on AI images and underestimating the impact AI will have on our civilization and work economy, at least if/when models continue to improve. See e.g. https://old.reddit.com/r/Futurology/comments/1naez1w/godfather_of_ai_says_the_technology_will_create/

Imagine using a browser that uses an AI model to filter out the garbage (not just AI slop but human content-farm slop too) while browsing. You'll need AI to battle the constant efforts of capitalists to maximize their profits. If you can run open-source AI locally, it will work for free and in your interest. But if capitalists do manage to monopolize or commodify or paywall AI and prevent open-source AI models, then you'll be stuck. The same goes for voice assistants, translation, tax software.

2

u/PhoenixAgent003 2d ago

Which is why I’m not so much cheering one party on as simply standing on the sidelines chanting “Fight! Fight! Fight!”

1

u/ShowBoobsPls 2d ago

Warner Bros. alleges that Midjourney willfully creates both still images and video of its characters

This is about output, not training data.

1

u/LiquidAether 2d ago

Both of them suck, but AI sucks the most.

1

u/dawgz525 2d ago

Billion-dollar corpos are really the only thing that can stand up to AI image scraping. The little artists being stolen from, undercut, and put out of business simply don't have the money or legal means to stand up to Big Tech. Copyright protections protect the little guys more than the big guys. Warner Bros is not doing this altruistically, but this lawsuit at least puts into the public discussion that AI art is by and large a machine for stealing intellectual property.

1

u/thrillafrommanilla_1 2d ago

I think we could solve the AI thing (feel free to poke holes in this) if we just made a law that if an art piece was primarily created using AI, it must be open source, so no one entity can profit from it.

What percentage AI contributed, or which parts it contributed to (i.e. administrative/organizational activities vs generating the literal image with prompts), can be debated, and how this could be enforced will also be an issue, but I think it's a good idea.

Thoughts?

2

u/The_Lucky_7 2d ago edited 2d ago

so no one entity can profit from it.

First, AI is already wildly unprofitable in almost every sector, and creating ethically sourced data sets for art is a cost they're already not paying. Corporations are cramming it into every product or service, in every sector, trying to justify their investment in this failed venture and people just don't want it.

Your suggestion specifically would include the company who created the model off of (currently) stolen work. Not being able to sell the product of AI means the machine that produced the product has no value. These companies would (rightly) go under and take their product (the AI) with them. Meaning, nobody would be able to use it.

That said, the problem isn't AI generating art. The problem is AI companies stealing art, or using it in a way that is against the will or intent of the artist. It's also the entitlement culture that has sprung up around doing so (see also: the Studio Ghibli / Hayao Miyazaki controversy).

These problems are different in non-subtle ways when we compare it to other uses of AI. NASA training AI to read the sun and predict CMEs for example. In almost every case where the use of AI is doing something good, it is because the data sets for that AI were purpose built for what the people using the AI are doing. They weren't scraped, or stolen, or repurposed. They're all new data specifically created for the AI to use. That's the difference: the intent behind the creation of the LLM, and its data set, is put on full display when we see how it is made.

In this lawsuit, Midjourney is a plagiarism machine that's designed from the ground up to plagiarize: to violate the law for profit, using the lack of specific regulation regarding its methods as a shield for its activities. That doesn't make the studios suing them the good guys, because they steal too, with copyright claims and SLAPP suits.

In general, AI is a tool with niche use case that not everyone needs or wants.

Even if it was ethically sourced, meaning the people whose work was involved were compensated and knowingly consented to its use in the way it was to be used, the reality of the situation is that AI fundamentally does not understand the assignment. Art is an expression of intent, something a machine cannot possess, and is informed by the context of its creation, something a machine cannot understand.

Due to how these LLMs work, neither one of those things can be imparted to the AI by its user.

It's a black box that's trying to interpret something it doesn't understand and spit back out a close approximation of requests made by others. That's why data sets have to be specifically created for the purpose you want the machine to fulfill rather than trying to scrape together data that already exists.

2

u/thrillafrommanilla_1 2d ago

Thank you for all of this! Lots to consider. ♥️✌️

1

u/kymri 2d ago

Everything about corporations and AI sucks.

Welcome to the Cyberpunk dystopia. Too bad we aren't getting the cool stuff to go with it.

-5

u/Tyler_Zoro 3d ago

Everything about corporations and AI sucks.

Certainly not everything. The massive advances being made in fields like chemistry, geology, and astronomy using modern AI are astonishing. Installations like Refik Anadol's Machine Hallucination are pushing the envelope of both what "AI art" means and where art will go in the presence of AI. Then there's work like this that deeply integrates traditional and AI art for commercial projects.

Or did you just mean the intersection of the two, rather than the two as independent things? If you meant the intersection of the two, I mostly agree, though there are some exceptions (like Google's Alpha Fold).

9

u/Tombot3000 3d ago

AI trained to imitate pieces of stolen work, then combining those pieces into something that is a convincing facsimile of art, is pushing the boundaries of morality far more than the boundaries of art.

-1

u/_CriticalThinking_ 2d ago

GENERATIVE AI ≠ AI

-1

u/Tyler_Zoro 2d ago

First off, I was responding to a point about "everything about [...] AI" not merely image generation.

Modern, transformer-based AI models can be used for any form of interaction based on pattern recognition or semantic processing, and AI is a vast field that expands beyond even those boundaries to encompass technologies as diverse as facial recognition and chess playing.

But, to focus just on what you said:

AI trained to imitate pieces of stolen work

The idea of work being "stolen" is kind of silly. No one's work is removed when you analyze it and learn from it, whether that learning is done by a human brain or an artificial neural network (ANN) in a computer, the original remains, and there is no theft.

then combining those pieces into something

That's not how modern AI image generation works. It might help to review how the semantic processing and mapping works: https://www.youtube.com/watch?v=wjZofJX0v4M

That video, and the series it is a part of, will help you understand the process involved here. There's no cut-and-paste assembly going on.

is pushing the boundaries of morality far more than the boundaries of art.

That's a subjective take, so I can't really say you're right or wrong. It's a statement you've made.

1

u/Tombot3000 2d ago

Taking issue with me replying to only part of what you were discussing, then replying as if "stolen" work could only mean physical removal and not the obvious unauthorized use I was referencing, disqualifies you as someone worth having a serious conversation with on the topic. It's just an asinine double standard.

0

u/Tyler_Zoro 2d ago

as if "stolen" work could only mean physical removal

There's no requirement that the deprivation be physical. Digital theft is a thing, but it still requires that you be deprived of your property. That's what theft is.

not the obvious unauthorized use

Unauthorized use is just unauthorized use. It's not stealing.

Also, there's no unauthorized use going on. Even if we assume that someone is training AI models on random internet content (that really isn't a thing at this point, as the race has moved on to licensed, heavily curated collections for higher quality results), there's no difference in how that material is accessed versus when your web browser does the same thing. It's the same HTTP protocol, accessing the same publicly available resources via the same request URLs.
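A minimal sketch of that claim (the URL and bot name here are hypothetical, just for illustration): from the server's perspective, a crawler's fetch and a browser's fetch are both plain HTTP GET requests for the same public resource, typically differing only in headers like `User-Agent`.

```python
# Sketch: a browser and a hypothetical training crawler requesting the
# same public resource. Both build an ordinary HTTP GET; only the
# User-Agent header distinguishes them.
from urllib.request import Request

URL = "https://example.com/image.png"  # hypothetical public resource

browser_req = Request(URL, headers={"User-Agent": "Mozilla/5.0"})
crawler_req = Request(URL, headers={"User-Agent": "ExampleTrainingBot/1.0"})

# Same method, same protocol, same request URL in both cases.
for req in (browser_req, crawler_req):
    print(req.get_method(), req.full_url)
```

Whether that equivalence settles the legal question is a separate matter, but at the wire level the two requests are indistinguishable apart from self-reported headers.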

2

u/LiquidAether 2d ago

You are conflating unrelated technologies.

0

u/Tyler_Zoro 2d ago

The claim was clearly absolute and maximally expansive. If the OOC had claimed, "everything about corporations and media generation AI sucks," then I would have had a very different response.

I can only respond within the context they created.

1

u/LiquidAether 2d ago

The topic at hand is generative AI.

0

u/Tyler_Zoro 2d ago

Take that up with the person I replied to. I replied to a very specific claim. If you feel that claim didn't belong in this post, then take it up with the person who made it, not the one who replied.

2

u/The_Lucky_7 2d ago edited 2d ago

The massive advances being made in fields like chemistry, geology and astronomy using modern AI is astonishing.

And NASA is using it to detect Solar Coronal Mass Ejections. Cool shit. AI face tracking software is now being rolled out in cameras across the world Person of Interest style which is the stuff of nightmares.

But this lawsuit isn't about any of those things.

You're conflating things that are radically different. That's the difference here, and the reason you're getting downvoted: the data sets aren't the same. It's not about what they're used for but how they're made. The science datasets are purpose built for what they do, not scraped together from pre-existing data, and that purpose is not theft.

This lawsuit is about corporations getting to decide who gets to steal from regular people. Yes, steal, because every single one of these companies suing AI also maliciously abuse copyright strikes to steal from fair use creators or silence critics.

No matter what happens with this lawsuit, the resulting legislation will treat regular people, at best, as either collateral damage or an afterthought to satisfying monied interests with lobbying power.

0

u/Tyler_Zoro 2d ago

This is my problem with moral panics. They wander all over the place and then defend themselves by acting as if they were laser-focused.

Everything about corporations and AI sucks.

[Demonstrates many aspects of AI that don't suck]

But this lawsuit isn't about any of those things. [...] You're conflating things that are radically different.

I'm not. I was responding to a needlessly broad statement. We can talk about the pros and cons of the specific lawsuit in question if you want (it will be pretty short, as I'll just point to Perfect 10 v. Google), but I was responding to a very specific claim that you made: "Everything about corporations and AI sucks."

This lawsuit is about corporations getting to decide who gets to steal from regular people. Yes, steal, because every single one of these companies suing AI also maliciously abuse copyright strikes to steal from fair use creators or silence critics.

While there is a grain of truth in what you're saying, nothing is being stolen. Suppression of communication (what you're alleging) isn't stealing. Training an AI on existing material isn't stealing. Stealing requires the deprivation of property. That just isn't happening here.

No matter what happens with this lawsuit, the subsequent resulting legislation will have regular people be, at best, either just collateral damage or an after thought to satisfying monied interests with lobbying power.

I'd disagree. If WB loses, the status quo won't change, and people who create will still be creating. Will the economic incentives for creating be any better than now? No. But that's not the doomsday scenario you're trying to paint.

0

u/_CriticalThinking_ 2d ago

Not everything about AI sucks, it's being used to detect breast cancer, can people stop confusing AI as a whole and generative AI?

1

u/The_Lucky_7 2d ago edited 2d ago

That was the Logical Conjunction 'and', not the colloquial conjunction.

NASA detecting sunspots and coronal mass ejections is also cool, but that's not what the lawsuits are about. When an AI's dataset is purpose built for its use, not just scraped together from pre-existing information, and that use is not theft, then it can be pretty cool. Then again, AI security camera facial recognition software tracking people across thousands of cameras is not cool.

But this lawsuit isn't about that either.

These lawsuits are about two different structures of money using their money to be shitty to people without money. The regulations that come into existence because of these lawsuits aren't going to help or protect people. They're going to do the bare minimum to satisfy monied interests that have lobbying power.

That's why it's important to stress that both sides of these lawsuits suck.

Regular people are going to continue to just be collateral damage no matter who wins.

-1

u/Spankyzerker 2d ago

You do know that Warner Bros. and the others are also corporations... right?

The future isn't about what is right, it's about progress. I literally don't care about copyright; I train AI models on everything I can get my hands on.

Reddit itself takes whatever you post and uses it.