I read his book a few years ago, and he was pretty bullish on risky investments, so…
First Diablo 4 and now this… Horrible company.
Lots of hardware lies about its useful capabilities.
Can you run 4k? Of course. But can you run more than 4 frames a second?
Yeah, and the same thing would apply if, e.g., PII or HIPAA-related data ended up in a trained model. The fact that some PII or health data ended up publicly available doesn’t automatically mean you can process, store, or train on that data.
If you do stuff, earn from it, and ignore other parties and their rights, you can be forced to compensate them. I guess it will be peanuts though.
The AI companies have shown that they are incapable of regulating themselves on this topic, so people with art at stake should force their hand.
Open source or not doesn’t matter here; what matters is the copyright. If even Disney can defend the works it owns (whatever its ethics), anyone else should be able to as well.
That’s exactly what’s at stake, waiting to be sufficiently litigated. I hope that creators will win, and that they will be able to decide whether the richest big tech companies in the world get to train on their creations.
It is missing one point: as a creator, I want to be able to forbid you from training on my creations. And the only tool that could enable that is copyright enforcement over AI training.
This is not an unusual comment section on Phoronix, to put it mildly.