A computer like that is useful outside of work. I’d pay for it out of pocket if I had to.
They were making toys before trying out video games.
Is one of these cats just large or is this a trick of perspective?
The only thing I got from this is that bro loves ads more than anything in the world.
Excellent.
Here is a video if you want some context.
It’s a very long story with Drake. We honestly don’t have time.
I accept that regulations are real, but not every way of helping people requires dealing with regulations. I'm still waiting on that proof, by the way.
There are more ways to help people than making medical software. Rather than considering that they could focus on simpler things, you jump straight to every project running afoul of FDA regulations, which is pretty telling. All while you still haven't named a single project halted by FDA order.
Which projects have been shut down by FDA order?
Open source AI is huge, and I don’t think you need FDA approval to distribute a model. Where are you even getting that from?
You’ve got the spirit.
What about open source projects?
Don’t use Drake memes.
This isn't only about research into AI; what some people want will impact all research, criticism, analysis, and archiving. Please re-read the letter.
You should read this letter by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries.
Why are scholars and librarians so invested in protecting the precedent that training AI LLMs on copyright-protected works is a transformative fair use? Rachael G. Samberg, Timothy Vollmer, and Samantha Teremi (of UC Berkeley Library) recently wrote that maintaining the continued treatment of training AI models as fair use is “essential to protecting research,” including non-generative, nonprofit educational research methodologies like text and data mining (TDM). If fair use rights were overridden and licenses restricted researchers to training AI on public domain works, scholars would be limited in the scope of inquiries that can be made using AI tools. Works in the public domain are not representative of the full scope of culture, and training AI on public domain works would omit studies of contemporary history, culture, and society from the scholarly record, as Authors Alliance and LCA described in a recent petition to the US Copyright Office. Hampering researchers’ ability to interrogate modern in-copyright materials through a licensing regime would mean that research is less relevant and useful to the concerns of the day.
Yeah, I really don’t get why this is news I have to keep hearing about.
Have you read this article by Cory Doctorow yet?
He blames Monster Hunter's availability on PC for the cheating, but Monster Hunter has always had tools and cheats. An absolute trash take.
I can’t tell if this is a joke or not.