This is like the definition of a “conservative”. Progress shouldn’t happen because they’re not ready for it. They are comfortable with what they use and are upset that other people are moving ahead with new things. New things shouldn’t be allowed.
Most games have the ability to downscale so that people like this can still play. We don’t stop all progress just because some people aren’t comfortable with it. You learn to adjust or catch up.
More “conservative” in terms of preserving the planet’s resources.
You don’t need gigabytes of RAM for almost any consumer application, as long as the programming team is interested in and incentivized to write quality software.
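To make that concrete, here’s a rough, hypothetical sketch (the file name and function names are made up, not from any real codebase) of how the same task can use wildly different amounts of RAM depending on how it’s written:

```python
# Hypothetical sketch: two ways to count error lines in a large log file.
# "app.log" and the function names are invented; the point is the memory behaviour.

def count_errors_eagerly(path: str) -> int:
    # Reads the whole file into memory at once: RAM use grows with file size.
    with open(path, encoding="utf-8", errors="replace") as f:
        lines = f.readlines()
    return sum(1 for line in lines if "ERROR" in line)

def count_errors_streaming(path: str) -> int:
    # Iterates one line at a time: RAM use stays roughly constant,
    # no matter how large the file is.
    with open(path, encoding="utf-8", errors="replace") as f:
        return sum(1 for line in f if "ERROR" in line)

if __name__ == "__main__":
    print(count_errors_streaming("app.log"))  # placeholder path
```

Same output either way; the only difference is whether anyone bothered to care about the working set.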
It’s not really about comfort when you buy software and it doesn’t work unless you also buy an $800 hardware upgrade. Especially when it worked fine on the previous version and the only difference is the addition of extraneous features.
I think the examples given are just poorly chosen. When it comes to regular applications and DRM, then yes, that’s ridiculous.
On the other hand, when it comes to gaming, by all means give me all the ray tracing and visible pores on NPCs. Most modern games also scale down well enough that it’s not a problem to have those features.
It’s conservationist, reducing hardware requirements to lengthen the lifetime of old hardware.
Less on the general software side and more on the gaming side: why target the iGPU, then? Although it’s common, even something near a decade old would be an instant uplift gaming-performance-wise. The ones that typically run into performance problems are mostly laptop users, and that’s the segment most wasteful with old hardware, since unless you own a laptop like a Framework, the user constantly replaces the entire device.
I, for one, am always behind lengthening the lifetime of old hardware (hell, I just replaced a decade-old laptop recently), but there’s a limit to the expectations one can have. E.g., don’t expect to be catered to, iGPU-wise, if you willingly picked a pre-Tiger Lake iGPU. The user intentionally picked the worse graphics hardware, and catering the market to bad decisions is a bad move.
I, for one, hate the way PC gamer culture has normalized hardware obsolescence. Your hobby is just for fun; you don’t really need to gobble up so much power, rare-earth minerals, and ever-thinner wafers just to throw away parts every six months.
I have plenty of fun playing ASCII roguelikes, and I do not respect high-performance gaming. It’s a conservationist nightmare.
Who’s throwing away stuff every six months? Hardware cycles aren’t even remotely that short; hell, Moore’s law was never that short in the entire existence of said law. And it’s not like I don’t do my fair share of preventing hardware waste (my literal job is refurbishing and reselling computer hardware; I’m legitimately doing more than the average person to keep older hardware going, several times over). But it’s not my job to dictate what is fun and what’s not. What’s fun for you isn’t exactly everyone else’s definition of fun.
Fuckhuge trucks that roll coal are fun for some people too, but fuck em.
If they can downscale enough, they should be able to pass this test.
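Roughly what that downscaling can look like under the hood, as a hypothetical sketch (the thresholds and preset names are invented, not from any real engine):

```python
# Hypothetical sketch: pick a quality preset from detected hardware so the same
# game still runs on older GPUs instead of refusing to start.
# All thresholds and preset names here are made up for illustration.

def pick_preset(vram_gb: float, supports_raytracing: bool) -> dict:
    if vram_gb >= 12 and supports_raytracing:
        return {"preset": "ultra", "render_scale": 1.0, "raytracing": True}
    if vram_gb >= 6:
        return {"preset": "high", "render_scale": 1.0, "raytracing": False}
    if vram_gb >= 3:
        return {"preset": "medium", "render_scale": 0.75, "raytracing": False}
    # Old iGPU territory: drop render resolution instead of locking people out.
    return {"preset": "low", "render_scale": 0.5, "raytracing": False}

print(pick_preset(vram_gb=2.0, supports_raytracing=False))
```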
Honestly, we are hitting the budgetary limits of what game graphics can do, for example.
A lot of new games look substantially worse than The Last of Us Part II, which ran on ancient hardware.
One could point to the inclusive or environmental aspect of this approach.
It’s the opposite. Limitations foster creativity. Those old computers and game consoles could do amazing things when people wanted to do something. Now you don’t have to think about what you’re doing; you just expect the user to have high-end equipment and a super-high-speed Internet connection. It’s the equivalent of saying you need a trophy truck to go over the road you just built because it’s too shitty for a regular car to drive on.
“Limitations foster creativity.”
100% agree. But there’s no reason to limit innovation because some people can’t take advantage of it. Likewise, we shouldn’t force people to constantly upgrade just to have access to something; there has to be a limit, though. Twenty years of tech changes is huge. You could get 2 GB of RAM in most home computers back in the early-to-mid 2000s… that’s two decades ago.
I’m still gaming quite comfortably on the desktop I built 10 years ago.
Somebody didn’t live through the “Morrowind on Xbox” era, where “creativity” meant intentionally freezing the loading screen and rebooting your system in order to save a few KB of RAM so the cell would load.
But it also had no automatic corpse cleanup, so the game would eventually become unplayable: entities died outside of your playable area where you couldn’t remove them from the game, creating huge bloat in your save file.
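For illustration, here’s a rough sketch of the kind of periodic cleanup pass whose absence caused that bloat (entirely hypothetical; not how Morrowind actually stores anything, and all names are invented):

```python
# Hypothetical sketch of a periodic corpse-cleanup pass: drop entities that have
# been dead for a while and sit outside the player's active cells, so dead
# entities don't accumulate forever in the save file.

from dataclasses import dataclass

@dataclass
class Entity:
    entity_id: int
    is_dead: bool
    days_since_death: int
    in_active_cell: bool  # is the entity in a cell the player currently has loaded?

def cleanup_corpses(world: list[Entity], max_corpse_age_days: int = 3) -> list[Entity]:
    """Return the world state with stale, unreachable corpses removed."""
    return [
        e for e in world
        if not (e.is_dead
                and e.days_since_death > max_corpse_age_days
                and not e.in_active_cell)
    ]

# Example: run the pass each in-game day before writing the save file.
world = [
    Entity(1, is_dead=False, days_since_death=0, in_active_cell=True),
    Entity(2, is_dead=True, days_since_death=10, in_active_cell=False),
]
world = cleanup_corpses(world)  # entity 2 is dropped, keeping the save small
```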
Not all creativity is good creativity.
The topic is bloatware, not games. Very different. When it comes to gaming, the hardware costs are a given (for the sake of innovation, as you put it); but when it comes to something fundamental to your computer—think of the window manager or even the operating system itself—bloat is like poison in the hardware’s veins. It is not innovation. It is simply a waste of precious resources.
The original post includes two gaming examples, so it’s actually about both, which is a bit unfortunate, because as you’ve said, they’re two very different things.
I suppose ray-tracing is rather suggestive of games, you’re right. Well, I’ll take it as an accident by the author and rest easy. Thanks for the correction!