Blade Runner director Ridley Scott calls AI a “technical hydrogen bomb” | “we are all completely f**ked”
In an interview with Rolling Stone, Scott, who has directed several movies featuring AI, was asked if the technology worried him. He said he’s always believed the...
Several departments where I work had massive layoffs in favour of implementing customized versions of GPT-4 chatbots (both client-facing services and internal stuff). That’s just the LLM end of AI.
That’s not even considering the generative image side of AI. I fear for my company’s graphics, web design, and UX/UI teams, who will probably be gone this time next year.
I work freelance but occasionally needed to partner with artists and other people. But I now use various “AI” tools and no longer need to pay people to do the work, as the computer can do it well enough.
I’m not some millionaire, I’m just a guy trying to save money to buy a house one day, so it’s not like a large economic impact, but I can’t be the only one.
We’re a long way out from that fortunately.
Not saying that some jobs won’t be cut/lost, but the companies doing that were likely looking for reasons to downsize.
AI models do not replace competent UI/UX. That’s just not what they’re designed to do. Very different functions.
Even though you are technically correct, you assume people who are in charge of making decisions have the same insight and knowledge you do about the current limitations of gen ai.
I absolutely assure you that senior managers think it is fully matured since it gives convincing answers and they have made permanent and expensive decisions based off of this viewpoint. To them, it fully replaces UX/UI and developers. So they have made cuts. We’re currently sourcing some offshore help to fix our customer service chatbot which keeps giving off-topic advice to users 🤪
Oh, 100 percent right you are. Definitely not saying clueless corporate idiot bosses aren’t going to try and replace their workforce with AI.
But I am saying that it won’t work for them after they do that. They’re going to crash and burn, and they’ll have lost that talent and expertise within their company, so there’s no replacing it except slowly over time.
From personal experience I think they’ll keep doubling down and when that doesn’t prove successful, lobby governments to make changes or ask for bailouts.
My company (along with a whole onslaught of other similar orgs) successfully lobbied local politicians who convinced the mayor to pass a major bylaw that changed zoning rules and effectively killed remote work in my area.
It’s depressing how right you probably are about how companies are going to cope with this.
Reminds me of that quote: “If Conservatives become convinced that they cannot win democratically, they will not abandon conservatism. They will reject Democracy.”
But, like, apply that to Capitalism and Capitalists rejecting Capitalism in favor of Socialism for them.
UX is not about drawing pictures. That work is already automated by UI kits anyway. UX is about thinking through requirements and research.
I know very well what UX is, having studied it as my major in uni. Senior executives do not know what it is, and they have made and are making decisions to “replace” them with LLMs and “prompt engineers”. I see it daily at work.
There is a great disconnect where hiring managers and executives see LLMs as a quick win that will cut costs and make moves to cut costs without doing any analysis.
Mm, I’ve already seen marketers present outputs from GPT models as if they were useful customer feedback. My suspicion is this bubble will burst though, because at some point it will become clear that these models are not as good at what they do as execs have been told.
Perhaps, but the egos on “decision makers” are so large that I see them doubling down until the end.
If shareholders’ profits are affected, then the decisions will be too lol
At the end of the day they’re still TPS reports. I’m afraid the only bubble that’s gonna burst is yours.
Suits are idiots. No argument there.
I can tell you now that AI won’t come for UX/UI teams, at least not in the near future. Clients rarely are able to really articulate what they need out of software and until AI is smart enough to suss that out, we’re good. That being said, I’m sure there will be companies that try to go that route but I doubt it will work, again, in the near term.
I’m not saying that AI will properly come for UX/UI teams.
It already is. AI is, as you said, not smart enough to even replace UX/UI teams, but managers and executives and C-suite individuals don’t understand that. AI has been sold to them as a quick win that lowers costs. To give you an example, 3 members of our CX team were replaced by an annual license to Enterprise GPT-4 and some custom training for business stuff. In the last 2 months so much has broken down or hasn’t worked well, clients have complained, and now we are subcontracting a Bangalore firm to try and fix it. Pretty sure we’ve exceeded those 3 people’s salary costs by now.