The reality is that most software is complex but trivial: it’s a bunch of requirements, and there’s no magic behind it. An AI that can turn a written requirements document into a decently running program will replace tons of developers.
And since a future AI that’s actually trained to do software won’t have a problem juggling 300 requirements at once (the way humans do), it’s relatively easy to trust the result.
I don’t want to hype AI, but you’re basically comparing a high-school-graduate AI (lots of general knowledge, no specialization) with a perfect senior dev. That’s not really a fair comparison.
As soon as an AI works better than the average developer in a given area, it will replace them. Simple as that.
Of course it will make errors, but the question is: are the extra errors, compared to a human, worth the savings?
Just a quick example: let’s say you’d need 10 devs at 100k a year each, and they produce errors worth 200k a year. That means costs of 1.2 million a year.
If an AI costs 100k a year in licenses, replaces 5 of those devs, and only adds, say, 200k a year in errors, you’re still only at 1 million a year.
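A minimal sketch of that back-of-envelope math, using the illustrative figures above. One assumption is mine, not from the original: the remaining five devs still account for the full 200k in human error costs, which is what makes the total come out to 1 million.

```python
# Illustrative cost comparison using the numbers from the comment above.
# Assumption (not from the original): the remaining 5 devs still produce
# the full 200k in human error costs per year.

dev_salary = 100_000          # per developer, per year
human_error_cost = 200_000    # error cost of the human team, per year

# Scenario 1: 10 human devs, no AI
humans_only = 10 * dev_salary + human_error_cost
# = 1,000,000 + 200,000 = 1,200,000

# Scenario 2: AI licenses replace 5 devs but add their own errors
ai_license_cost = 100_000     # per year
ai_extra_errors = 200_000     # additional error cost introduced by the AI

with_ai = 5 * dev_salary + ai_license_cost + human_error_cost + ai_extra_errors
# = 500,000 + 100,000 + 200,000 + 200,000 = 1,000,000

print(f"Humans only: {humans_only:,}")   # 1,200,000
print(f"With AI:     {with_ai:,}")       # 1,000,000
```

With these numbers the comparison only flips once the AI’s extra error cost climbs past roughly 400k a year, i.e. the 500k saved in salaries minus the 100k license.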
But those people don’t need to be programmers.
And since a future AI that’s actually trained to do software won’t have a problem juggling 300 requirements at once (the way humans do), it’s relatively easy to trust the result.
… just as easy as taking responsibility for it if it fails?
Do human programmers not fail?