The thing is - and what’s also annoying me about the article - AI experts and computational linguists know this. It’s just the laypeople who end up using (or promoting) these tools now that they’re public who don’t know what they’re talking about and project intelligence onto AI that isn’t there. The real hallucination problem isn’t with deep learning, it’s with the users.
Spot on. I work on AI and just tell people, “Don’t worry, we’re not anywhere close to Terminator or Skynet or anything remotely like that yet.” I don’t know anyone I work with who wouldn’t roll their eyes at most of these “articles” you’re talking about. It’s frustrating reading some of that crap lol.
The article really isn’t about the hallucinations, though. It’s about the impact of AI; it’s in the second half of the article.
I read the article, yes.