There are lots of articles about harmful use cases of ChatGPT that Google has already enabled for decades.
Want to get bad medical advice for the weird pain in your belly? Google can tell you it’s cancer, no problem.
Do you want to know how to make drugs without a lab? Google even gives you links to stores where you can buy the materials for it.
Want some racism/misogyny/other evil content? Google is your ever helpful friend and garbage dump.
What’s the difference apart from ChatGPT’s inability to link to existing sources?
Yeah, like that one commenter in this thread who uses an LLM in Python to perform trivial tasks. They write a function whose body is just an LLM prompt, and at runtime the LLM is asked to produce the result. Python was apparently not inefficient enough.
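For anyone who hasn't seen the pattern: it roughly works like this. This is a minimal sketch with hypothetical names (`ai_fn`, `llm_call` are made up here, and `llm_call` is stubbed with a canned reply instead of a real API round trip); the point is that the function body never executes, its signature and docstring just become a prompt.

```python
import functools
import inspect

def llm_call(prompt: str) -> str:
    # Stand-in for a real LLM API call; returns a canned reply for illustration.
    return "4"

def ai_fn(func):
    """Decorator: replace the function body with an LLM round trip."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        sig = inspect.signature(func)
        bound = sig.bind(*args, **kwargs)
        # Build the prompt from the signature, docstring, and actual arguments.
        prompt = (
            f"Act as the function {func.__name__}{sig}.\n"
            f"Docstring: {func.__doc__}\n"
            f"Arguments: {dict(bound.arguments)}\n"
            "Return only the result."
        )
        return llm_call(prompt)  # every call is a network round trip to the model
    return wrapper

@ai_fn
def add(a: int, b: int) -> int:
    """Return the sum of a and b."""

print(add(2, 2))  # prints "4" -- but only because the stub says so
```

Note what you pay for a one-line addition: prompt construction, a network call, model latency, and an answer that comes back as a string you merely hope is correct.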
I don’t know; if I need a trivial function, I just write it myself. Then I know it works and runs in a reasonable time.
I mean, that use case is definitely cool, but there is no way this should ever be used in production code.
It’s kinda like using dynamite as a novel heating system.