The company’s shoddy opsec doesn’t directly equate to the model’s capabilities. I am not one to believe anyone’s hype, but I am also not one to believe the AI anti-hype that goes on throughout Lemmy. A year ago, according to Lemmy, LLMs could never produce working code at scale. Six months ago, according to Lemmy, LLMs could never produce working code that was secure enough to use in production. Now, Lemmy believes LLMs can’t be disruptive to cybersecurity as a whole.
In six months, I wonder what Lemmy will claim LLMs aren’t capable of.
Yeah, this is very black-and-white thinking. Just because something sucks in some ways doesn’t make it wholly incapable of other things.
BUT THESE ARE THEFT BOTS !!!111!!!111 THeY aRe thE ReASon NOboDy waNtS tO pAy FoR mY FuRry POrN ART !1!!!1!11!