shaman1093@lemmy.ml to Technology@lemmy.world • ChatGPT has meltdown and starts sending alarming messages to users
The person who commented below kinda has a point. While I agree that there's nothing special about LLMs, an argument can be made that consciousness (or maybe more ego?) is itself an emergent mechanism that works to keep itself in predictable patterns to perpetuate survival.
Point being that the ability to predict outcomes is a cornerstone of intelligence as we currently understand it (socially, emotionally, and scientifically speaking).
If you were to say that LLMs are unintelligent because they operate to produce the most likely, and therefore most predictable, outcome, then I'd agree completely.
^this - why is it so hard to implement sigh