Being Difficult
fossilesque@mander.xyz to Science Memes@mander.xyz · English · 2 days ago
ptu@sopuli.xyz · 14 hours ago
It’s called the heuristic method, and those using it know its limitations. An LLM, by contrast, will just confidently put out garbage and claim it’s true.
ranzispa@mander.xyz · 14 hours ago
Scientific calculations, and other approaches as well, put out garbage all the time; that is the main point of what I said above. Some limitations are known, just as it is known that LLMs have the limitation of hallucinating.
ptu@sopuli.xyz · 11 hours ago
My critique wasn’t about the outcome of the results but about how they were achieved. Hallucinating LLMs make computers commit “human errors”, which makes them less deterministic, and determinism is the key reason I prefer doing some things on a computer.