☆ Yσɠƚԋσʂ ☆@lemmygrad.ml to technology@hexbear.net · English · 3 months ago
GPT-5: Overdue, overhyped and underwhelming. And that's not the worst of it. (garymarcus.substack.com)
cross-posted to: technology@lemmy.ml
☆ Yσɠƚԋσʂ ☆@lemmygrad.ml (OP) · 3 months ago

We can look at examples of video-generating models. I'd argue they have to have a meaningful and persistent internal representation of the world. Consider something like Genie as an example: https://deepmind.google/discover/blog/genie-3-a-new-frontier-for-world-models/

It doesn't have volition, but it does have intelligence in the domain of creating consistent simulations. So it does seem like you can get domain-specific intelligence through reinforcement training.