Pro to Technology@lemmy.world · English · 20 days ago
Google quietly released an app that lets you download and run AI models locally (github.com)
Greg Clarke · English · 20 days ago
Has this actually been done? If so, I assume it would only be able to use the CPU
@Euphoma@lemmy.ml · English · 19 days ago
Yeah, I have it in Termux. Ollama is in the package repos for Termux. The speed it generates at does feel like CPU speed, but idk
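For anyone wanting to reproduce the setup the comment describes, the Termux route looks roughly like the following. This is a sketch, not a verified recipe: the exact package name in the Termux repos and the model tag are assumptions, and model choice depends on how much RAM the phone has.

```shell
# Update Termux package lists, then install Ollama
# (assumes the package is named "ollama" in the Termux repos)
pkg update
pkg install ollama

# Start the Ollama server in the background
ollama serve &

# Pull and run a small model; "llama3.2:1b" is an example tag --
# pick something small enough to fit in phone memory
ollama pull llama3.2:1b
ollama run llama3.2:1b "Hello from Termux"
```

Since Termux's Ollama build has no GPU backend to hand off to, inference runs on the CPU, which matches the generation speed the commenter observed.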