If you were running an LLM locally on Android through llama.cpp for use as a private personal assistant, what model would you use?

Thanks for any recommendations in advance.

  • @net00@lemmy.today
    1
    1 day ago

    I don’t recommend it. I ran local AI on my phone before (iPhone, but same difference), and just asking it things makes it warm to the touch. The battery also takes a hit.

    It also messes up multitasking, since it uses up most of the memory, which kills background apps. Phones weren’t designed for this.

    The best way is to host it on an actual dedicated machine that can be accessed remotely.
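
    For illustration, a minimal sketch of what the client side of that could look like, assuming llama.cpp's llama-server is running on the home machine and is reachable over a VPN. The address, port, and model name below are placeholders, not a definitive setup:

    ```python
    # Ask a question of a llama.cpp "llama-server" instance running on a
    # remote home machine, reached over a VPN. Address and model name are
    # hypothetical placeholders.
    import requests

    SERVER = "http://100.64.0.2:8080"  # assumed VPN address of the home server

    def ask(prompt: str) -> str:
        resp = requests.post(
            f"{SERVER}/v1/chat/completions",  # llama-server's OpenAI-compatible endpoint
            json={
                "model": "local-model",  # llama-server answers with whatever model it was started with
                "messages": [{"role": "user", "content": prompt}],
                "temperature": 0.7,
            },
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        print(ask("Draft a short packing list for a weekend trip."))
    ```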

  • haui
    9
    3 days ago

    Running an LLM on a phone will absolutely destroy your battery life. It is also imperative that you understand that the comfort of AI is bought with the killing of innocents (by speeding up the climate catastrophe and exploiting the planet and the poorest people on it).

    I think using AI to experiment on a home server which already exists wouldn’t be problematic IN A VACUUM, but you would still be normalizing the use of a technology that is morally corrupt.

      • @JandroDelSol@lemmy.world
        3
        2 days ago

        I mean, ideally yes, everyone would go vegan, but that’s a far bigger lifestyle change than just continuing to not use AI like we’ve done for decades

      • haui
        -2
        2 days ago

        Oh yeah. So I can’t ask for one thing and not do another. Classic bad-faith argument. Good try.

        • @Fisch@discuss.tchncs.de
          7
          2 days ago

          I’m not even trying to argue against you, I’m arguing for veganism. The same arguments that you used for why the use of AI is bad can be used for why not being vegan is bad. The production of animal products even has a way bigger impact.

          • haui
            1
            2 days ago

            In that case I very much suggest a different approach.

            “If that is your take, you will love veganism.”

            Btw, I don’t eat animal products. That’s pretty recent, too.

            • haui
              6
              2 days ago

              It is not. Eating meat is actually supporting a torture and murder industry, and people who eat meat are actually eating pieces of corpses.

              Even when I still ate meat, I could understand that, morally, there is no difference from eating humans.

              • @N0x0n@lemmy.ml
                1
                2 days ago

                Not only that… The meat industry is poisoning the animals with antibiotics and GMO grain… they live in 1 m² of space per cow, in hell 24/7, feeling anger, hate and sadness, treated like dirt and in constant pain while being beaten by humans… It’s horrible!!

                And all of these things end up digested by our bodies… Ugh! It’s like a concentration camp, but for animals…

                • @selokichtli@lemmy.ml
                  0
                  2 days ago

                  I could see an argument against industry standards in meat production, but the base argument of veganism is flawed when none of the things you list are actually happening. And yes, veganism is frequently a cult in my experience. They tend to surround themselves with each other, reassure one another, and avoid discussing their mindset. As a matter of fact, I was put off from discussing this here, but since I randomly found out I’m being downvoted, I will engage.

            • @HiddenLayer555@lemmy.ml
              2
              edit-2
              2 days ago

              Literally any lifestyle can become a cult. Vegan cults exist because veganism is more popular than ever, so evil people exploit that to form cults. There are fitness cults too; does that mean everyone who works out is a cult member?

              • @selokichtli@lemmy.ml
                2
                2 days ago

                I assume you know what “frequently” means. Otherwise, do these lifestyles classify people in categorical terms, as vegans/meat-eaters or as morally right and wrong?

    • @nagaram@startrek.website
      1
      2 days ago

      I am a fan of LLMs and what they can do, and as such have a server specifically for running AI models. However, I’ve been reading “Atlas of AI” by Kate Crawford and you’re right. So much of the data that they’re trained on is inherently harmful or was taken without consent. Even in the more ethical data sets it’s probably not great considering the sheer quantity of data needed to make even a simple LLM.

      I still like using it for simple code generation (this is just a hobby to me, so vibe coding isn’t a problem in my scenario) and corporate tone policing. And I tell people nonstop that it’s worthless outside of these use cases, and maybe as a search engine, but I recommend Wikipedia as a better starting point almost every time.

  • Smee
    6
    2 days ago

    It very much depends on your phone hardware: RAM affects how big the models can be, and the CPU affects how fast you’ll get replies. I’ve successfully run 4B models on my 8 GB RAM phone, but since it’s the usual server-and-client setup, which needs full internet access due to the lack of granular permissions on Android (even all-in-one setups need open ports to connect to themselves), I prefer a proper home server, which, with a cheap graphics card, is indescribably faster and more capable.
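
    As rough arithmetic for why a 4B model is about the ceiling on an 8 GB phone (the 4-bit quantization and the overhead factor below are assumptions, not measurements):

    ```python
    # Back-of-envelope RAM estimate for a quantized model, to illustrate why
    # a 4B model fits next to Android on an 8 GB phone while larger ones don't.
    def model_ram_gb(params_billion: float, bits_per_weight: float, overhead: float = 1.2) -> float:
        weight_bytes = params_billion * 1e9 * bits_per_weight / 8
        return weight_bytes * overhead / 1e9  # overhead ~ KV cache and runtime buffers (assumed)

    for params in (4, 7, 13):
        print(f"{params}B at 4-bit: ~{model_ram_gb(params, 4):.1f} GB")
    # 4B at 4-bit: ~2.4 GB   -> leaves room for the OS and other apps
    # 7B at 4-bit: ~4.2 GB   -> tight once Android itself is counted
    # 13B at 4-bit: ~7.8 GB  -> effectively does not fit in 8 GB
    ```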

    • @nagaram@startrek.website
      2
      2 days ago

      I was honestly impressed with the speed and accuracy I was getting with DeepSeek, Llama, and Gemma on my 1660 Ti.

      It was $100 used, and responses came back in seconds.

  • Autonomous User
    2
    edit-2
    2 days ago

    maid + VPN to Ollama on your own computer.

    Use an Onion service with client authorisation to avoid needing a domain or static IP.
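
    A rough sketch of what the client side of that could look like, assuming Ollama is exposed as an onion service on its default port, Tor's SOCKS proxy is listening locally, and the client-authorisation key is already configured in the Tor client. The onion address and model name are placeholders:

    ```python
    # Query an Ollama instance published as a Tor onion service, routed
    # through the local Tor SOCKS proxy. Needs `pip install requests[socks]`.
    # Onion address and model name are hypothetical placeholders.
    import requests

    ONION = "http://exampleonionaddressxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.onion:11434"
    PROXIES = {"http": "socks5h://127.0.0.1:9050", "https": "socks5h://127.0.0.1:9050"}

    def ask(prompt: str) -> str:
        resp = requests.post(
            f"{ONION}/api/generate",  # Ollama's generate endpoint
            json={"model": "llama3.2", "prompt": prompt, "stream": False},
            proxies=PROXIES,
            timeout=300,
        )
        resp.raise_for_status()
        return resp.json()["response"]

    if __name__ == "__main__":
        print(ask("Summarise today's notes into three bullet points."))
    ```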

  • @absurdity_of_it_all@lemmy.ml
    -5
    3 days ago

    You want to run it on the phone itself? I don’t think any phone would be good enough for that. The issue with AI assistants is not just privacy; it’s also the resource consumption (and of course the stolen content). It’s so high that only these big companies with huge server farms can do it.

    If you just want a voice assistant for simple commands, I’ve heard of an open source local assistant called Dicio. But I don’t think you can talk to it like ChatGPT or something.

    • Smee
      6
      2 days ago

      I’ve successfully run small-scale LLMs on my phone; slow, but very doable. I run my main AI system on an older, midrange gaming PC. No problems at all.

      Dicio is a pre-programmed assistant which one can talk to if one has speech recognition software installed. It has a preset list of tasks it can do; in my experience it’s not really comparable to how LLMs work.

  • @Tinkerer@lemmy.ca
    -6
    2 days ago

    I have PocketPal set up on my Pixel with GrapheneOS and it’s pretty awesome. I agree that AI is inherently bad considering the environmental impact and the amount of data that’s taken illegally to train it.

    That being said, PocketPal is open source and is great: https://github.com/a-ghorbani/pocketpal-ai