Just a guy shilling for gun ownership, tech privacy, and trans rights.

I’m open for chats on Mastodon: https://hachyderm.io/

my blog: thinkstoomuch.net

My email: nags@thinkstoomuch.net

Always looking for penpals!

  • 2 Posts
  • 8 Comments
Joined 1 year ago
Cake day: December 21st, 2023


  • For simple productivity like Copilot, or text generation like ChatGPT:

    It absolutely is doable on a local GPU.

    Source: I do it.

    Sure, I can’t do auto-running simulations to find new drugs, or protein sequencing, or whatever. But it helps me code. It helps me digest software manuals. That’s honestly all I want.

    Also, massive compute projects like the @home projects are good?

    Local LLMs run fine on a five-year-old GPU, a 3060 12 gig. I’m getting performance on par with cloud-run models. I’m upgrading to a 5060 Ti just because I want to play with image gen.
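The claim that a 12 GB card handles local LLMs checks out with a common rule of thumb: VRAM needed is roughly bytes-per-parameter at your quantization level, plus some overhead for the KV cache and activations. A minimal sketch (the 20% overhead figure and byte counts are assumptions, not measurements):

```python
# Rough VRAM estimate for a local LLM, assuming bytes-per-parameter
# at a given quantization plus ~20% overhead (KV cache, activations).
def fits_in_vram(params_billion, bytes_per_param, vram_gb, overhead=1.2):
    """Return (estimated GB needed, whether it fits in the given VRAM)."""
    needed_gb = params_billion * bytes_per_param * overhead
    return round(needed_gb, 1), needed_gb <= vram_gb

# A 7B model at 4-bit quantization (~0.5 bytes/param) on a 12 GB card:
print(fits_in_vram(7, 0.5, 12))    # -> (4.2, True), fits comfortably
# A 13B model at 8-bit (~1 byte/param) would not:
print(fits_in_vram(13, 1.0, 12))   # -> (15.6, False)
```

This is why 4-bit quantized 7B–13B models are the usual fit for a 3060-class card, while full-precision weights are not.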




  • nagaram@startrek.website to Lefty Memes@lemmy.dbzer0.com: Open Source washing

    Which is funny since that does solve a lot of the problems.

    If it’s completely open source at least.

    Like, open-source data sets and a model that can be run locally mean it’s not trained on stolen data and it isn’t spying on people for more data.

    And if it runs locally on a GPU, it’s no worse for the environment than gaming. Really, the big problem with data-center compute is the infrastructure for moving all that data around.
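The gaming comparison can be put in back-of-the-envelope numbers. Assuming a ~200 W GPU and ~30 seconds of generation per answer (both figures are illustrative assumptions, not measurements):

```python
# Hedged back-of-the-envelope: energy for one local LLM answer vs an hour
# of gaming on the same (assumed ~200 W) GPU.
GPU_WATTS = 200

def wh(watts, seconds):
    """Watt-hours consumed at a given power draw for a given duration."""
    return watts * seconds / 3600

per_answer = wh(GPU_WATTS, 30)         # one ~30-second generation
per_gaming_hour = wh(GPU_WATTS, 3600)  # one hour of gaming

print(per_answer)        # ~1.7 Wh per answer
print(per_gaming_hour)   # 200 Wh, i.e. roughly 120 answers' worth
```

Under these assumptions, an evening of gaming costs about as much GPU energy as a hundred-plus local LLM responses, which is the sense in which local inference is "no worse than gaming."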





  • I’m generally a fan of LLMs for work, but only if you’re already an expert, or at least well versed, in whatever you’re doing with the model, because it isn’t trustworthy.

    If you’re using a model to code you better already know how that language works and how to debug it because the AI will just lie.

    If you need it to make an SOP then you better already have an idea for what that operation looks like because it will just lie.

    It speeds up the work process by instantly doing the tedious parts of jobs, but it’s worthless if you can’t verify the accuracy. And I’m worried people don’t care about the accuracy.