I built a prototype of a “network of ML networks” - an internet-like protocol for federated learning where nodes can discover, join, and migrate between different learning groups based on performance metrics (repo: https://github.com/bluebbberry/MyceliumNetServer). It’s built on Flower AI.
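
To give a rough idea of the group-migration part, here is a minimal sketch. The names, metric, and threshold are illustrative only, not the actual MyceliumNetServer API:

```python
from dataclasses import dataclass

@dataclass
class Group:
    """A learning group a node has discovered: a name plus a published performance metric."""
    name: str
    avg_accuracy: float

@dataclass
class Node:
    """A participant that migrates to a better-performing group when it finds one."""
    group: Group
    margin: float = 0.05  # how much better another group must be before we switch

    def maybe_migrate(self, discovered: list[Group]) -> None:
        # Pick the best-performing group we know about; stay put unless it clearly beats ours.
        best = max(discovered, key=lambda g: g.avg_accuracy, default=self.group)
        if best.avg_accuracy > self.group.avg_accuracy + self.margin:
            print(f"migrating from {self.group.name} to {best.name}")
            self.group = best
```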

What do you think of this? You could build a Napster/BitTorrent-like app on top of it to collaboratively train and share arbitrary machine learning models with other people while keeping data private, only sharing gradients instead of whole models to save bandwidth. Would this be a good counterweight to big AI companies, or would it actually make things worse?
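
To make the “share updates, not data” part concrete, here is a toy sketch of the aggregation step (plain NumPy, not the repo code): each peer trains locally and sends only its weight delta, and the group averages the deltas.

```python
import numpy as np

def local_delta(global_w: np.ndarray, local_w: np.ndarray) -> np.ndarray:
    """What a peer shares: only its weight update, never its training data."""
    return local_w - global_w

def aggregate(global_w: np.ndarray, deltas: list[np.ndarray]) -> np.ndarray:
    """FedAvg-style step: apply the mean of all peers' updates to the shared model."""
    return global_w + np.mean(deltas, axis=0)

# Toy round with three peers, each contributing a small local update
w = np.zeros(4)
updates = [local_delta(w, w + np.random.randn(4) * 0.1) for _ in range(3)]
w = aggregate(w, updates)
```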

Would love to hear your opinion ;)

  • iii@mander.xyz · 1 day ago

    Can you help me understand the use case?

    Is it intended to be a variant of Flower AI that can be used in an adversarial environment?

  • Ebby@lemmy.ssba.com · 1 day ago

    Since downloading copyrighted material is apparently legal for training, I’d be glad to help train a privacy-respecting distributed LLM. 😉

    /s of course. But you have a very interesting idea!

    • squaresinger@lemmy.world · 20 hours ago

      Btw, would it be legal to use a torrent client that uses an LLM to make up the outgoing packets so that you aren’t sending copyrighted material? ;)

  • General_Effort@lemmy.world · 1 day ago

    What would a use case look like?

    I assume that the latency will make it impractical to train something that’s LLM-sized. But even for something small, wouldn’t a data center be more efficient?