Portable LLMs with llamafile

Posted May 15, 2024 9:59 UTC (Wed) by snajpa (subscriber, #73467)
In reply to: Portable LLMs with llamafile by snajpa
Parent article: Portable LLMs with llamafile

Btw, regarding the single exec file: this has been a done thing for a while now. There's ollama, which also packs it all into one binary (and if it doesn't support all three platforms yet, that's certainly their goal), but unlike _this_ llama.cpp fork, that project has real added value. It really makes running LLMs easy for people; it abstracts away llama.cpp's rough edges to present a smooth workflow to people who have never touched any of this stuff. It's also original code, which *includes* llama.cpp, rather than just "rebranding" it.
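
As a rough illustration of the workflow described above, here is a minimal Python sketch that asks a locally running ollama server for a completion over its default HTTP API (http://localhost:11434/api/generate). It assumes ollama is already running and that a model has been pulled beforehand (for example with "ollama pull llama3"); the model name here is only an example, not taken from the comment.

    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434/api/generate"  # ollama's default local endpoint

    def generate(prompt: str, model: str = "llama3") -> str:
        """Request a single, non-streaming completion from a local ollama server."""
        payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
        req = urllib.request.Request(
            OLLAMA_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(generate("Summarize what llama.cpp does in one sentence."))

The point is less the API itself than that the server, model management, and llama.cpp build details are all hidden behind one local service.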



Portable LLMs with llamafile

Posted May 15, 2024 14:47 UTC (Wed) by daroc (editor, #160859)

I apologize if I gave the impression in my article that llamafile is only a rebranded llama.cpp — llamafile has a bunch of additional code (under a different license, even) that wraps llama.cpp. See this part of the source. The project contains both a copy of llama.cpp, from which it sends patches upstream, and support code that is used to produce the final binaries.


Copyright © 2024, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds