👽 bengunn

Llamafile for testing LLMs locally

1. Download llava-v1.5-7b-q4-server.llamafile (3.97 GB).

2. Open your computer's terminal.

3. If you're using macOS, Linux, or BSD, you'll need to grant permission for your computer to execute this new file: chmod +x llava-v1.5-7b-q4-server.llamafile

4. If you're on Windows, rename the file by adding ".exe" on the end.

5. Run the llamafile, e.g. ./llava-v1.5-7b-q4-server.llamafile

6. Your browser should open automatically and display a chat interface. (If it doesn't, just open your browser and point it at http://localhost:8080.) A combined terminal session for steps 3-6 is sketched below.
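A minimal sketch of the whole session, assuming the llamafile from step 1 was downloaded into the current directory (adjust the path if you saved it elsewhere):

```
# macOS / Linux / BSD: mark the downloaded file as executable, then run it
chmod +x llava-v1.5-7b-q4-server.llamafile
./llava-v1.5-7b-q4-server.llamafile

# Windows: give the file an .exe extension instead, then run it
ren llava-v1.5-7b-q4-server.llamafile llava-v1.5-7b-q4-server.llamafile.exe
.\llava-v1.5-7b-q4-server.llamafile.exe

# In both cases the built-in server listens on http://localhost:8080
```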

Source: https://github.com/Mozilla-Ocho/llamafile

pin: victorhck (gemini://station.martinrue.com/victorhck)


