
👽 bengunn

Llamafile for testing LLMs locally

1. Download llava-v1.5-7b-q4-server.llamafile (3.97 GB).

2. Open your computer's terminal.

3. If you're using macOS, Linux, or BSD, you'll need to grant permission for your computer to execute this new file: chmod +x llava-v1.5-7b-q4-server.llamafile

4. If you're on Windows, rename the file by adding ".exe" on the end.

5. Run the llamafile, e.g.: ./llava-v1.5-7b-q4-server.llamafile (see the combined command sketch after this list).

6. Your browser should open automatically and display a chat interface. (If it doesn't, just open your browser and point it at http://localhost:8080.)
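As a quick reference, here is the whole flow on macOS or Linux as one shell session. This is a minimal sketch: the download URL is a placeholder, so substitute the actual link from the llamafile repository.

```
# Download the llamafile (placeholder URL: use the real link from the repo)
curl -L -o llava-v1.5-7b-q4-server.llamafile "<download-url>"

# macOS/Linux/BSD only: make the file executable (only needed once)
chmod +x llava-v1.5-7b-q4-server.llamafile

# Start the server, then open http://localhost:8080 in your browser
./llava-v1.5-7b-q4-server.llamafile
```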

Source: https://github.com/Mozilla-Ocho/llamafile

pin: victorhck (gemini://station.martinrue.com/victorhck)

8 months ago · 👍 calimero

Links

http://localhost:8080

https://github.com/Mozilla-Ocho/llamafile

gemini://station.martinrue.com/victorhck
