
👽 bengunn

Llamafile for testing LLMs locally

1. Download llava-v1.5-7b-q4-server.llamafile (3.97 GB).

2. Open your computer's terminal.

3. If you're using macOS, Linux, or BSD, you'll need to grant permission for your computer to execute this new file: chmod +x llava-v1.5-7b-q4-server.llamafile

4. If you're on Windows, rename the file by adding ".exe" on the end.

5. Run the llamafile, e.g.: ./llava-v1.5-7b-q4-server.llamafile

6. Your browser should open automatically and display a chat interface. (If it doesn't, just open your browser and point it at http://localhost:8080.) The full terminal sequence is summarized in the sketch below.
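
Taken together, the steps above boil down to a short terminal session. A minimal sketch for macOS/Linux/BSD, assuming the llamafile has already been downloaded into the current directory (the download link is on the llamafile GitHub page cited below):

```
# Make the downloaded llamafile executable (step 3); only needed once
chmod +x llava-v1.5-7b-q4-server.llamafile

# Start the local server (step 5); it loads the model and listens on port 8080
./llava-v1.5-7b-q4-server.llamafile

# If the browser does not open by itself, visit the chat UI manually (step 6):
#   http://localhost:8080
```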

Source: https://github.com/Mozilla-Ocho/llamafile
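
Besides the browser chat UI, the running server can also be queried over HTTP. A hedged sketch, assuming the OpenAI-compatible /v1/chat/completions endpoint described in the llamafile README; the "model" value is just a label here, not a required file name:

```
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llava-v1.5-7b",
    "messages": [
      {"role": "user", "content": "Summarize what llamafile does in one sentence."}
    ]
  }'
```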

pin: victorhck (gemini://station.martinrue.com/victorhck)

3 months ago · 👍 calimero

Links

[1] http://localhost:8080

[2] https://github.com/Mozilla-Ocho/llamafile

[3] gemini://station.martinrue.com/victorhck
