
👽 bengunn

Llamafile for testing LLMs locally

1. Download llava-v1.5-7b-q4-server.llamafile (3.97 GB).

2. Open your computer's terminal.

3. If you're using macOS, Linux, or BSD, you'll need to grant permission for your computer to execute this new file: chmod +x llava-v1.5-7b-q4-server.llamafile

4. If you're on Windows, rename the file by adding ".exe" on the end.

5. Run the llamafile, e.g.: ./llava-v1.5-7b-q4-server.llamafile

6. Your browser should open automatically and display a chat interface. (If it doesn't, just open your browser and point it at http://localhost:8080.) The full command sequence is sketched below.
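
Putting the steps together, here is a minimal shell sketch for macOS, Linux, or BSD. The download URL is a placeholder; take the real link from the Mozilla-Ocho/llamafile repository listed under Source.

```
# Download the 3.97 GB llamafile (placeholder URL: copy the real
# link from the Mozilla-Ocho/llamafile README)
curl -L -o llava-v1.5-7b-q4-server.llamafile "<download-url>"

# Grant execute permission (macOS, Linux, BSD)
chmod +x llava-v1.5-7b-q4-server.llamafile

# Run it; a chat UI should open in your browser at http://localhost:8080
./llava-v1.5-7b-q4-server.llamafile
```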

Source: https://github.com/Mozilla-Ocho/llamafile

pin: victorhck (gemini://station.martinrue.com/victorhck)

5 months ago · 👍 calimero

Links

http://localhost:8080

https://github.com/Mozilla-Ocho/llamafile

gemini://station.martinrue.com/victorhck

Actions

👋 Join Station