
Comment by 🚀 corscada

Re: "Cache for offline browsing"

In: s/Lagrange

Lagrange (and likely other browsers) already lets you browse and view local .gmi files without needing a local server. In theory you just need a simple script that crawls the gemfeeds of any endpoints you've 'subscribed' to and structures the downloaded files sensibly.
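
As a rough illustration, a minimal sketch of such a script in Python might look like the following. The feed URL, output directory, and helper names are all hypothetical, and it assumes the feed is a plain gemtext page whose "=>" link lines point at .gmi posts:

```python
import os
import socket
import ssl
import urllib.parse
from urllib.parse import urljoin, urlparse

# Teach urljoin about the gemini scheme so relative links resolve.
urllib.parse.uses_relative.append("gemini")
urllib.parse.uses_netloc.append("gemini")

FEED_URL = "gemini://example.org/gemlog/"  # hypothetical subscribed feed
OUT_DIR = "mirror"                         # local cache root

def fetch(url: str) -> str:
    """Fetch a Gemini URL and return the body of a 2x (success) response."""
    parsed = urlparse(url)
    ctx = ssl.create_default_context()
    # Gemini capsules typically use self-signed certs (trust-on-first-use),
    # so a real tool should pin certificates rather than skip checks.
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((parsed.hostname, parsed.port or 1965)) as sock:
        with ctx.wrap_socket(sock, server_hostname=parsed.hostname) as tls:
            tls.sendall((url + "\r\n").encode("utf-8"))
            raw = b""
            while chunk := tls.recv(4096):
                raw += chunk
    header, _, body = raw.partition(b"\r\n")
    status = header.decode("utf-8", "replace")
    if not status.startswith("2"):
        raise RuntimeError(f"{url}: {status}")
    return body.decode("utf-8", "replace")

def save(url: str, text: str) -> None:
    """Mirror the URL's path under OUT_DIR so relative links keep working."""
    parsed = urlparse(url)
    path = parsed.path.lstrip("/")
    if not path or path.endswith("/"):
        path += "index.gmi"
    dest = os.path.join(OUT_DIR, parsed.hostname, path)
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    with open(dest, "w", encoding="utf-8") as f:
        f.write(text)

def mirror_feed(feed_url: str) -> None:
    """Download the feed page plus every .gmi post it links to."""
    feed = fetch(feed_url)
    save(feed_url, feed)
    for line in feed.splitlines():
        if line.startswith("=>"):  # gemtext link line: "=> url label"
            target = line[2:].strip().split(maxsplit=1)
            if not target:
                continue
            absolute = urljoin(feed_url, target[0])
            if absolute.endswith(".gmi"):
                save(absolute, fetch(absolute))

if __name__ == "__main__":
    mirror_feed(FEED_URL)
```

Run something like that against each subscribed capsule from a cron job, and Lagrange (or any client) can open the mirrored .gmi files directly.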

I'd rather keep this a browser agnostic tool.

🚀 corscada

Jul 11 · 7 weeks ago

4 Later Comments ↓

💀 requiem · Jul 11 at 16:42:

@corscada exactly my thoughts!

👺 daruma [OP] · Jul 11 at 17:13:

@requiem check out offpunk.py from ploum; it does that, and you could probably use Lagrange to browse the offpunk.py cache.

👺 daruma [OP] · Jul 11 at 17:15:

this is the tool:

— ploum.be/software.gmi

🚀 decant_ · Jul 12 at 07:04:

offpunk is nice. But there are some sites that also offer git access, as in, you git clone the site's repo and read it locally. Other sites, like spam.works, host static historical content; that type of site might as well offer a tarball download.

Original Post

🌒 s/Lagrange

Cache for offline browsing — I really enjoyed the concept of offpunk, a gemini/gopher/html/wikipedia/rss browser aimed at caching everything you visit. All the links work as subscriptions(ish), so you can see what is new and download it locally to read offline. I see that Lagrange has a maximum of 9 GB of cache. How does that cache work? Can I browse a site when I am offline and have it served from the cache? I see subscriptions in here too; will they sync to the cache? I haven'...

💬 daruma · 8 comments · Jul 11 · 7 weeks ago