Re: "Cache for offline browsing"
Lagrange (and likely other browsers) already lets you browse and view local .gmi files without needing a local server. In theory you just need a simple script that will crawl a gemfeed resource from any endpoints you've 'subscribed' to and structure the downloaded files sensibly.
I'd rather keep this a browser-agnostic tool.
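A minimal sketch of what that script could look like, in Python with only the standard library. The feed URL and output directory are placeholders, it only follows absolute gemini:// links, and it skips certificate verification (Gemini capsules commonly use self-signed certs), so treat it as a starting point rather than a finished tool.

```python
#!/usr/bin/env python3
# Rough sketch only: the feed URL, output directory and file naming are
# placeholder choices, and relative links are skipped for simplicity.
import os
import re
import socket
import ssl
from urllib.parse import urlparse

def gemini_fetch(url: str) -> str:
    """Fetch a gemini:// URL and return the response body as text."""
    parts = urlparse(url)
    ctx = ssl.create_default_context()
    ctx.check_hostname = False       # Gemini capsules commonly use
    ctx.verify_mode = ssl.CERT_NONE  # self-signed certificates (TOFU)
    with socket.create_connection((parts.hostname, parts.port or 1965)) as sock:
        with ctx.wrap_socket(sock, server_hostname=parts.hostname) as tls:
            tls.sendall((url + "\r\n").encode("utf-8"))
            data = b""
            while chunk := tls.recv(4096):
                data += chunk
    header, _, body = data.partition(b"\r\n")
    if not header.startswith(b"20"):
        raise RuntimeError(f"unexpected response header: {header!r}")
    return body.decode("utf-8", errors="replace")

def mirror_feed(feed_url: str, out_dir: str) -> None:
    """Save the gemfeed page and every page it links to as local .gmi files."""
    os.makedirs(out_dir, exist_ok=True)
    feed_text = gemini_fetch(feed_url)
    with open(os.path.join(out_dir, "index.gmi"), "w", encoding="utf-8") as f:
        f.write(feed_text)
    for line in feed_text.splitlines():
        # gemtext link lines look like "=> URL optional label"
        m = re.match(r"=>\s*(\S+)", line)
        if not m or not m.group(1).startswith("gemini://"):
            continue  # only absolute gemini:// links in this sketch
        link = m.group(1)
        name = urlparse(link).path.strip("/").replace("/", "_") or "index"
        with open(os.path.join(out_dir, name + ".gmi"), "w", encoding="utf-8") as f:
            f.write(gemini_fetch(link))

if __name__ == "__main__":
    # placeholder endpoint; point this at a gemfeed you're subscribed to
    mirror_feed("gemini://example.org/gemlog/", "./offline-cache")
```

Run something like this from cron (or by hand before going offline) and point your browser at the output directory; since Lagrange can open local .gmi files directly, no local server is needed.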
Jul 11 · 6 weeks ago
requiem · Jul 11 at 16:42:
@corscada exactly my thoughts!
daruma [OP] · Jul 11 at 17:13:
@requiem check out offpunk.py from ploum; it does that, and you could probably use Lagrange to browse the offpunk.py cache.
daruma [OP] · Jul 11 at 17:15:
this is the tool:
decant_ · Jul 12 at 07:04:
offpunk is nice. But there are some sites that also offer git access, as in you git clone the site's repo and read it locally. Other sites like spam.works host static historical content; this type of site might as well offer a tarball download.
Cache for offline browsing: I really enjoyed the concept of offpunk, a gemini/gopher/html/wikipedia/rss browser aimed at caching everything you visit. All the links would be subscription(ish), and you could see what is new and download it locally to read offline. I see that Lagrange has a maximum of 9 GB of cache memory. How does that cache work? Can I browse a site when I am offline and it will show the cache? I see subscriptions in here too; will they sync to the cache? I haven'...