gemini://idiomdrottning.org/the-dystopia-of-web-only-documentation
Once upon a midnight dreary in December 2023, I came across this 2014 post by ESR about his plans to move FSF documentation to the web, with the final two steps quoted below.
They'll probably end up moving some of the docs yet again when the next great thing comes along. Obviously what documentation needs is a nice Cloud-Integrated ECMAScript Application (CIEA) to make it even more difficult, expensive, and dangerous (DED) to access.
"javascript required to view this site"
Would you like to hear about our WASM pairings?
"Some time ago I added to Linux man(1) the capability to recognize HTML pages in the man hierachy and kick them over to the user’s Web browser. All Linux and *BSD distributions now ship this code." -- Eric S. Raymond
```
$ man ./MIDI.html | sed 4q
()                                                                    ()
<HTML> <HEAD> <META NAME="description" CONTENT="MIDI File Format Spec.
1.1+"> <META NAME="keywords" CONTENT="MIDI, file, format, specification,
updated">
```
Maybe OpenBSD 7.4 isn't a *BSD, or the capability is well hidden (I admittedly didn't root around under /usr/src/usr.bin/mandoc for very long, having not seen anything obvious in the manual), or "ship this code" may mean something like "something optional from somewhere in the ports tree".
I ended up writing my own man page viewer on Mac OS X, as the system version (or even some MacPorts build) would take some 200 to 300 milliseconds to render a man page, about an order of magnitude slower than a Perl script that did the needful, to say nothing of the subsequent C version. If your compiled man(1) program is much slower than a Perl script, you probably took a wrong turn somewhere. Maybe step away from the editor and stop adding features?
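Render times are easy enough to compare, should one be curious. A rough sketch; ./myman here is a hypothetical stand-in for whatever viewer is under suspicion:

```
# crude comparison; run each a few times and discard the first
# (cold cache) timing. ./myman is a made-up replacement viewer.
$ time /usr/bin/man ls > /dev/null
$ time ./myman ls > /dev/null
```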
Applications that summon a bloat browser get reconfigured to not do that. mupdf comes to mind.
```
set env(BROWSER) log-url
solitary / mupdf -r $resolution $file
```
where log-url is a script that appends the URL to a particular file.
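A minimal sketch of such a script (the log file location is made up; point it wherever):

```
#!/bin/sh
# log-url - append the given URL to a log file instead of
# summoning a browser. The log file path is an assumption.
echo "$1" >> "$HOME/.url-log"
```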
"Kill off info. Where we’re probably going is that (a) info will die, to be replaced by HTML browsed from within Emacs, and (b) Texinfo will be replaced by a modern lightweight format that can render to both print and HTML; most likely asciidoc." -- Eric S. Raymond
I heard a story about Debian apparently porting docs back to man pages so there would be consistent system-level documentation. Documentation does not move as fast as javascript frameworks, but there has been some churn, with man pages and info pages and bitrotting HOWTOs from 2001 ("my god, it's full of <BR/>s") and javascript-laced wikis and ...
Texinfo in particular is structured, semantic, accessible, and has a glorious legacy of being one of the very first public hypertexts.
Search for man pages that mention the variables "optind" and "optarg" (on OpenBSD, anyways).
```
$ apropos Va=optind -a Va=optarg
getopt(3) - get option character from command line argument list
getopt_long, getopt_long_only(3) - get long options from command line argument list
```
Allowing blobs of HTML under MANPATH might make that sort of selection difficult, since those search keys come from mdoc(7) macros that a raw HTML page lacks.
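The keys live in the index that makewhatis(8) builds; a sketch of refreshing that index after installing pages (the manpath here is only an example):

```
# rebuild the apropos/whatis index for a local man tree; an HTML
# blob dropped in here would contribute no mdoc macro keys.
$ doas makewhatis /usr/local/man
```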
As much as I hate the web, being able to look up documentation on the web and read it right there is a good thing. Not so much in practice as a just-in-case, should the web somehow become good again. For that matter, using Gemini for this might make a lot more sense.
One problem is when the web documentation is for some version other than what the local system has, and then time is wasted or bugs get written. This is more of a problem for long-term support releases. The web site can be complicated with a "what version do you want?" selector, which brings both good and bad. I will admit to sometimes using the web when I want to peek at linux man pages, but that's not for serious use these days.
```
$ ow -l linuxman 1 ls
https://linux.die.net/man/1/ls
```
Another problem is the hordes of oft-repeated documentation (RFCs, man pages, etc.) that pollute search results with pages and pages of the same not-what-I-want hits. This is a bit of "the bad" from hosting documentation for multiple versions in multiple places, especially when there's a dogpile for ad income.
'(Common LISP packages can be terrible on the documentation front, but that's a different rant. Shout out to SDL's "read the *.h files, lol" as well. Could you at least show how to put some of those *.h entries together? Or are there tests that show how to use it?)
On the plus side, a broken page (activex, flash, mandatory javascript, etc.) is a good sign that the project will not be a good fit. DED pages, in other words.
P.S. My w3m user-agent string is now the entire "Ozymandias" by Shelley, as DuckDuckGo started making stinky faces about the prior "MSIE 8/Windows 8" agent. Look on my user-agent, ye Mighty, and despair.
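For anyone inclined to do likewise, w3m takes the string from the user_agent setting in ~/.w3m/config (a sketch; the sonnet is abridged here):

```
# ~/.w3m/config - user_agent takes the rest of the line as the string.
# Abridged; the real entry carries all fourteen lines.
user_agent I met a traveller from an antique land [...] Look on my Works, ye Mighty, and despair!
```

The same can be done per invocation with w3m -o user_agent='...'.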