> On Sun, Jun 07, 2020 at 10:47:41PM +0100, Luke Emmet wrote:
> > If a client must not make subsequent network requests when interpreting a
> > page, does this mean that search engines and crawlers are now non-compliant
> > clients? This seems to go much too far.

The spec says:

> Clients can present links to users in whatever fashion the client author
> wishes, however clients MUST NOT automatically make any network connections
> as part of displaying links whose scheme corresponds to a network protocol
> (e.g. gemini://, gopher://, https://, ftp://, etc.).

I find this reasonable: a crawler does not make any extra network connections
*when interpreting a page* or *as part of displaying links*. Rather, it fetches
single pages per spec, while building a graph of all known pages (which it then
fetches, still as single pages, in a way compatible with the spec). A crawler
need not fetch any other pages in order to add a single page to its index.

If a search engine started inlining content from links, it would be breaking
the spec.

My two cents.

-Hannu
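
P.S. To make the distinction concrete, here is a rough sketch (Python, all
names hypothetical and untested, not taken from any real crawler) of what I
mean: every network connection is an explicit single-page fetch, link lines
are merely parsed and queued, and nothing is fetched as part of interpreting
or displaying a page. Relative links and redirects are skipped for brevity.

import socket
import ssl
from collections import deque
from urllib.parse import urlparse

def fetch(url, timeout=10):
    """Fetch exactly one Gemini page and return (status, meta, body)."""
    parsed = urlparse(url)
    host, port = parsed.hostname, parsed.port or 1965
    context = ssl.create_default_context()
    # Many Gemini servers use self-signed certificates (TOFU), so verification
    # is relaxed in this sketch; a real crawler would pin certificates instead.
    context.check_hostname = False
    context.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall((url + "\r\n").encode("utf-8"))
            data = b""
            while chunk := tls.recv(4096):
                data += chunk
    header, _, body = data.partition(b"\r\n")
    status, _, meta = header.decode("utf-8", "replace").partition(" ")
    return status, meta, body

def extract_links(body):
    """Parse gemtext link lines; no network activity happens here."""
    links = []
    for line in body.decode("utf-8", "replace").splitlines():
        if line.startswith("=>"):
            parts = line[2:].strip().split(maxsplit=1)
            # Only absolute gemini:// targets; relative URLs omitted for brevity.
            if parts and parts[0].startswith("gemini://"):
                links.append(parts[0])
    return links

def crawl(seed, limit=100):
    """Breadth-first crawl: one page per request; links are queued, never inlined."""
    queue, seen, index = deque([seed]), {seed}, {}
    while queue and len(index) < limit:
        url = queue.popleft()
        try:
            status, meta, body = fetch(url)
        except OSError:
            continue
        if status == "20" and meta.startswith("text/gemini"):
            index[url] = body          # add this single page to the index
            for link in extract_links(body):
                if link not in seen:
                    seen.add(link)
                    queue.append(link) # fetched later, as its own single request
    return index

if __name__ == "__main__":
    pages = crawl("gemini://gemini.circumlunar.space/")
    print(f"indexed {len(pages)} pages")

The point is that the MUST NOT clause constrains what happens while a page is
being presented, not how many pages a program may eventually request.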