When will web browsers be complete?

Author: fallat

Score: 142

Comments: 207

Date: 2020-10-28 06:29:27

Web Link

________________________________________________________________________________

chrismorgan wrote at 2020-10-28 12:07:54:

_So what is the Web? Well we can agree the Web is a conglomerate of standards proposed by the W3C._

No, I can’t agree with that because it hasn’t _ever_ been true. For its first decade, it _was_ the place where most relevant standards were proposed, but that still didn’t _define_ the web; you had to look at what browsers _did_ rather than what specs said, especially in such days as IE’s domination, because it varied quite substantially from the “standards”, and more importantly people _depended_ on these variations and extensions to the “standards”.

But then W3C tried to take the web further in a direction that the actual people with _power_ (browser manufacturers) didn’t like—more XML everywhere, and little to no activity on HTML itself—and so they split and formed WHATWG which has since taken over progressively more and more of the actual _important_ standards (e.g. HTML, DOM, URL, Fetch, Encoding), so that W3C itself is increasingly irrelevant, though there are still important things that are done through W3C, e.g. the CSS Working Group (CSSWG); and if you’re willing to stretch it, you could count the Web Incubator Community Group (WICG), but that’s more _hosted_ by W3C than _part of_ W3C.

Nowadays, if you say, “what is the web?”, you must include specifications from at least IETF (e.g. HTTP), WHATWG (e.g. HTML), W3C (e.g. CSS) and Ecma (e.g. JavaScript); but actual browser behaviour (an extremely vague concept) must also be considered too, because it has a big impact on what the web is.

pgt wrote at 2020-10-28 14:39:51:

Standards are just out-of-date documentation.

mandelbrotwurst wrote at 2020-10-28 19:05:59:

They're more like a proposed specification.

pgt wrote at 2020-10-28 22:47:46:

Standards are usually written after the implementation.

fergie wrote at 2020-10-28 14:03:22:

I have to confess that I didn't know anything about WHATWG until I read this comment; I always assumed that HTML was defined by the W3C.

acoard wrote at 2020-10-28 15:48:21:

To be clear, initially HTML was defined by the W3C. Then when they tried to replace HTML with XML and the browser vendors revolted, the W3C essentially lost control over this.

Basically, the W3C overplayed its hand. It thought that by rewriting the standards from HTML to XML it could force Apple/Opera/Mozilla to rewrite their browsers. In the end, the browser vendors got together and created a new body that would be responsible for HTML. They've skirmished for control since, but there's no question that WHATWG and the browser vendors are in the ascendancy. From 2007 to 2019 the two bodies had competing HTML standards, but since the browser vendors were behind WHATWG, they would just implement their own version, making it the de facto standard. So, in 2019 the W3C announced they'd just use the WHATWG HTML.[0]

I expect if the W3C pushes anything the vendors consider too arduous (eg a re-write of CSS into constraints), then WHATWG would create a CSS working group and the process would repeat.

This reminds me a lot of the power of the Supreme Court, specifically President Jackson's (apocryphal) quote about it when the court made a ruling he didn't like: "[The Chief Justice on the court] has made his decision; now let him enforce it"[1]. The stakeholders of WHATWG have an army (of browser coders), whereas the W3C needs to rule by consent of the ruled, like the Supreme Court does.

----

[0]

https://en.wikipedia.org/wiki/WHATWG

[1]

https://en.wikipedia.org/wiki/Worcester_v._Georgia#Enforceme...

chrisseaton wrote at 2020-10-28 14:34:11:

> I always assumed that HTML was defined by W3C

Yes I think that's the impression they try to create.

totallymike wrote at 2020-10-28 13:19:25:

Decent point. If my recollection is accurate, a number of browsers or other implementations must already implement something before it can _become_ standard.

madrox wrote at 2020-10-28 07:52:11:

When this essay started with "As someone who has used the Web since 2007" I immediately felt old, so thanks for that. As someone who has been using the Web since 1994 (and I feel like I was late to the game), I can assure the author their perspective has a lot of recency bias and this isn't something to worry about, in the general sense. Or worry about it, because it already happened a while ago. Take your pick.

Local operating systems, which this essay says are the gold standard of feature sets, are not standing still. If you think of how long it will take for browsers to reach current parity, go back that amount of time and see how far the OS has come.

But let's consider, for a moment, that browsers do catch up. Look at native development in general, which has had access to all local OS features since before the web. Has it ever stood still? It's changing all the time. We're finding higher abstractions to work with. We're making it easier. I highly doubt Grid and Flexbox as they exist today will still be how web layout works in 20 years.

Google isn't even the first to scare the world. This essay could've been written in 2002...about Internet Explorer. Stop laughing. It held even more dominance than Chrome does today, and was closed source. Somehow we survived...which is good, because VRML sucked.

However, even with all that aside, people have been rolling their own browsers for years. People fork Chromium today. The dark web exists. Maybe in 20 years lots of people will be streaming everything from a private cloud supercomputer, a la Stadia. The internet is no more homogeneous than all the people of the world. Everyone is finding new ways to access it all the time. This is why open standards are more important than browsers. They mean anyone can build their own browser at any time based on the documentation, and it can all read the same documents.

So I wouldn't worry about this coming to pass in quite the way you think. Though running your browser in a VM isn't the worst idea.

TeMPOraL wrote at 2020-10-28 10:29:28:

> _This essay could've been written in 2002...about Internet Explorer._

Last night I was reading a whitepaper from ~1997 on Microsoft's Distributed COM (DCOM), and as I read it, I couldn't stop thinking: this is microservices. And it was really more than that; at least on the surface, this was microservices + orchestration + autoscaling + serverless, as well as, via Internet Explorer, component-based UIs with graceful degradation. That paper seems like a window to an alternate reality in which everything that's currently hot in tech was done 20 years ago, except programming-language-agnostic, using saner (i.e. binary) protocols, offline-first, and with advanced security built in from the start. I don't know why this is not our "real" reality, but I finally understand why MS was in a dominant position back then, and why IE was a hot thing.

EDIT: for the curious, the mentioned whitepaper is:

https://www.softwaretoolbox.com/dcom/DCOM%20Technical%20Over...


AnIdiotOnTheNet wrote at 2020-10-28 11:40:46:

> That paper seems like a window to an alternate reality in which everything that's currently hot in tech has been done 20 years ago, except programming language-agnostic, using saner (i.e. binary) protocols, offline-first, and with advanced security built in from the start.

Honestly, this happens all the time. Not only do we seem to reinvent the wheel, but we almost invariably reinvent it worse. Take 'dark mode', for instance. Everybody is delighted that they can switch to a different theme that they feel is easier on their eyes, apparently oblivious to the fact that we could do that in Windows 95 with significantly greater and more granular control at the OS level. This industry is overrun with complexity fetishists who keep piling on more and more abstractions and calling it progress when they manage to catch up with the less-abstracted past.

hrktb wrote at 2020-10-28 12:15:45:

I think we should separate ideas/proof of concepts that were in the field and presented as having potential, and implementations that are largely adopted and reliable.

There will always be a lag between the two, and it won’t just be ‘reinventing the wheel’.

I had a look at CORBA and it was horrible in many ways. Interestingly, it's something that works and is proven at large scale, but nobody would want it for small/mid-scale, non-enterprisey applications, and it might not even be a good idea at large scale, depending on which properties are valued. In no way would I see Kubernetes with services as a worse version of it; we didn't choose it out of ignorance or hype.

Same with “dark mode” on Win95. Most programs couldn't even properly handle text sizes; it was an i18n nightmare that was just brushed under the carpet. Changing text and background colors was the same: some apps somewhat worked because they mostly relied on default widgets, with a mess of unusable apps on the side that just had text colors baked in. You had a better chance of getting it to work by plain inverting all colors through the accessibility settings.

To be fair, it's still partly a mess now, but we are in a much better situation than in Win95: dark mode is understood at the application level, and it actually makes sense now.

pjc50 wrote at 2020-10-28 12:18:02:

You could have "dark mode" in _Windows 3.0_. Provided programs used the standard widgets and the correct "named" colours, everything would pick up the colour changes from the operating system.
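The mechanism pjc50 describes can be sketched as a toy example (this is an illustration only, not the actual Windows API; all names here are made up): widgets that draw using OS-level named colours pick up a system-wide theme change for free, while an app that baked its colours in does not.

```python
# Toy sketch of OS-level "named" colours, as in Windows 3.x theming.
# (Illustrative names only; not the real Win16/Win32 API.)
SYSTEM_COLOURS = {"window": "white", "window_text": "black"}

class StandardButton:
    """Plays the role of a standard widget: asks the OS at draw time."""
    def draw(self):
        return f"text={SYSTEM_COLOURS['window_text']} on bg={SYSTEM_COLOURS['window']}"

class BakedButton:
    """Plays the role of an app that hard-coded its colours."""
    def draw(self):
        return "text=black on bg=white"

def set_dark_theme():
    # One change at the "OS" level restyles every conforming widget.
    SYSTEM_COLOURS.update(window="black", window_text="white")

standard, baked = StandardButton(), BakedButton()
set_dark_theme()
print(standard.draw())  # text=white on bg=black (followed the theme)
print(baked.draw())     # text=black on bg=white (ignores "dark mode")
```

The two buttons illustrate both sides of the thread's argument: centralized theming works exactly as long as applications draw through the named palette, and breaks for any app that doesn't.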

AnIdiotOnTheNet wrote at 2020-10-28 13:05:26:

How anyone can think that each application having to individually implement support for a bespoke "dark mode" is a better situation than having more granular controls in the OS, with applications simply following the UI guidelines, is beyond me. It's bullshit like that that makes me want to leave IT forever.

TeMPOraL wrote at 2020-10-28 13:18:30:

It's probably the unfortunate consequence of a typical software vendor's perspective.

There's a deep conflict between the interests of users and software vendors, a conflict for control over presentation. It's blindingly visible on the web, and it's half the reason the web is in such a shitty state, but it also applies to desktop software. The problem is that for a vendor, an application's UI and UX are _marketing space_, and they want total control over the "user's journey". For the user, UI/UX are things that stand between them and the work they want to get done, so they want them to be effort- and bullshit-free. For non-tech-savvy users this implies "consistent with all the other software they use"; for power users, it means "able to be altered or removed". Both of those viewpoints require the vendor to not have the final say about presentation.

In context of dark mode, an OS-level control over UI styling is a user-friendly feature. Unfortunately, it got defeated by vendors who want the UI to be their branding space, and now they have to begrudgingly add dark mode themselves.

EDIT: see also

https://news.ycombinator.com/item?id=24918296

upthread, it seems to be a quite accurate description of what will happen if we let the vendor side win.

hrktb wrote at 2020-10-28 18:50:15:

It’s because reality is complicated.

Most people don't want "dark mode" as a pure accessibility feature in the sense of "whatever color scheme is fine as long as it's dark everywhere". That's the invert-colors feature in most OSes.

Instead I/they want dark themes but with pleasing colors, adjusted depending on the screens and the widgets, having sane defaults for content that is usually supposed to be white, with exceptions.

Yes, that’s a tall order, it’s context dependent, and we might actually want to decide app by app if the dev did a good job and/or if dark mode makes sense, and sometimes change the theme in the app because it’s otherwise not great.

We end up in a patchwork of apps following dark mode and others not, with different themes applied that might not be triggered by the system.

For instance I use youtube, reddit and editing apps in dark mode only, while the rest of the system is in light mode. As you say it could be handled by the system in a super granular way, but it would add that much more complexity, and I actually chose the exact themes to be used in most of these apps, so centralization wouldn’t help anyway.

gmueckl wrote at 2020-10-28 16:50:57:

How on earth is the same set of UI icons supposed to keep their readability against vastly different background colors? If you want to do a proper job of that, your application needs to become aware of dark mode anyway and ship with a separate set of icons.

Yes, in terms of raw flexibility exposed to the user, customizable color schemes look great on the surface. They work to some degree, but fail utterly for a wide range of cases in practice. Not having them is progress, after a fashion, because it means giving up on a flawed idea. I will admit, though, that it is not easy to see why the idea is flawed until you have tried to design custom controls that have to convey more state than can be expressed using the standard OS color palette.

gmueckl wrote at 2020-10-28 12:06:53:

The problem with the fine grained user control over individual theme colors is that the OS theme is never complete enough for applications with visually complex UIs. At some point, applications have a need for extra colors to denote things in different ways. So they would need to provide their own configuration UI that allows the user to update these colors to be usable with any custom theme they're running. Unfortunately, most software developers never bothered to implement something like this and assumed the user would be running the default theme. So the users ended up with lots of broken UIs after trying to adjust the system wide theme.

znpy wrote at 2020-10-28 11:15:43:

A lot of stuff was "microservices" before the name was coined. SOA (service-oriented architecture) was definitely a thing.

I worked for a company whose software was/is based on CORBA and let me tell you, that thing really smells like microservices-before-it-was-cool.

You had a naming service, and each component would connect to the naming service on startup and resolve the endpoint for the components it needed.

And each piece of software had a component called an ORB (Object Request Broker) that was used to represent remote stuff (remote objects/endpoints). You could invoke method calls on such an object (kinda/sorta, I'm simplifying here) and it would forward the call in the background, hiding the networking/serialization/deserialization details.

That whole thing kinda reminds me of protocol buffers, etcd and microservices. By using a cluster manager (Veritas Cluster Manager) you had service groups, something that looked like a pod, in a way.
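The two-step mechanism described here (resolve a peer through the naming service, then call methods on a local stub that forwards them over the wire) can be sketched in a few lines of Python. This is a loose illustration, not actual CORBA; the naming-service table, the `billing` component, and the transport are all invented for the example:

```python
import json

# A toy "naming service": maps component names to endpoints, the way
# each component resolved its peers on startup.
NAMING_SERVICE = {"billing": "tcp://10.0.0.5:9000"}

class RemoteProxy:
    """Stand-in for an ORB stub: looks like a local object, but every
    method call is serialized and handed to the resolved endpoint."""

    def __init__(self, name, transport):
        self.endpoint = NAMING_SERVICE[name]   # resolve via naming service
        self.transport = transport             # function that moves bytes

    def __getattr__(self, method):
        def invoke(*args):
            request = json.dumps({"method": method, "args": list(args)})
            reply = self.transport(self.endpoint, request)  # "network" hop
            return json.loads(reply)["result"]
        return invoke

# A fake transport standing in for the wire: the "server" side just adds
# its arguments. The caller never sees any of this plumbing.
def fake_transport(endpoint, payload):
    msg = json.loads(payload)
    if msg["method"] == "add":
        return json.dumps({"result": sum(msg["args"])})
    raise NotImplementedError(msg["method"])

billing = RemoteProxy("billing", fake_transport)
print(billing.add(2, 3))  # looks like a local call; prints 5
```

Swap the JSON for protobuf, the naming dict for etcd, and the stub for a generated gRPC client, and the family resemblance to modern microservice stacks is hard to miss.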

-----------

It's been a while, but I've come to think that there's nothing really new. The problems we're facing are pretty much always the same; it's just that the solutions improve in different areas. Kubernetes, for example, helps a lot with the scaling problem and with standardization (both the ops team and the dev team have a common lingo, the Kubernetes object model: pods, deployments, services, requests & limits, etc.).

EDIT:

Just skimmed the article... Yup, DCOM was a competing alternative to CORBA.

znpy wrote at 2020-10-28 11:17:04:

Interestingly enough, the "N" in "GNOME" (Network Object Model Environment) was due to the fact that it was initially based on CORBA, and it used a C-language ORB called ORBit (IIRC).

If someone older than me would like to chime in and add some details, that would be lovely.

bluGill wrote at 2020-10-28 13:59:32:

KDE and GNOME agreed in the early days that both desktops would be based on COBRA and use standard object names so the two would be compatible.

After fighting COBRA for a while, KDE said this is too complex/slow and wrote DCOM (I might remember this name wrong), which gave them the parts they wanted without the overhead. GNOME stuck with COBRA longer, but I get the sense that they half agreed and were just hoping they could find a solution without abandoning COBRA (GNOME also had more corporate backers even in those days, so there were probably non-programmer architects demanding they make it work). Eventually both agreed that they needed something more lightweight to solve the desktop's needs, and D-Bus came along.

znpy wrote at 2020-10-28 14:08:52:

CORBA, not _COBRA_ :)

Also, iirc, KDE's thing was named DCOP

mattl wrote at 2020-10-28 15:00:22:

The issue with KDE was that Qt was nonfree, so there were also a few efforts to replace Qt with something else.

bluGill wrote at 2020-10-28 18:34:42:

That is why GNOME existed (now it is large enough to have a place of its own), but that's a completely different issue from the CORBA one.

mattl wrote at 2020-10-28 19:06:05:

GNU Harmony and GNOME are two different projects from around the same time.

projektfu wrote at 2020-10-28 11:10:34:

The thing holding DCOM back was that the protocol was proprietary and unspecified. It's very difficult to make Microsoft's version of DCE RPC work with other implementations, let alone their proprietary object extensions.

If they had wanted interoperability, they could have implemented something more standards based as IBM did with SOM. But requiring Windows for all parts of distributed systems is very limiting.

madrox wrote at 2020-10-28 11:32:06:

Visual Studio of 20 years ago was in many ways superior to Xcode of today, but you need Xcode to make iOS apps, and iPhones are cool. Moral of the story is better technology does not always win. It’s a fallacy that technology is a continuous positive progression.

mdip wrote at 2020-10-28 15:37:17:

So I say this having used Xcode about 10 years ago, but is Xcode really _that_ bad?

Around 2000 or so, I was installing Visual Studio -- the pre-.NET version -- on a number of new developer workstations. I was not a helpdesk or desktop tech, but worked supporting servers; they sent me over to do the installs because invariably "something wouldn't launch". Different IDEs were used depending on which language you were writing in, with one monster of an app used to write programs in Visual Basic[0]. The program was 5-10 times larger than the largest installed application and came with two CD-ROMs' worth of documentation that needed to be installed locally[1]. I remember I'd start all of the installs at the same time because they took several hours to complete.

[0] If it isn't obvious, this was a one-off job that I don't recall very much about. I didn't start using a Microsoft IDE until the first release of Visual Studio .NET (retroactively renamed "Visual Studio .NET 2002", I think).

[1] For the younger among us, that's about 1.5GB in a world where two years earlier 16GB of HDD was very expensive.

pjc50 wrote at 2020-10-28 12:16:20:

It may have been language-agnostic, but it was never going to be OS-agnostic. Like everything Microsoft, it only worked with Microsoft implementations on both ends.

> saner (i.e. binary) protocols

.. which only had one canonical implementation on each end.

If I remember rightly, the protocol for accessing Exchange from Outlook was MAPI over either DCOM or DCE/RPC. Despite some reverse-engineering efforts, this is why there were no open-source drop-in replacements for Exchange, and other clients accessing Exchange always did so over IMAP, thereby losing features.

Similarly, Samba implements DCE/RPC.

https://wiki.samba.org/index.php/DCERPC

SCHiM wrote at 2020-10-28 10:55:14:

This still exists! COM serves as the foundation of the modern Windows apps you currently see on Windows 10 (WinRT), and it is still actively developed in the form of WinRT.

COM sadly has a number of unfortunate characteristics, though: it's quite complex to learn, and it's actually not very secure (deserialization of complex types in C, anyone?).

But I think it's a wonderful technology.

TeMPOraL wrote at 2020-10-28 10:57:24:

I'm currently having an opportunity to learn it, as I need to interface with a piece of legacy software that works over DCOM. I thought this would be a nightmare, as all my previous brief encounters with COM/DCOM left a bad taste (obscure errors full of GUIDs, registry hacking), but now that I'm actually trying to learn what the thing is, I'm becoming more and more in awe of it.

coding_unit_1 wrote at 2020-10-28 11:23:07:

This is true for a great number of things. Our industry has a terrible collective memory; we're in such a rush to push aside the old in favour of new shiny stuff, only to have to rediscover why it was useful (or not) later.

noir_lord wrote at 2020-10-28 14:03:09:

I felt the same with components.

I was using encapsulated components with internal state and defined interfaces more than two decades ago and it was old technology even then.

The web is a funny place, they grab something and claim it's revolutionary but often forget to mention _where_ it came from.

It's not a bad thing, but I have often joked that if you want to see what is considered state of the art in year X, go see what the academics were doing in year X-20.

oscargrouch wrote at 2020-10-28 18:19:31:

If you are interested in those properties, you might like something I'm working on that I'm about to launch.

It's nothing like COM, but a vision of what modern browser engines could turn into (focusing on the properties you mentioned).

We should not let "them" dictate dumb clients (and apps) to us that consume from the fog that "the cloud" is.

This is a corporate trend that will take a lot of freedom from us, and if we let them have their way, it will be pretty hard to get us all out of this nightmarish future we are heading to.

TeMPOraL wrote at 2020-10-28 19:04:18:

I'm interested. Please, tell us more, and/or drop me an email (address in profile).

oscargrouch wrote at 2020-10-28 22:07:09:

Good. Will drop you an email.

taffer wrote at 2020-10-28 13:19:15:

As far as I know SOA is a concept and DCOM and CORBA are implementations of this concept. Microservices are a more lightweight variant of SOA, but basically the same idea.

tzs wrote at 2020-10-28 16:07:49:

> using saner (i.e. binary) protocols

For some reason that I fail to understand, it seems that most computer protocols and most computer file formats are designed under the unstated assumption that those using them will not have access to a computer.

golergka wrote at 2020-10-28 12:13:55:

It often happens when the technology satisfies a need the market hasn't really felt yet. Microservices are hot now because many companies and individual developers have already felt the pain from their own experience. Without such experience, advanced technology just seems "unnecessarily complicated" compared to the simple LAMP-based monolith that "just gets the job done".

marcosdumay wrote at 2020-10-28 13:06:35:

You will probably have a good time reading about the history of the Mach OS.

ddlutz wrote at 2020-10-28 10:54:13:

link to the paper?

TeMPOraL wrote at 2020-10-28 10:58:25:

Provided in an edit.

ktpsns wrote at 2020-10-28 08:15:40:

As someone who has used the Web since 2000, I could not second this more! Chrome's dominance feels a bit better than Internet Explorer's dominance, because the Microsoft browser was shitty about features and documentation (anybody remember quirks mode and the IE box model?). In the grand scheme, the overall topics did not change (in many cases it's always been about DRM, OS support, being the first to set/implement the standards/proposals). Unfortunately the web browser landscape got thinner. The decline of Opera and of the modern Internet Explorer (what was its name again?) is a real loss for web diversity.

grishka wrote at 2020-10-28 11:59:40:

> _go back that amount of time and see how far the OS has come._

How far did it come, though? Strictly speaking, an OS is an abstraction on top of hardware that applications are written against. That's it. OSes of 2007 did a good job of abstracting the hardware away from applications. They also weren't as user-hostile as the modern ones, so one could argue that they were better. Operating systems had been feature-complete for quite a while. Those yearly updates serve no practical purpose.

madrox wrote at 2020-10-28 12:35:19:

This same argument could be made about browsers depending on the value one places on specific advances.

The original author is arguing there will be a point where there's simply nothing new to add to the browser because it will do everything the OS does. There are any number of ways to argue that that's just not how it works.

grishka wrote at 2020-10-28 13:12:24:

I personally don't think browsers need to be able to do stuff like exposing USB, Bluetooth and MIDI to websites. They aren't a substitute for a real OS as an app platform, and they should stop trying to become one. They're for reading (hyper)text, writing text (including filling forms), and watching videos.

An aside: as someone who comes from Android app development and now makes a hobby project that includes a web interface, making the browser do what I want it to feels like I'm building a UI in a Word document, complete with macros. The ability to have this free-flowing text that isn't contained in anything, this whole concept of "inline elements" still hurts my head. I'm too used to real UI frameworks where any text is only allowed to exist inside a TextView.

rmason wrote at 2020-10-28 09:16:23:

You literally couldn't have used the web much earlier than 1994 unless you were running a NeXT machine. I know because I've been on the net since 1988 and 1994 was when I first started browsing the web.

If you were on Windows it was tricky then, as you had to download and configure your own TCP/IP stack. That was possible thanks to the ingenuity of Peter Tattam, who sold software that let you do that. I met him back then, and to some of us he was a true rockstar programmer.

xorcist wrote at 2020-10-28 09:49:13:

That's a bit of a stretch. The web is hard to date (1990 is often mentioned), but it had already spread quite a bit by 1994.

Of course there's a wikipedia article on this, which counts 10k+ web sites at the end of that year:

https://en.wikipedia.org/wiki/List_of_websites_founded_befor...

For most applications you would dial up a remote system anyway, so many used the web by way of plain terminal software. The Lynx web browser had existed for just over two years by then, and it wasn't the first web client.

jefftk wrote at 2020-10-28 11:56:00:

I don't think I've seen 1990? You could make a case for 1991 or 1992, but I don't see how anyone could have been using the web in 1990, when it hadn't even been initially implemented yet.

For example, see "The Original HTTP as defined in 1991":

https://www.w3.org/Protocols/HTTP/AsImplemented.html

mstade wrote at 2020-10-28 11:56:10:

Case in point – in '94 I was in fourth grade at a school that made a big deal of their computer department, and how they were hooked up to the web. When geocities launched ('95?) we were making homepages as school assignments. So many "under construction" images – it was glorious!

It may not have been as ubiquitous as it is today, but the web was definitely not out of reach for normal people back in '94.

kgwgk wrote at 2020-10-28 13:07:59:

Lynx was not a "web browser" in the current sense until 1993.

DrScientist wrote at 2020-10-28 14:04:47:

You could use the web before that, but with text-based browsers, which frankly weren't very inspiring or popular. Most people preferred Gopher etc. - there was a brilliant Mac Gopher client.

What transformed the web was the launch of Mosaic along with NCSA HTTPd, mostly in the latter part of 1993, and it took off like wildfire. It's hard to remember dates, but I definitely remember the launch of the Netscape browser - which according to Wikipedia was late 1994 - and at that point the web was already pretty popular.

It wasn't just browsing the web: a large percentage of people had their own websites. It was ridiculously easy to do - just put an index.html in your ~/public_html folder.

Obviously this was all in the rarefied atmosphere of academia - the computers were Unix workstations, and most users were at university, largely PhD students at that.

lizknope wrote at 2020-10-28 13:17:56:

I started college in the fall of 1993. I went to a science and math boarding school for high school where we had Internet access. We used Archie to find FTP sites to find stuff.

I think it was Christmas break 1993 when a high school friend came to visit (he went to a different college) and started up Mosaic. I asked him what it was and then he laughed that I was still using Archie and FTP to find stuff.

I think it was xmosaic version 0.9 or something like that.

Synaesthesia wrote at 2020-10-28 10:51:47:

The first browser for Windows was NCSA Mosaic. I remember installing that. When Internet Explorer came out it was so advanced! I remember it supported background images!

kgwgk wrote at 2020-10-28 13:04:36:

It seems the first one was

https://en.wikipedia.org/wiki/Cello_(web_browser)

CPLX wrote at 2020-10-28 13:01:03:

You could have also been in a college in 1994, as I was, where there was easy access to the new-ish at the time World Wide Web via every computer in every lab.

Your comment doesn't seem accurate; the web was starting to catch on pretty aggressively by 1994.

kgwgk wrote at 2020-10-28 14:57:16:

But 1994 is not much earlier than 1994, right? How is the comment you reply to inaccurate? The first non-NeXT browsers appeared in 1992 (unless you count the Line Mode Browser from 1991).

kgwgk wrote at 2020-10-28 09:40:58:

You're right. The explosion of the multi-platform web happened in 1993:

https://en.wikipedia.org/wiki/Libwww

BryanBigs wrote at 2020-10-28 12:06:21:

I was using the web in college in 1993. After my very first experience with a browser, I thought Gopher was superior.

Of course, a 1200 baud connection makes graphics a bit less impressive and more of a PITA.

kgwgk wrote at 2020-10-28 12:34:31:

Lucky you, that had access to a graphical browser!

My first experience was sending an email with the URL I wanted to access to a server that would send back the hypertext document a few minutes later.

Edit: Something like

https://www.mcall.com/news/mc-xpm-1996-02-20-3077566-story.h...

mattl wrote at 2020-10-28 15:05:14:

August 1991.

https://en.wikipedia.org/wiki/Line_Mode_Browser

thro1 wrote at 2020-10-28 15:24:49:

You could. X.25. Gopher.

kgwgk wrote at 2020-10-28 16:47:25:

That's not "the web".

Timpy wrote at 2020-10-28 12:43:25:

That 2007 passage really brought me to a grinding halt. Toddlers are using the web these days from the moment they can swipe a screen. I have a hard time believing anybody on GitHub over the age of 20 only began using the internet in 2007.

jcranmer wrote at 2020-10-28 14:10:00:

The technology to let toddlers use the web didn't really exist until rather recently.

In 2003, we got broadband internet at our house. Before then, when we only had dial-up, only a single computer in the house could use the internet. Broadband internet brought with it external modems you could connect to via Wi-Fi rather than modems that sat inside your computer [1], and this meant you could use devices other than the primary family computer to browse the internet, assuming you had more computers in your household than just the one.

The iPhone came out in 2007, and is generally credited with ushering in the smartphone revolution. The iPad came out in 2010, making a somewhat larger form factor to enable the touchscreen interaction with the computer. From observation of my 3-year-old nephew, his ability to meaningfully use the computer via keyboard and mouse is quite limited. And I'm not sure he's all that capable with the touchscreen; he still hasn't understood that the reason he can't see me is that my phone doesn't have a camera and tries pushing stuff on the phone to make the camera turn on, inevitably resulting in him pushing the "end call" button.

I can totally buy a 10-year-old in 2007 who had never used the internet, especially if they came from a poorer household that didn't have the ability to keep up with technology and lived in a school district that didn't have the ability to give all their kids laptops.

[1] This does also show my age, as I never experienced the age of external modems connected via serial cables.

fallat wrote at 2020-10-28 14:37:55:

I didn't regularly use the Internet until I was 13ish. I wanted to be fair in the timeframe so that's why I said that. I probably encountered the Internet many many times before then, but it was not for regular usage.

mdip wrote at 2020-10-28 16:01:05:

A little off topic, but thanks for sharing:

The age of 13 seems to be a common thread with a lot of developers -- I know of 5 others who will point to that age as "when they started writing software". It was at age 13 that I started assembling and selling PCs (in the 90s) and began exploring software development, and before my 14th birthday I had rewritten a BBS application in Pascal.

Like you, I had been messing around with my (horrible) computer[0] since it arrived in our house and I was presented with an `A>` with no clue how to proceed, but it wasn't until I was 13 that my Dad presented me with a plan and a book to fix that problem[1]. I wanted a 386 so I could play Wing Commander (and see more than the 4 colors that my CGA display allowed -- loved the LSD trip that games like Kings Quest would put you through while trying to render decent graphics using practically nothing, but I was ready to be done with that).

I remember thinking my dad was insane -- he owned a business selling things to large automotive manufacturers ... nothing even adjacent to computers. And a computer was easily the most complicated consumer electronics device a person could own. Having had to fix our 8088 a few times, and being generally the most "handy person" I've ever known, he responded with a dismissive "they're all just a bunch of parts that you buy and put together". And while I'm pretty sure he helped me build/troubleshoot the first couple of them (we had some bad components early on from a vendor in California), I generally built/tested/assembled all of the few hundred that we sold, myself. If you want a 13-year-old to behave more like a "Grown Up", well, when your Dad's friends are paying you thousands of dollars to build them a computer, you start to act like someone who's worthy of receiving that kind of trust.

The best part: we built the first several computers to _pay for_ the one we wanted for ourselves[2]. My dad _owned_ his company. I'm willing to wager that he could have afforded to just buy the thing for me and probably _wouldn't_ have felt all that guilty about spoiling his kid since I was using it 99% of the time to do things other than play games. But this served as a forcing function -- I got the exact computer with the exact specifications that I wanted without having to think about the cost of those things provided I could sell enough computers to purchase it. I did. And I got to buy a convertible for my first car with money I had earned.

I'm seeing it now, in my own kids, two of whom turn 13 in the next few months. They're working with Roblox's developer tools making new worlds and I'm staring over their shoulder thinking: this application is easily as complicated to use as any of the design tools I've been exposed to, and these kids are just ripping through it like it's "not work" (if only they were getting paid!).

I've been _looking_ for things to motivate them to step out of the "game world" and realize that the things they're doing could be done outside of Roblox -- they could make _real applications_, but much like I didn't believe my dad when he handed me that book, they haven't had that "ah ha" moment, yet.

[0] My friends all had Commodore 64s and such that played really cool (if slowly loading) games and cost a few hundred dollars. We had a $3,000 IBM 8088 clone with a 10MB hard drive and floppy drive that ... wasn't fun.

[1] I can't remember the title completely but it was something like "Build an 80486 PC and Save A Bundle"

[2] My 80486, if memory serves, cost just shy of $10,000. We went with what would be called "server hardware" these days: SCSI over (very new) IDE or (very old) MFM -- we went SCSI with a full height 360MB drive (40MB drives were huge), a very nice video card and an 18" CRT display (14" was 'normal' with anything above starting at twice the price of the best 14" display). And two modems... I ran a multi-node BBS on custom software originally based on Telegard.

fallat wrote at 2020-10-28 16:08:49:

Wow, really touching story :). It sounds like your kids are on a really great path with a great father.

cft wrote at 2020-10-28 08:15:15:

In 20 years it will be impossible to manually browse to an unapproved "unsafe" URL in a general purpose browser.

ktpsns wrote at 2020-10-28 08:18:38:

Well, it's hardly possible to manually browse an unapproved unsafe URL even today. Ever tried to browse a site with a self-signed certificate? In most browsers you have to do a lot of extra clicks to get in.
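To see why browsers balk: a self-signed certificate is its own issuer, so no trusted CA vouches for it. A minimal sketch with openssl (file names and the 1-day lifetime are arbitrary choices):

```shell
# Create a key and a self-signed certificate in one step
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout demo.key -out demo.crt -days 1 \
  -subj "/CN=localhost"

# Subject and issuer come out identical -- the telltale sign of
# self-signing, and exactly what the browser interstitial warns about
openssl x509 -in demo.crt -noout -subject -issuer
```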

But there will always be special-purpose browsers. There is "TOR browser" today, and we will have "vintage web browsers" the same way we still have CLI/TUI browsers today (thinking of elinks and lynx), or the same way as there are still FTP clients and Gopher clients.

im3w1l wrote at 2020-10-28 11:04:06:

iPhone doesn't allow true 3rd-party browsers, only reskins of Safari. Apple even removed a browser for extending Safari too much [0]. So while there will be vintage web browsers, normal people may not have access to them.

https://www.theverge.com/2020/10/20/21524665/stadium-stadia-...

TheCoelacanth wrote at 2020-10-28 14:49:42:

All the certificate verifies is that the server you are connected to actually belongs to someone who controls the domain that it's hosted at.

You can get one issued for free just by demonstrating that you actually control the domain.

ktpsns wrote at 2020-10-28 22:42:36:

There is so much on TLS beyond Let's Encrypt. For instance, TLS allows user certificates to authenticate the user, not the server. This works a bit like SSH passwordless login (thanks to key authentication). This technique only works if the site operator also controls a CA which the browser accepts. Companies can easily install self-signed certs to do so, but on the wild web that's quite difficult.

So yeah, even today there are use cases for custom CAs.
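A minimal sketch of that flow with openssl (file names and subjects are made up): create a private CA, issue a client certificate signed by it, and verify the chain -- a server told to trust ca.crt would then demand such a client cert during the TLS handshake.

```shell
# 1. A private CA (self-signed root) that the server will be told to trust
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout ca.key -out ca.crt -subj "/CN=Demo Internal CA"

# 2. A client key plus a certificate signing request
openssl req -newkey rsa:2048 -nodes \
  -keyout client.key -out client.csr -subj "/CN=alice"

# 3. The CA signs the client's request, producing the client certificate
openssl x509 -req -in client.csr -CA ca.crt -CAkey ca.key \
  -CAcreateserial -days 1 -out client.crt

# Check the chain the server would check during the handshake
openssl verify -CAfile ca.crt client.crt
```

In nginx, for instance, this corresponds to pointing `ssl_client_certificate` at ca.crt and setting `ssl_verify_client on`.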

TheCoelacanth wrote at 2020-10-28 23:33:12:

I'm aware that client certificates exist. I'm not aware of anyone actually using them.

It also doesn't really have the problem of "unapproved" sites. You are already installing a certificate on the client. It's no extra work to add another trusted CA at the same time.

cbozeman wrote at 2020-10-28 08:54:03:

This is the real nightmare scenario...

pessimizer wrote at 2020-10-28 13:10:53:

They'll be complete when users have absolutely no way to interact with a web page other than what was explicitly intended by the company serving the web page, and when every possible interaction between the user and their computer is reported back to the web page they're accessing.

There will be three browsers: The "Safe Browser" which will be used by 85% of people, the "Smart Browser" which will be used by 5% of people, and the "Rebel Browser" which will be used by 10% of people, most of them in their teens and 20s. They will each have different skins (and t-shirts), but all be identical under the hood. The Rebel Browser will allow you to consent to accessing porn, and when the Smart Browser reports that it is the Smart Browser to sites its users visit, they will only be served articles, ads and search results that make them feel smarter than everyone else. The Safe Browser will have an "Important Business" mode that will allow you to consent to accessing porn, but it will examine your face first to ostensibly determine if you're a serious adult. The Rebel browser will also silently examine your face, and for the same reason - to track your porn preferences.

The day it will be feature-complete is when sites get an API for bricking your computer, which is routed through an algorithmic federal judge.

userbinator wrote at 2020-10-28 13:44:04:

...and there will be a 4th one used by almost 0% of people, which is illegal to use along with the general-purpose computer that's required to use it, but provides the user complete control:

https://www.gnu.org/philosophy/right-to-read.en.html

https://boingboing.net/2012/01/10/lockdown.html

IMHO the web started to go down this path in the late 2000s. Recent developments (manifestv3, DoH, youtube-dl, etc.) are only further indications of the continuing trend towards increasing corporate control and dumbing-down of the users (so they become less likely to oppose.)

MacsHeadroom wrote at 2020-10-28 14:33:51:

This is presented in a sort of hyperbolic way. But I do really think this is the future -- and not as far off as people think. But not for nefarious reasons. The road to hell is paved with good intentions.

Average users value performance, cost, and battery life. The cost efficiency and capabilities of cloud compute will always outpace that of consumer mobile devices. So remote compute with zero user control is inevitable.

A Stadia-like experience for everything is what the overwhelming majority of consumers want -- they just don't know it yet.

pessimizer wrote at 2020-10-28 15:24:10:

Yet somehow every site became a single-page app rendered client-side. I'm absolutely certain that browsing consumes at least twice the power it did 10 years ago (and 5x the RAM.)

I visit a lot of the same pages I did 10 years ago, but now they take as long as they would have taken 20 years ago to render.

alexwennerberg wrote at 2020-10-28 15:26:37:

> Average users value performance, cost, and battery life. The cost efficiency and capabilities of cloud compute will always outpace that of consumer mobile devices. So remote compute with zero user control is inevitable.

I don't think this is a question of what users value, it's a question of what large software companies value. Non-technical users absolutely value their privacy and freedom even if they don't put it in explicitly ideological terms: they want to pirate copyrighted material, they are annoyed by paywalls and ads, they are creeped out by how much social networks know about them.

pjc50 wrote at 2020-10-28 14:40:51:

I'm more or less resigned to this being a continuous struggle with ongoing turnover of tech. The '00s solution for preventing people from interacting with your web page in unapproved ways was to do it all in Flash; that eventually died.

LargoLasskhyfv wrote at 2020-10-28 18:39:13:

Stunning. Your nickname is really fitting :)

gbh444g wrote at 2020-10-28 07:15:11:

"There is not much left for a Web browser to cover" - this reminded me of the famous "physics is mostly complete at this point."

Web (browser) is the only cross-platform operating system, even if some standards say otherwise. I'm sure we'll see docker-style virtual web environments, cross-machine abstractions to simulate datacenters for distributed computing (GPU and such), PWA-style webapps bootable from USB sticks and functioning as operating systems, various crazy extensions for the upcoming neuralink-style tech, virtual reality frameworks, holographic displays and so on and so forth.

Physics is never complete at any point.

bullen wrote at 2020-10-28 10:06:26:

Physics are not changing, it's our understanding of physics that changes, and yes we are very far from understanding them correctly.

The real questions are:

- For how long do we have energy to keep not understanding things.

- Are the understandings leading to real improvements or just more total energy consumption (for every energy reduction there is an equal or more increase in total energy consumption)?

I'm committed to changing to a Raspberry Pi 4 as my main computer, with Raspbian (now called Raspberry Pi OS -- is that an improvement, though?) and Chromium, before the year is over.

I wish new standards would consider all the effects before releasing, so we can stop reimplementing the wheel and actually build a bike at some point.

Reinventing the wheel for the browser has more or less stalled at this point, and it's sad because I'm all for that:

http://move.rupy.se/file/wheel.jpg

Software really worked ok back in the C64 days too; the only real improvement since then is 3D, and my position is that OpenGL (ES) 3 with VAO is good enough.

But you are welcome to use Vulkan, Metal and DirectX 12 if you want!

Igelau wrote at 2020-10-28 11:29:02:

> Physics are not changing, it's our understanding of physics that changes

No, it's right the way it is. Physics is a science. Physics _is_ the understanding. The universe is terrain and sciences are maps -- though some are more localized and specialized than others.

ImprobableTruth wrote at 2020-10-28 11:46:23:

>Physics is a science

'Physics' refers both to 'our knowledge/study of the physical' and 'the physical' in itself.

catmanjan wrote at 2020-10-28 07:26:51:

The fact that we're comparing a type of software to a school of science speaks volumes

jupp0r wrote at 2020-10-28 08:12:59:

Is it? You could make the same point about engineering, literature, the arts, sociology, ... you get the idea.

tannhaeuser wrote at 2020-10-28 08:38:34:

I'm not getting the idea. The argument, as I understand it, was that a browser should aspire to be a standardized and relatively simple app enabling a large variety of backends to deliver content, not a never-ending "social experiment" that ends up overwhelmed by a single party extracting almost all value off the web and becoming the only party with the resources to implement browsers.

jupp0r wrote at 2020-10-29 17:12:36:

The notion that the world wide web is "finished" at some point is just flawed. Today we do video conferencing, 3D gaming, biometric authentication, photo editing and all sorts of other things on the web. Browsers will evolve along the lines of users needs and developers innovation. If this stops at some point, they will become the new Gopher and something else will come along to replace them.

AnIdiotOnTheNet wrote at 2020-10-28 11:47:14:

> Web (browser) is the only cross-platform operating system, even if some standards say otherwise.

I'm really struggling to think of a combination of definitions for "cross-platform" and "operating system" that can make this statement true.

swiley wrote at 2020-10-28 12:29:48:

I think they meant language runtime/API (for network, graphics etc.) which is generally how developers see an operating system.

cm2187 wrote at 2020-10-28 12:45:36:

Rather the new POSIX perhaps.

iso1210 wrote at 2020-10-28 08:51:09:

> "There is not much left for a Web browser to cover" - this reminded me of the famous "physics is mostly complete at this point."

Reminded me of microsoft saying they would no longer develop IE6 because browsers were done (i.e. netscape had gone bust)

Izkata wrote at 2020-10-28 17:04:53:

> I'm sure we'll see docker-style virtual web environments, cross-machine abstractions to simulate datacenters for distributed computing (GPU and such)

Running Windows98 in javascript:

http://copy.sh/v86/?profile=windows98

marcus_holmes wrote at 2020-10-28 08:01:37:

I've seen the same cycle happen repeatedly in tech (there are many cycles, but this is one of them): systems are centralised on a server, then the client gets more powerful and features of the system are pushed to the client. This gets too much and the client slows down, so the features are centralised to a server again.

I can easily see a world where Chrome/ium becomes too bloated to work well, and a "new web" of thin, light, browsers with drastically reduced features but increased speed appears. They'll work with "fat web" servers but throw away half the features (so we'll end up designing two versions of each website).

Eventually most of the functionality of the old "fat web" will migrate across - obeying the 90/10 rule (the 10% of features that provide 90% of the value). With 90% of the value, plus increased speed, everyone will move to "thin web" browsers. Chrome/ium will be left where ActiveX is now - some non-bleeding-edge organisations will use it because it has features that didn't make it to the "thin web" and replacing it is a pain, but it's obsolete and no-one really uses it.

nwah1 wrote at 2020-10-28 15:18:17:

Google AMP and Facebook Instant Articles are doing this.

marcus_holmes wrote at 2020-10-28 15:31:52:

Yeah, you're right. I hoped it wouldn't come from the giants, but as a grassroots tech thing, but I guess we don't live in that world any more.

nwah1 wrote at 2020-10-28 17:03:57:

The attempts I've seen from the community seem intentionally hobbled. Gemini intentionally doesn't even support inline images.

madrox wrote at 2020-10-28 08:03:27:

Stadia is a great example of this

Tom4hawk wrote at 2020-10-28 11:49:15:

Stadia is just Google trying to centralize another thing. Their pricing is ridiculous. The group of people who are ready to pay a monthly fee, pay full price for games (and remember - no used copies [a big market on consoles] and no competing stores [PCs]), live close enough to the Stadia servers that latency is not a problem, and have a good internet connection is unbelievably small.

willseth wrote at 2020-10-28 15:29:55:

The author has very limited knowledge of the web, its history, or the W3C's ambitions (not to mention WHATWG, IETF, ECMA, etc.). His view of the web in general is bafflingly myopic, with an almost completely arbitrary understanding of the businesses discussed. It should come as no surprise that the conclusions he draws are non sequitur.

So why has this been upvoted? Why is this on HN at all? It's ignorant nonsense. This should be obvious to anyone who's spent more than a few hours researching these topics. Please stop taking know-nothing randos seriously.

ritchiea wrote at 2020-10-28 15:49:28:

Lots of users vote based on the headline before actually reading the article. Lots of users even comment without reading the article. HN is an extremely large community these days.

billyjobob wrote at 2020-10-28 15:43:46:

It does say "As someone who has used the Web since 2007" so I guess it's written by a child? So probably we shouldn't be too harsh judging them.

coldtea wrote at 2020-10-28 12:45:09:

Well, web browsers once meant hypertext and browsing, that is reading and navigating hypertext. That was complete (or close) 10-15 years ago, in the sense that they could render most things print magazines could, with good-enough html/css APIs.

But that's not what people (including companies making SaaS, which are also stakeholders here), want from the web.

They want an app delivery mechanism mediated through a web browser.

So web browsers will be complete the same time when OS-level frameworks, UIs, and libs are complete, so when computing is "complete".

That is, never.

err4nt wrote at 2020-10-28 07:09:18:

W3C does not publish all web standards, notably it doesn't publish the three biggest technologies used in web browsers:

- HTML is worked on by WHATWG

- CSS is worked on by CSSWG

- JavaScript is worked on by TC39

And IE wasn't killed by features moving too fast, Microsoft made Edge browser and still does. Mozilla hasn't been killed by Google or competition; it's been kept alive in large part by Google (I think in some years as much as 90% or more of its total income came from Google), so they get hundreds of millions -- the problem at Mozilla is bad management for the last decade.

I'm not sure how well this vision represents reality as it already is today.

gsnedders wrote at 2020-10-28 07:14:40:

> - CSS is worked on by CSSWG

The CSS WG is a W3C WG, and they're very much published as W3C specifications.

err4nt wrote at 2020-10-28 07:25:49:

I just don't get the impression OP is even aware there are groups in existence working on web standards apart from W3C.

detaro wrote at 2020-10-28 07:36:56:

Generally I think seeing W3C as "a group" is not a great (but common) perspective. While it provides some process framework, which certainly impacts how things happen, and not just in a good way, in the end it comes down to the working groups doing the actual work, and the people and companies active there.

coldtea wrote at 2020-10-28 12:36:49:

>_And IE wasn't killed by features moving too fast, Microsoft made Edge browser and still does._

No, they don't. Edge nowadays is just a Blink shell.

slightwinder wrote at 2020-10-28 14:24:19:

And there we have another loop. From IE Shell to Blink/Chrome Shell. Who knows which dominating engine we will have in another 20 years.

why_only_15 wrote at 2020-10-28 07:13:44:

Microsoft makes a browser but the point people are typically making here is that Edge is just a reskin of Chromium.

err4nt wrote at 2020-10-28 07:24:24:

Between IE and the current Chromium-based Edge browser, was born, lived, and died another browser engine named EdgeHTML.

IE was 2 browser families ago from Microsoft.

Gaelan wrote at 2020-10-28 09:52:55:

I thought EdgeHTML was just Trident with some legacy code ripped out.

Edit: admittedly, "just" is probably underselling it a bit, but it's still a far cry from a completely new engine.

gsnedders wrote at 2020-10-29 14:57:51:

It was very much a fork of Trident, yes. I believe there was significant refactoring done after the legacy code was ripped out.

coldtea wrote at 2020-10-28 12:39:14:

Yes, but that was killed because Chrome moves too fast, and maintaining EdgeHTML wasn't easy for MS.

Parent said: IE wasn't killed because the web moves too fast -- and used the presence of Edge (or EdgeHTML) as the argument.

But both IE and EdgeHTML were killed because Chrome moved too fast for it to be worth it for MS to maintain its own engine...

modo_mario wrote at 2020-10-28 14:57:53:

>the problem at Mozilla is bad management for the last decade.

It sure is a problem, but I'd say it's false that Google's market dominance isn't an issue. Remember when H264 became the standard just because Google decided it, after what must've felt like a fake-out to Mozilla? Mozilla then had to start working on a non-proprietary implementation of what Google already had ready.

This was back when people consistently complained about Firefox having fallen behind because a single YouTube tab could make the entire browser lag, and when Chrom(ium) wasn't as dominant as it is today.

tannhaeuser wrote at 2020-10-28 07:26:28:

_there is not much more distance to cover [relative to native toolkits]_

As I see it, the web as originally conceived was "ready" around 2003, and then again around 2009 when CSS media queries became mainstream to adapt to mobile usage. For me, the excessive push for browser APIs as a desktop replacement since then, and the proliferation of frameworks is a generational thing, driven not by technology but by big data, SaaS (prospect of extracting recurring revenue from customers, rather than one-time license fees), and a new generation of developers getting into the game by leaving the status quo behind, solely for their own economical benefit. Google has masterfully played this game, and let the idea of open web standards bend the minds of developers who thought they were clever. Moz partnering with them under WHATWG seemed like a good idea at the time, but just made the concept of a web standard arbitrary and "whatever Chrome does" over time, leading to Moz's demise and that of every other "browser vendor".

It'll take some stance and political will to reclaim the web.

cageface wrote at 2020-10-28 07:39:48:

The alternative is an internet completely locked down in native apps in proprietary and unaccountable app stores. I'll take a bloated web browser over a corporate monoculture any day.

adrianN wrote at 2020-10-28 08:10:12:

I don't see a freedom or accountability difference between native apps from an appstore and webapps from a webserver.

cageface wrote at 2020-10-28 08:46:14:

It's very simple. One requires the good graces of an unaccountable and opaque corporate review process. The other requires nothing more than a simple webserver using open standards hosted anywhere in the world.

tannhaeuser wrote at 2020-10-28 09:47:39:

Hosted at an IP that can be blocked, over a protocol (HTTP/3, QUIC) so complex and opaque that almost no F/OSS implementation exists, and takes away transparency over regular DNS resolution, that is.

xorcist wrote at 2020-10-28 10:04:07:

The play around control of DNS infrastructure is an interesting one, and it will be exciting to see how that plays out over the coming decade.

Should a trifecta of corporates control resolution, it would only be natural for alternative roots to emerge. Over a certain market-penetration threshold this could form a perfect moat. Everything that made AlterNIC and its ilk fail is irrelevant now.

tannhaeuser wrote at 2020-10-28 07:52:04:

Yes, the universal portability argument and the fear of closed ecosystems are the carrot put in front of a generation who experienced the MS monoculture and the dangers of app stores unwelcoming to GPL and other F/OSS licensing schemes. The consequence is the web becoming a closed network itself, like the AOL and CompuServe it was intended to replace. It's a selfish, narrow-minded motive of nerds putting their stuff into the browser using their preferred language and uniform platform/toolchain, without benefit for end-users.

cageface wrote at 2020-10-28 08:40:03:

What benefits end users can change very quickly. Take Apple yanking HKMaps as an example. Dictatorships have always had some advantages over messy democracies.

konjin wrote at 2020-10-28 07:47:47:

Sorry, which is which? Because right now the Chrome engine has more control of the web than IE did in its heyday, when Microsoft was ruled to be a monopoly.

modeless wrote at 2020-10-28 08:00:49:

Sorry, that's just wrong. Non-IE browsers had less than 5% market share at IE's peak. Non-Chrome browsers have more than _6 times_ that (~30%). And web standards are developed much more collaboratively these days, with less unilateral action.

cageface wrote at 2020-10-28 08:42:54:

The current situation isn't ideal but I can still build apps and reach users without the permission of any company's review board. There's a big difference between controlling technology and controlling content.

vntok wrote at 2020-10-28 07:51:01:

Are you aware that Blink, Webkit and V8 are all open source?

pjmlp wrote at 2020-10-28 08:10:11:

With most people working on Apple, Microsoft and Google salaries, good luck getting a bunch of random weekend coders to pick it up and mount a counterattack with an alternative browser based on that open source.

MaxBarraclough wrote at 2020-10-28 10:57:01:

Doesn't this depend how high we set the bar?

The _ungoogled-chromium_ project [0] presumably doesn't take all that much manpower as it's just taking Chromium and paring it down a little. A true alternative browser codebase like Firefox, with its own rendering engine and JavaScript engine, takes a great deal of effort, of course.

[0]

https://github.com/Eloston/ungoogled-chromium

pjmlp wrote at 2020-10-28 12:20:31:

Depends, check the recent stories about X.org being abandonware, yet it is open source.

bloody-crow wrote at 2020-10-28 07:48:27:

Quick comment on the presentation. This post could very easily be markdown and therefore rendered in a nice readable non-monospaced font. Instead it's a .txt and it's really painful to read.

Edit: here's a rendered markdown conversion that is easier on the eyes:

https://shrib.com/?v=nc#CapeStarling-6v1L9J

tleb_ wrote at 2020-10-28 09:33:45:

How is that easier to read?

https://wtf.roflcopter.fr/pics/SpRqNBvB/0QUVpbN4.png

bloody-crow wrote at 2020-10-28 17:04:58:

Ugh, it looked better on my screen.

eCa wrote at 2020-10-28 13:20:38:

I recommend the raw rendering, followed by the browser’s reader mode.

But I agree in principle.

kristopolous wrote at 2020-10-28 08:07:29:

The HTML/web model is still fundamentally incomplete. Links are one way, things don't flow based on concept but based on this clunky chunk called a page. We still only have one functional semantic for cross linking the clunky chunks called the hyperlink.

The only way we have to find things is to use companies that have managed to visit and index all the sites, not through some search and index interface but by processing the page that the human sees and then trying to rank them, making the web ultimately heavily centralized.

Then there is the authorship meta layer that doesn't exist to interleave content from the same author, say by email or some other identifier. Instead we've centralized that into temporal structures that vanish when you look back, like in a dream in a "let 1000 flowers bloom" method where all you ever end up with are just a bunch of silly flowers.

Then there is the permanence problem: the internet is like a river that's always moving. Someone else has to rely on generosity to, yet again, centralize the historicals so that things can hopefully exist at least as long as bad ink on terrible paper. Otherwise a couple of generations of thought will end up in metal garbage heaps.

I could have written all this, word for word, in 1993 -- but instead it's 2020.

Honestly, I'd say 70% of the fundamental problems are still unsolved nearly 30 years later.

xorcist wrote at 2020-10-28 10:08:41:

Incomplete compared to what?

There are no hard dependencies between web sites and links fail all the time. The web didn't succeed in spite of this, but because of this.

The decision to use one-directional links was a good design choice in retrospect. It has also kept the web agile enough to keep evolving.

slightwinder wrote at 2020-10-28 14:32:32:

> Links are one way

Is it even possible to change this? We did have (or still have?) backlinks in the blogging world (trackback/pingback), but that is an active mechanism which simply does not scale well for physical reasons.
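For reference, Pingback was just a tiny XML-RPC call: the linking site POSTs its own URL plus the linked URL to the target's advertised endpoint, which is why every backlink costs the target active work. A sketch of the payload (URLs hypothetical; nothing is actually sent here):

```shell
# Build the XML-RPC body a Pingback client would POST to the target
# site's X-Pingback endpoint (Pingback 1.0 order: source URI, then target URI)
cat > pingback.xml <<'EOF'
<?xml version="1.0"?>
<methodCall>
  <methodName>pingback.ping</methodName>
  <params>
    <param><value><string>https://example.com/my-post</string></value></param>
    <param><value><string>https://example.net/their-post</string></value></param>
  </params>
</methodCall>
EOF

# Sending it would look like (not executed here):
#   curl -H 'Content-Type: text/xml' -d @pingback.xml https://example.net/xmlrpc.php
grep -c '<param>' pingback.xml   # prints 2: one param per URI
```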

> things don't flow based on concept but based on this clunky chunk called a page

The base is a document, which describes the content and interface. "Page" is just the naming for some. Some others are called "App". What else is there?

> We still only have one functional semantic for cross linking the clunky chunks called the hyperlink.

We do have other formats than HTML. HTML had different approaches to include more stuff. They did not work, for reasons. At the end all dreams shatter on reality.

tannhaeuser wrote at 2020-10-28 08:28:43:

Regarding two-way links etc., XLink/XInclude, based on SGML and HyTime concepts, was proposed by Sun back then in 1998 or so. Thankfully, that didn't take off.

qppo wrote at 2020-10-28 07:45:54:

I can't read this plaintext in a mobile browser window so clearly there's some work left to do. Or maybe they've done too much work. Who knows.

kevsim wrote at 2020-10-28 09:43:08:

Reader mode in safari worked OKish for me.

amadeuspagel wrote at 2020-10-28 11:26:24:

This focus on chrome os gets it completely backwards. Google doesn't expand chrome's capabilities because they need it for chrome OS; google invests in chrome and chrome OS because it supports their search engine business. Google wants anything you might want to do with your computer to start by searching google, not some kind of app store. See:

https://www.gwern.net/Complement

ksec wrote at 2020-10-28 07:30:49:

Let's talk about Web _Page_ for a minute, or Web _Page_ with some _interactivity_ and _animations_.

Is there any Web Page, rendered in a Web Browser -- both the page itself and the standard itself -- that gives the smoothness of, say, opening a PDF, or even better, a Native App? Does the Apple Store Web Page for selling the iPhone feel as smooth as their Apple Store App on iOS?

Jank (micro-pauses) is still everywhere, in both the browser and the web page itself. And this isn't about web apps, just simple web _pages_.

I say web browsers (and the web standards) are far from complete.

the_gipsy wrote at 2020-10-28 07:39:50:

I don't remember the app store app being smooth.

I do agree about the "janks".

Existenceblinks wrote at 2020-10-28 15:46:10:

I have a completely unlovely thought: the web browser needs a rewrite with thin web specs (imagine the "JavaScript: The Good Parts" book meme). People think it's impossible to rewrite, but I believe that's only because it's a mess. Inventing a new programming language and a new DOM API that works well with WebAssembly is probably worth it, considering that for the next 10 years the web is still not going anywhere. You may laugh at Flash... but hey, do you love the "hydration" thing? I'd love to have an isomorphic (terrible name) template engine that renders as text as well as a client program binary. Do you love OS-based form controls that can't be styled? I'd love controls completely separate from the OS, with a clean contract between the widget and the actual system API. Just stop caring about (coercing) how it looks; just provide an API to enforce how its functionality works. We already have this bunch of aria-* things.
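A toy illustration of the "isomorphic template" idea (all names here are hypothetical, not any real framework's API): one widget description, consumed by more than one renderer, so a server can emit text while a client runtime could interpret the same structure as a live program:

```python
# Toy sketch of an "isomorphic" widget description: the same data structure
# can be rendered to HTML text on the server, or interpreted by a client
# runtime. All names are illustrative; this is not a real framework API.
def button(label, opens):
    """Describe a button once, independently of any renderer."""
    return {"widget": "button", "label": label, "opens": opens}

def render_html(node):
    # Server-side renderer: description -> markup text.
    return f'<button data-opens="{node["opens"]}">{node["label"]}</button>'

tree = button("Open settings", "settings-window")
html = render_html(tree)
print(html)  # <button data-opens="settings-window">Open settings</button>
```

The point is that the description, not the markup, is the source of truth, so "hydration" (re-deriving client state from server-rendered markup) becomes unnecessary.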

The question is what blocks the progress of making it complete; what kind of problem is it? I believe it's not a technical problem. Maybe incentives?

acoard wrote at 2020-10-28 15:57:57:

It's both technical and incentive.

The problem is that so much of technology is standing on the shoulders of giants. If we rewrite, how much of the giants do we decide to lop off?

For example, could you get Google, Apple, Mozilla, to all agree where to draw the line? Should we still use HTTP? Should everything use QUIC? An entirely new standard? What about DNS? But let's say they do agree.

Then what, they create entirely new browsers (I'll call Chrome2, Safari2, etc). Maybe they don't even use the DOM or HTML at all, but an entirely new setup. A side note – if the web was re-built today it probably wouldn't have "View Source." As you said, I imagine it'd lean into WASM and "isomorphic" code, which could just be WebGL/canvas/some new paradigm/etc.

So then what... everyone says abandon my original website and download a new browser to access the new one? Because that's what throwing out backwards compatibility looks like. And that seems like a non-starter to most people.

Browsers are too big to fail; too big to be thrown out and re-written. Our hope is incremental, not revolutionary. Things like WASM in browsers let us pursue new paradigms while also being backwards compatible.

Lastly, I'd say the incentives for a new web would be worse than those of the old web. The original web was written by tinkerers and tech startups like Netscape/Mozilla. The new web would be written by multibillion-dollar companies who like moats and know how to build them. It would have microtransactions (not necessarily bad by itself), and I worry it would look more like an app store.

Jyaif wrote at 2020-10-28 21:23:21:

> web browser needs a rewrite with thin web specs

Flutter was started by Chrome engineers asking the question "what would the web look like if we could strip out all the things that makes it slow without worrying about breaking compatibility?"

nikitaga wrote at 2020-10-28 07:44:40:

Chrome OS being open source is not a threat to Google for several reasons:

The usefulness of the browser is that it's a cross-platform platform. You can run the same web apps on Windows, Mac, Linux, even mobile devices.

Turning the browser itself into an OS-level platform will just create another platform that will need to compete with all existing platforms while having nothing unique to offer – after all, the same web apps run just fine on existing platforms, by their very design.

What does ChromeOS have that MacOS or Linux do not have and can't have?

You could argue that someone could build a competitor to iOS and Android using ChromeOS, but why wouldn't the same someone build a competitor using... Android? It's open source too.

Because if they took the Android source without Google's blessing, they would be left without Google Maps, Docs, and Gmail. But Google can just as easily exclude mobile browsers from accessing its web services. In today's world it would be perfectly fine for Google to only offer its services through native apps _on mobile_, and it's up to them which platforms they choose to publish those apps on.

Oh and then there's Widevine. You need Google's blessing to decrypt that, and even if you get it, every content provider that uses Widevine DRM can block access to their content from your browser for any reason (you can't bypass that).

How far will your rebel browser platform go without being able to play e.g. youtube? Currently they only DRM certain types of videos, but what would stop them from flipping the switch and applying DRM to every video that is monetized on youtube, and blocking your browser on the basis that it allows adblockers that effectively prevent funding of those videos?

The notion that Google is somehow – anyhow – in a weak position is utterly ridiculous. Their moat is stronger than ever.

pjmlp wrote at 2020-10-28 08:12:11:

With the Chrome monoculture, the web is effectively ChromeOS nowadays, especially with the increasing number of APIs that are Chrome-only. Other browsers refuse to implement them, but that really doesn't matter, as their market share keeps decreasing towards zero.

hesarenu wrote at 2020-10-28 08:46:36:

If Chrome advances, it does not matter. The lowest common denominator would be Safari, specifically iOS Safari. Until it catches up, we have to code to its standard.

pjmlp wrote at 2020-10-28 08:47:38:

Safari only matters in 10% of the world. Plenty of countries don't need to worry about customers using Safari, as no one there uses any kind of Apple product, and those companies have more than enough to do with their local markets.

hesarenu wrote at 2020-10-28 10:09:41:

10% is still a big, big number. You can't ignore it. When a website fails on someone's Safari, you can't just say we did not consider anyone using Safari or an iPhone. You always try to make it work on most browsers.

pjmlp wrote at 2020-10-28 10:11:41:

Sure we can. As mentioned, there are hundreds of countries with zero presence of Apple hardware, and not everyone is playing Amazon, selling to the whole world.

The local market is more than enough to keep one busy and profitable.

See how many companies in Africa, South America, or Eastern European countries actually care to buy Apple hardware to test Safari.

names_are_hard wrote at 2020-10-28 10:57:06:

Nit: There aren't hundreds of countries that have zero presence of Apple hardware, because there aren't hundreds of countries in the world. Depending on how you count, there are about 195 countries in the world today. I think you need at least 200 to say "hundreds", and even then it's a stretch.

pjmlp wrote at 2020-10-28 12:19:43:

Thanks, but that is a figure of speech; it doesn't necessarily mean a numeric value. :)

https://www.collinsdictionary.com/dictionary/english/hundred

> If you refer to hundreds of things or people, you are emphasizing that there are very many of them.

hesarenu wrote at 2020-10-28 12:10:49:

And there are hundreds of users with Apple hardware, even one of which can cause an issue. It might be OK for a corporation to dictate browsers, but others can't ignore any percentage of browsers. I see many comments about how something works only on Chrome; by that logic, we should ignore the rest, since Chrome is the majority, so why optimize for others? See, we can't even optimize just for Chrome, since we have Safari, which is missing some functionality.

pjmlp wrote at 2020-10-28 12:24:35:

Developers that keep pushing for Chrome are the ones to blame to start with.

Happy Firefox user here since Netscape Navigator days.

hesarenu wrote at 2020-10-28 12:56:15:

On mobile, the blame goes to Apple for not allowing other engines. Otherwise, you could have used Firefox on iOS devices as well.

pjmlp wrote at 2020-10-28 14:35:14:

That is the only thing that keeps Safari relevant at all, instead of Chrome taking over entirely.

I am a happy Firefox user, but I don't fool myself: without Safari, Web === ChromeOS.

hesarenu wrote at 2020-10-28 16:04:53:

Then on mobile it would be Safari. And mobile would be the future growth.

pjmlp wrote at 2020-10-28 16:40:47:

Yep, if we overlook that in many countries there are no iPhones to start with. So again, it doesn't matter for companies selling 100% to local markets.

drdec wrote at 2020-10-28 14:42:10:

> Chrome OS being open source is not a threat to Google

Agreed. I always thought that Chrome and ChromeOS were created so that Google could ensure that a platform always existed for their real products (which are offered over the web).

pmontra wrote at 2020-10-28 12:49:10:

Having used the web for twice as many years as the author, I feel optimistic about betting that web browsers will be complete only when nobody uses them anymore. We're very good at inventing new stuff, and OSes and browsers have to keep up.

akrymski wrote at 2020-10-28 19:45:47:

Alternative end game: the browser dies slowly, gradually being replaced by native apps.

We're already seeing this: consumers prefer native apps on their phones for virtually everything: shopping, socialising, reading news.

Surely the future us will laugh at web browsers as some ancient tech that used these weird markup languages to lay out ad-monetised content into pages that you'd navigate back and forth to emulate an application?

New web browser engines are now impossible to implement due to the sheer size of specs. Surely we are nearing the end game?

sfink wrote at 2020-10-30 15:30:39:

Consumers prefer native navigation (going straight to an app, not home -> browser -> app), even for apps that in practice are native wrappers around web views.

mdip wrote at 2020-10-28 15:22:57:

... used the Web since 2007

Am I really that old?

Something that was stated, but not directly, is what's driven a lot of the latest additions to the browser: Cross-platform development.

In the past, the operating system provided the abstractions over various hardware implementations, which meant being able to (possibly) run your software, if recompiled, on different hardware platforms (in reality, some OSes weren't available except on a single platform, and others were frequently not straightforward to get compiling on different platforms).

The browser is providing abstractions to the same hardware around various OS implementations. Add in WebAssembly, and you can pick from a number of languages -- including those that were traditionally used to write apps not designed to run in browsers.

But to the quote: I was cleaning out my basement, which quickly became a "treasure hunt" of sorts since I've been storing things that I took from my parents' house when I moved out.

My favorite find was a CD-ROM that was basically a CD-based search engine of the Internet. It was "Designed for Windows 95" and was probably a free disc given to me by a vendor rep when I worked at CompUSA in my teens. I'm tempted to see how many of these sites still exist (I'm sure _none_ of the deep links do). Incredibly, these things probably sold very well back then. It even came with a browser (neither Mosaic nor Netscape; no clue whose it was).

mschuetz wrote at 2020-10-28 07:35:40:

Web browsers are just getting started, in my opinion. Browsers already are the most effective way to share content with a massive audience, and WebGPU and WebXR are going to lead to a whole lot of new types of content.

moritzwarhier wrote at 2020-10-28 17:26:55:

I hope this doesn't classify as trolling;

But isn't it ironic that the author uses a GitHub gist to share this plain text essay?

Had to click "View Raw" to read it on mobile.

I know the essay is about more topics than just documents vs. apps. Still, that shows me that "the principle of least power" doesn't easily apply to the technology choices of web users.

GitHub was probably more convenient for the author than uploading a txt file somewhere.

crazygringo wrote at 2020-10-28 13:17:22:

_As we move forward, eventually, there will be nothing left to propose for the Open Web Platform. We will reach feature-parity with our local Operating Systems._

But operating systems continue to innovate/improve/change at roughly the same rate as browsers these days, no?

So it seems like a total logical fallacy that the browser would ever be complete. People always come up with new ideas and better solutions for things users need, new hardware arises that needs software support, and so on.

fallat wrote at 2020-10-28 07:48:39:

OP here (about to sleep): this was just some 2am ramblings that I thought could provoke some interesting conversation. Already there are some great conversations. Night!

smitty1e wrote at 2020-10-28 10:05:22:

This seemed a key point:

    We can logically follow this to a future of a "split Web". There's going to be the "Google Era" Web that will live for a long time, and another iteration of the Web where us technologists will try our best to create something more to our liking.

All human endeavors tend toward Towers of Babel. Alternates shall arise and topple Google.

dmitriid wrote at 2020-10-28 07:37:31:

The main problem with the current state of web browsers is that it's now impossible for a third party to develop a competing browser from scratch (or nearly from scratch). The already-existing feature set is unimaginably huge, _and_ it's a moving target, as browsers (especially Chrome) keep throwing more and more features into it.

tannhaeuser wrote at 2020-10-28 08:53:41:

And WHATWG (and the CSS WG) rejects the idea of HTML profiles (defining reasonable subsets) and even versions, or any forward compatibility for that matter, with supporters standing by and cheering.

anonymou2 wrote at 2020-10-28 10:20:02:

That's why Project Gemini was created:

https://gemini.circumlunar.space/

lazyjones wrote at 2020-10-28 08:27:39:

That's on purpose, of course. And everyone who suffers from it, willingly tolerates it.

tabtab wrote at 2020-10-28 15:52:17:

I believe the argument is that Chrome's code base is becoming the de facto browser standard, which will allegedly simplify things by making a stable platform. But it's been hard to predict browser trends. Look how Chrome came out of nowhere and knocked Microsoft off the top. Netscape also got sucker-punched. Who knows what will land the next punch.

Web standards still have some glaring gaps that I'm not sure the current approach can tame. I propose "the web" be split into three standards:

1. Media/Games/Art/Video

2. Documents (existing HTML may be good enough with minor tweaks)

3. Productivity: CRUD and Data oriented interactive GUI

By splitting, each standard can FOCUS on doing what it does best rather than the watered-down, one-size-fits-all approach we have now. A given browser may support all three, somewhat like how Java applets and Flash could run inside HTML documents. (They may be plug-ins and/or independent browsers.)

But unlike applets and Flash, the standards wouldn't try to be full virtual OSes. Dump as much on the server as you can to keep the client standards lean, and reduce the need for scripting by including common behaviors.

For example, a common GUI idiom is for a button to activate a window or form. Rather than rely on scripting, that action could be built into the GUI markup:

    <button openTarget="myWindow">
        Click to open My Window
    </button>

Or maybe:

    <button label="Click to open My Window et al.">
        <action>
            <setFocus target="myWindow"/>
            <setFocus target="anotherWindow" modal="false"/>
        </action>
    </button>

The point is that if common GUI idioms and behaviors are built into the GUI markup standard, scripting (typically JavaScript) is not needed as often. The GUI Markup Language would be state-friendly and natively interactive, unlike HTML.
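As a rough sketch of that point (the element and attribute names are the hypothetical ones from the markup above, and Python merely stands in for whatever language the client would be written in): a client that understands a built-in `openTarget` action can run the behavior with no page-supplied script at all.

```python
# Sketch: a client with built-in GUI behaviors needs no page scripting for
# common idioms. <button openTarget="..."> is the hypothetical markup above.
import xml.etree.ElementTree as ET

markup = '<button openTarget="myWindow">Click to open My Window</button>'

def handle_click(element, windows):
    # Built-in behavior: activating a button focuses its declared target
    # window, as specified declaratively in the markup.
    target = element.get("openTarget")
    if target is not None:
        windows[target] = "focused"

windows = {"myWindow": "hidden"}
handle_click(ET.fromstring(markup), windows)
print(windows["myWindow"])  # focused
```

The behavior lives in the client once, instead of being re-implemented in JavaScript on every page.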

benjaminjosephw wrote at 2020-10-28 07:28:33:

I think the disruptive innovation for the mono-culture in browser development will be web platforms that enable meaningful and broadly accessible end-user programming. The established players have no incentive to pursue this objective (since it undermines central ownership of data/experience) but, if executed well, could be an extremely valuable proposition for users. Google's current trajectory won't take us there but maybe others can.

zokier wrote at 2020-10-28 11:33:46:

Personally I feel browsers got feature-complete already, somewhere around 2011. Firefox 4 was a great release and did pretty much everything I would want from a browser. The last really important spec feature, I feel, was the video element, which was added around that time. It got rid of the need for crappy plugins.

Honestly I'm not sure what I'd miss if I had to go back to Fx4.

BlueTemplar wrote at 2020-10-28 14:36:50:

Yeah, IMHO browsers got worse in recent years: first with Opera 12 being killed off, then Firefox losing most of the power of its features/plugins... (I still can't find anything nearly as good as Tab Groups!)

c-smile wrote at 2020-10-28 14:14:06:

Why plural? Probably "the web browser" is the more correct term these days, no?

The idea of "browser as OS" or "OS as browser" is not that new. That move was started by the "Active Desktop" concept on Windows 95 ( https://en.wikipedia.org/wiki/Active_Desktop ).

And speaking about the browser... Its main goal is to serve the following tasks:

1. To provide safe browsing experience.

2. To present content to the user.

Note that the order of these two tasks matters.

The initial Active Desktop effort died because the order of these two tasks was wrong in that first attempt.

From that point of view, "browser as OS" just increases attack surface. And so more and more isolation layers will be added into browsers making them less effective and performant than alternatives.

Consider WebAssembly, for example. That was a desperate move to provide an option to run computation-heavy code inside browsers at a speed at least comparable to native code. But it will always be slower than native solutions. Java vs. native code: we've been there, seen that.

To be short: a native OS and native solutions will always be more performant than ones _emulated_ in the browser. Just because of the order of tasks.

In this sense the browser will never be complete. It may become asymptotically close to the OS but will never reach that point.

And note the price: along the way, it will become asymptotically close to the size of an OS.

emteycz wrote at 2020-10-28 15:08:18:

Soon CPUs and GPUs will be so fast, so efficient, and so cheap that it won't matter.

c-smile wrote at 2020-10-28 17:33:28:

That's a never-ending story.

As soon as you have a better GPU, you will have better native games that use the hardware to its maximum extent. They will work orders of magnitude better than what a WebGL emulation layer can offer. So high-end games will always be native.

ppf wrote at 2020-10-28 09:02:43:

When Google itself cannot keep up with the moving target it has created, then it will be complete. I call it the Singularity.

smrtinsert wrote at 2020-10-28 13:58:42:

As soon as operating systems are!

tutfbhuf wrote at 2020-10-28 07:28:23:

Never.

shams93 wrote at 2020-10-28 14:24:09:

Well, for Safari the answer is never. Apple has said no to any browser features that compete with their native APIs: no Web MIDI, no WebUSB, etc...

shaicoleman wrote at 2020-10-28 14:55:13:

Indeed, all these features are marked as "Not Considering"

https://webkit.org/status/

ForHackernews wrote at 2020-10-28 13:04:35:

As soon as Google has finished killing Mozilla and Safari?

api wrote at 2020-10-28 12:31:33:

Web browsers won't ever be complete. They will bloat until the standard is abandoned and something leaner replaces it, like everything else.

makach wrote at 2020-10-28 07:36:32:

It is only when they are no longer needed that you can state that they have been completed.

There will be no web browsers in the future!

What comes after web browsers? Only exciting times...!

peterwwillis wrote at 2020-10-28 11:01:24:

The browser becomes the OS and then that OS will get a browser.

Turtles.

Hurtak wrote at 2020-10-28 07:53:47:

"When will operating systems be complete?"

"When will car technology be complete?"

...

The answer is that they won't be in the short term, and they will be made obsolete in the long term.

yodelshady wrote at 2020-10-28 09:46:56:

Cars don't suffer feature creep. We've optimised them - removed the horse, switched to steel construction, added an electrical subsystem - but really there's been no major functional change in cars, ever.

TeMPOraL wrote at 2020-10-28 13:36:37:

...added electrical subsystem, replaced combustion engine with batteries and electric motors, we're adding various degree of autonomy.

In parallel, we've replaced sane controls with frustrating and dangerous touchscreens, we've added countless types of comfort features, we've improved security, we've computerized engine controls, we've DRMed the car - ostensibly to prevent theft, but actually to route more money from aftermarket towards the manufacturers. Etc.

Cars do suffer feature creep, and not all of the features are beneficial to their owners. Just like with the web.

dwheeler wrote at 2020-10-28 12:10:29:

Um, what?

Our expectations about cars have definitely changed over time. I don't think it's even legal to sell new cars without seatbelts, something that did not exist when cars started. Even low end cars in the US have added features compared to low end cars of the past, such as power windows, power steering, power mirrors, and so on.

It's true that adding features is slower in cars, but that is because cars are partly hardware. It takes much longer to add features to hardware, because making copies is not free. But that does not mean cars are unchanged.

rusticpenn wrote at 2020-10-28 12:24:32:

I am not sure whether you are being sarcastic.

rado wrote at 2020-10-28 07:23:20:

Wish they were stable first. There is undefined and buggy behavior, e.g. sub-pixel scrolling, scroll container padding, etc.

onion2k wrote at 2020-10-28 07:37:33:

_We will reach feature-parity with our local Operating Systems._

The author states this in a way that makes it sound like it's possible, but that betrays an assumption that operating systems aren't moving forwards and innovating as well. The web will always lag behind the new features that come to OSs, and therefore browsers will always need to be changing in order to give web users access to those features.

enos_feedler wrote at 2020-10-28 07:50:15:

Not only are operating systems moving forward in their own direction, they are picking up the missing pieces that were traditionally reserved for the web. The two most important are URLs for deep linking between apps (universal links) and installing/fetching without a redirect to a download page (App Clips).

At some point we will cut out the store-publishing step for certain classes of apps running in safe runtimes (Wasm with some native bindings and safe data management).

Then what is the difference? You might say compatibility across platforms. But compatibility really is broken in browsers today. So....

BlueTemplar wrote at 2020-10-28 14:46:40:

One aspect is how IPv4's quirks (and Google's domination plans) have pushed the Internet to move more and more things over HTTP(S).

IPv6 will hopefully change that, and native programs will get another reprieve from WebApps.

(I also have to point out that WebApps still mostly suck - and are they even used yet on mobile/wearables/VR ?)

amelius wrote at 2020-10-28 09:28:12:

They will be complete when we realize that they can be better implemented as generic virtual machines.

nottorp wrote at 2020-10-28 14:53:17:

When it no longer makes business sense to embrace and extend.

fallat wrote at 2020-10-28 14:54:30:

I will take the replies and create a response post tonight.

anymouse2 wrote at 2020-10-28 08:52:25:

When Microsoft insists everyone is standard.

boltefnovor wrote at 2020-10-28 07:35:12:

Internet Explorer is finished.

anymouse2 wrote at 2020-10-28 08:50:56:

When Microsoft insists everybody is up to standard.

Ericson2314 wrote at 2020-10-28 14:12:46:

No professionally-written software is ever complete, thanks to Conway's law and the need to have a job under capitalism.

Simple as that.

02020202 wrote at 2020-10-28 11:28:09:

i am still baffled that we haven't moved away from this old HTML thing and made new protocols and new ways to draw a UI over the internet already. The HTTP protocol and HTML markup language were never designed to do what we're doing with them today. and don't get me started on JavaScript. i guess backwards compatibility is more important than solid architecture and advancing the technology.