Over-powered browsers
---------------------

(preface: this entry took a few painful days to write, and I'm not terribly happy with it, but I've uploaded it anyway to keep the phlog alive. This is exactly the kind of long, rambling thing I've tried to write in the past and given up on. It's precisely the thing this phlog is *not* supposed to be, but alas. Read on if you dare)

In the beginning, web browsers did one thing and (presumably) did it rather well. They rendered HTML, in some primordial form, which is to say that they laid text and, very occasionally, images out on the user's screen.

(aside: I suppose I'm basically assuming what browsers were like in the beginning. As best as I can recollect, I first used a web browser in 1995 or 1996. I was 10 or 11 years old at the time, and I don't know which browser I actually used. I do know that later in the 90s I was squarely a Netscape Navigator or Communicator user, but My First Browser eludes me. Which is frustrating, because it's possible I actually used the infamous Mosaic, but I'll never know)

Nowadays web browsers are basically sandboxed program execution environments for running Javascript programs or doing various things which HTML 5 provides for, like streaming video, persisting data to the hard drive, etc.

At this point you are presumably expecting me to burst forth into angry insistence that the way it used to be is the One True Way and that the current state of affairs is The Worst Thing Ever. And if you have a very good memory, you probably recall that in an earlier phlog entry ("Web apps are houses built on sand") I said some unequivocally positive things about web apps (i.e. Javascript programs) providing a cross-platform development environment, and are wondering if this is the entry in which I reveal myself to be a hypocrite.

Well, not quite. Web apps *are* nice in a lot of ways, for many applications. My primary beef is not so much that the computational capacity of the browser as a platform has increased, but that this capacity is used indiscriminately and irresponsibly.

Let's go back to our hypothetical (probably not too wrong) Primordial Browser that renders HTML 1.0 and puts text and graphics on a page with relatively minimal flair, and let's assume that it's running on a modern high speed internet connection. This browser has a lot of nice properties, but perhaps what it boils down to is a complete lack of surprise. When you click on a link in this browser, you can fairly well assume that:

* Your CPU usage will increase a little, but not too much, for a fairly brief time (i.e. a few seconds).
* Your RAM usage will do the same.
* In general, the consequences of your clicking, in terms of resource consumption, will be minimal and short-lived.
* It is incredibly unlikely that your browser will crash during this time.
* The entity hosting the website you clicked a link to will learn nothing about you beyond the information your browser inserts into the headers of its HTTP request.

If the browser has been written by smart people, is well tested and has been debugged, then the above hold true even if the HTML of the page you clicked a link to has been written by people who would really rather they were not true, or by people who do not know what they are doing. The narrow scope of the browser's capabilities presents a very small "attack surface" to either malicious or well-meaning but incompetent webmasters. Because there is no possibility of surprise, there is no requirement for trust.
You can click around with abandon, because what's the worst that could happen?

Compare this to the situation that actually obtains today. On a machine with a Core i5 or i7 CPU and multiple gigabytes of memory it is not only entirely possible but a *far from rare* occurrence for a website to cause CPU usage to spike to nearly 100% and stay there long enough for a laptop's fans to kick in at high speed. Memory consumption for long-running browser processes is measured in hundreds of megabytes if not gigabytes. Browsers can feel quite sluggish and slow to respond to user input, and a single very busy website can bring a system to its knees, at least temporarily.

Even if a website seems to be fast and responsive, the real truth is that you have no idea what it's actually doing. I recall reading at one point about a browser history sniffing exploit where a website could learn which other sites you had visited by virtue of the behaviour of browsers whereby visited links are coloured purple rather than blue - evidently Javascript provides the ability to interrogate the browser about the colour of various elements in the document. This is an example of a website harming a visitor through an unexpected consequence of a behaviour which there is no compelling reason should be possible in the first place. In short, there is tremendous scope for surprise and thus you are necessarily placing a great deal of trust in the entity behind every page you visit.

I would argue that fast, resource-light and trust-free browsing is A Good Thing. Unfortunately, it is basically a relic of a bygone era, because these days 99% of websites are laden with superfluous Javascript and advanced HTML features, even if they are not at all strictly required to achieve the goal. In essence, I would argue that acting like a modern browser is necessary and acceptable for websites which are obviously applications -- which *need* to be applications in order to do what they say on the tin -- but for all other sites we should demand behaviour as was seen in simpler times.

All well and good, but can we ever hope to actually see this? Probably not, but indulge me. The web-design community has an excellent track record when it comes to convincing people to care deeply about what many might consider irrelevant, abstract, ideological nonsense. Just you try and use