At this rate, I predict that about ten years from now, they'll ditch the bloody memory-leaking browser altogether and just ship a VM and a language whose standard library includes a small GUI library with a DOM-like organization, and compile straight for that.
Sure, there are going to be nitpickers who will point out that we had that in 2005 and it was called Java, but what do they know about innovation.
I'm aware you're being satirical but it's an interesting argument which I discussed with a friend yesterday.
Java-in-the-browser wasn't a terrible idea per se. Some aspects of it were poor but the tech landscape back then was very different both in capability and in politics.
Indeed. The JVM-in-applet downsides (slow startup, memory hog due to fixed heap, unresizable, stuck in its box) were mostly contingent rather than necessary. There's no reason those mistakes have to be made again.
But let's discuss this further; we should also acknowledge the reality rather than just dismiss it with satirical statements.
Java didn't come to dominate web front-end technology, and it's worth considering why. The points I see:
- the user needs to download and install the JRE
- bad integration with the HTML document: an applet is just a rectangular, self-contained box on the page, and it's very inconvenient to script DOM nodes or handle events from an applet (see the sketch below)
- browser vendors (especially Microsoft) were against Java becoming the dominant platform
- in 2005, Java was the only language running on the JVM (unlike today).
BTW, I wouldn't say Java was very slow to start up.
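For anyone who never had the pleasure, scripting the page from an applet went roughly like this, via the old LiveConnect JSObject bridge. This is a sketch from memory -- the class name and element id are made up for illustration, so treat it as indicative rather than exact:

    import java.applet.Applet;
    import netscape.javascript.JSObject; // lived in the plugin's jar (plugin.jar), not the core JDK, IIRC

    public class PagePoker extends Applet {
        @Override
        public void start() {
            // The page had to opt in with the mayscript attribute on the
            // <applet> tag, otherwise this call would simply fail.
            JSObject window = JSObject.getWindow(this);
            JSObject document = (JSObject) window.getMember("document");

            // No typed DOM on the Java side: everything comes back as an
            // untyped JSObject that you poke at with strings.
            // ("status" is a made-up element id for this example.)
            JSObject status = (JSObject) document.call(
                    "getElementById", new Object[] { "status" });
            status.setMember("innerHTML", "applet loaded");

            // Going the other way (page events into the applet) meant having
            // the page's JavaScript call a public method on the applet itself.
        }
    }

Compare that with a couple of lines of plain JavaScript in the page and it's not hard to see why nobody wanted to build page UI this way.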
In the '90s, I despised Java and kept applets turned off because they took forever to download, and once downloaded, would make my machine freeze for a few minutes while the JVM started up.
It's not so much of a problem now that we all have broadband and fast CPUs, but in the '90s it was enough for me to bin Java as "a slow piece of shit" and avoid it like the plague. That probably contributed to it not taking off as a client-side web platform.
My point was that, eventually, it won't even be a web frontend. They'll just get the web browser out of the equation. Of course, a web "widget" is still going to be there, it just won't be a program you open.
I do believe Java failed mostly due to politics -- and more precisely, due to this:
> browser vendors (especially microsoft) were against java becoming the dominating platform
Everyone agreed, in principle, that a portable, high-performance VM was what we needed. The problem was that every vendor insisted it had to be theirs, while ever so slightly sabotaging other vendors.
In the meantime, they all had to provide a working web browser.
Well, you are probably right. In that case, the main lesson to learn is this: if we want something to happen, we must think about how to "hack" the social system, how to refactor the political/social situation. Technically, a common platform for applications is not a difficult problem. BTW, I don't blame Microsoft any more than the others; as you say, every vendor tried to sabotage the rest, including Sun, which didn't offer a solution sufficiently beneficial, or unavoidable, for everyone.
Java's actually a lot older than that, 1995 I believe. Though I'm not sure when Java applets became available in browsers (certainly before 2005). It's a sobering thought that whilst (in my opinion) Java applets were a very good idea, the implementation left much to be desired and it's going to take us more than 20 years to turn those ideas into a good workable solution (of which WebAssembly may be a part).
Unfortunately, every other web designer today seems to think I don't want to read a bloody article, but rather to be engaged by an interactive article-reading application that's basically impossible to distinguish from native applications, except for those eighty quirks that are definitely going to be solved by morehacks.js and those new CSS perversions.
Web browser developers seem to cater towards those needs, which is how we ended up with browsers where I can run fifty gazillion floating-point instructions per second in JavaScript but it takes me five seconds to find a bookmark, three of which are spent hovering over the titlebar until I remember there's no menubar anymore.
I surf with JavaScript disabled by default and selectively enabled for a few frequently used sites that directly benefit from it. I rarely find sites that are unreadable without JS -- certainly less often than I used to find sites that were unreadable because of it. On the rare occasions I find a site that won't work without JS (most common symptom: a completely blank page), my decision more often than not is to close the tab and move on with my life. I don't think I'm missing out on much, and my computer's fans no longer scream constantly when my machine is idle with the usual dozens of open tabs.
I've run into several sites that have weird issues without Javascript, but they always seem to be things that could have been implemented with traditional markup: missing form components, misplaced images, things like that. About 50 percent of the time[1], however, media-focused sites with complicated image-viewing "galleries" or more obscure video players are totally useless without it.
It can be frustrating to have to go through this process of navigating to a site, realizing I've broken it, and then reloading with all the crap turned back on, but yeah, like you said, it's better than having my CPU revved up just to have those "SIGN UP FOR OUR NEWSLETTERS!" modals flying around the screen.
That's because the designers of most web pages want the features they provide. When you say "optional" you don't actually mean optional, you mean removed. Such a thing already exists, it's called Gopher[1].
Most websites today don't even gracefully degrade. One of the trendy blog/article sites that gets posted here regularly (it might be Medium) is just a column of text with lots of whitespace, but the text is loaded via AJAX, so without JS, you can't even read it.
HA! That would actually be lovely. Something easily-discovered, like a "isactualhypertextnotaturingtarpit" attribute for the <meta> tag (OK, maybe something less verbose) would probably solve half the problem.
Java has survived those three points. I think Java is an excellent language that is dying because Oracle has not handled security issues correctly and because Oracle has a very bad reputation.
Security ought to be the strong point of Java, not its weak point.
It survived but it didn't dominate the way that was expected. In the late '90s, Java was positioned and expected to be in the place that webapps live today - that is, something within your browser that would handle interactive live client/server applications.
If you are kidding (I just can't decide), be aware that Microsoft killed Netscape in the '90s because Netscape was starting to sell exactly that VM.
When Netscape became Mozilla, it shipped with XULRunner, which is exactly that, but nobody used it.
Firefox OS is that, again, sold for smartphones.
If they come out with a good VM (measured by the languages it can run), and a good DOM-like organization, it will happen in no time. But if we get another XUL, it just won't happen.
| When Netscape became Mozilla, it shipped with XULRunner, which is exactly that, but nobody used it.
I would hazard to disagree with that statement - the company I was working for at the time shipped thousands of devices out with XULRunner, and I wrote several XPCOM objects to support the custom hardware we were shipping.
I still miss some of the niceties that came from that.
I think you'll be disappointed. I predict that the browser in its current form will not be going away, no matter how much some people may wish for it.