I think a more radical departure is needed. It's about time to acknowledge that the web is increasingly being used to access full-blown applications more often than "webpages."
Web browsers have more or less become mini operating systems, running elaborate virtual machines. There's way too much complexity for everyone involved — from web devs and browser devs to the users and the people who maintain the standards, plus the devs who have to build native clients for the web apps — just to deliver products that don't have half the power of OS-native software. Everyone keeps reinventing the wheel, as with WebAssembly, to fix problems that no longer need to exist in the first place:
Thanks to smartphones, people are already familiar with the modern concept of the standalone app; why not just make downloading an OS-native binary as easy as typing in a web address, on every OS?
Say I press Cmd+Space on a Mac and type "Facebook" into Spotlight, and it immediately begins downloading the native OS X Facebook app. The UI would be described in a format that can be incrementally downloaded, so the experience remains identical to browsing the web, except with full access to the OS's features, like an icon in the Dock and notifications and everything.
TL;DR: Instead of investing resources in browser development, the Android/iOS/OSX/Windows vendors should work on a better standard mechanism for delivering native apps.
This is a backward-looking argument that ignores the unique benefits of the web, benefits which made it inevitable that it would evolve into an application platform, regardless of how tortured the results may feel.
The web is the first truly cross-platform development environment. It is not controlled by a single vendor, and anyone implementing a new computing device must support the web (just stop for a second and consider, from a historical perspective, what a monumental accomplishment that is). Furthermore, it allows casual access to content and applications without any installation requirement. It comes with a reasonable security model baked in, which, while imperfect, gets far more attention than most OS vendors' sandboxing schemes. Last but not least, the web's primitive is a simple page, which is far more useful than an app as a primitive: for every app someone installs, they probably visit 100 web pages for information they would never consider installing an app for.
I agree that the web is sort of abused as an application platform; the problem is that there is no central planning mechanism that will achieve its benefits in a more app-oriented fashion. No company has the power to create a standard for binary app deliverables with anywhere near the reach of the web. And even if one could consolidate the power and mastermind such a thing, I feel it would run squarely into Gall's Law and have twice as many warts as the web.
"The web is the first truly cross-platform development environment."
No it isn't. Not even close. It's maybe the first cross-platform development "environment" of which millennials are widely aware. But it's only an "environment" in the most ecological sense -- it's a collection of ugly hacks, each building upon the other, with the sort of complexity, incomprehensibility, and interdependency of organisms you'd expect to find in a dung heap.
"Last but not least, the web's primitive is a simple page, which is far more useful than an app as a primitive"
For whom, exactly? You're just begging the question.
I'll grant you that "installing" an app is more burdensome for users than browsing to a web page, but the amount of developer time spent (badly) shoe-horning UI development problems (that we solved in the 90s) into the "page" metaphor is mind-boggling. In retrospect, the Java applet approach seems like a missed opportunity.
The proper reaction to something like React, for example, should be shame, not pride. We've finally managed to kludge together something vaguely resembling the UI development platform we had in Windows 3, but with less consistency, greater resource consumption, and at the expense of everything that made the web good in the first place. And for what reason? It's not as if these "pages" work as webpages anymore.
A proper "application development environment" for the web would be something that discards the page model entirely, and replaces it with a set of open components that resembles what we've had for decades in the world of desktop application development.
You are so right.
The web as an application delivery platform sucks.
And we are not even capable of producing UIs with the same level of polish I achieved in Visual Basic 2.0 in 1996 or so.
Alan Kay has expressed the same feeling.
PS to downvoters: if you have not used a proper interface designer such as Qt or Delphi, then you don't know what we mean. Please watch some videos and decide whether the state of the art (Angular and React) is what we should be using in 2016.
The downvotes are because he is not engaging with my point. I never said the web is a proper application development environment. My point was that you can't create a proper application development environment that is both an open and de facto standard the way the web is.
How am I not engaging with your point? In response to a comment pointing out how we need to re-think web application development, you said:
"The web is the first truly cross-platform development environment."
...and then talked a bit about how it's open (yeah, ok, sure), and then you said it's not really a good application development environment (obviously).
I'm saying, your entire premise is wrong: it isn't an application development environment, any more than a box of legos is a "housing development environment". People have built houses out of legos, but that doesn't make "lego" a building material. It's a big, messy, nasty hack.
The fact that it's "open" is a non-sequitur response to "it's the wrong tool for the job", which is what the OP (and I, and elviejo) are arguing. It's also not a legitimate response to argue that any re-thinking of the model has to come from a company, or otherwise not be "open".
The reason that web apps happened is because web apps started as a hack. That doesn't mean we can't change the paradigm, but to do that, we have to stop defending the current model.
(Realistically, the reason I'm getting downvoted probably has more to do with my willingness to call out React as a pile of garbage than with the substance of the greater argument. C'est la vie...it's actually pretty amusing to watch the comment fluctuate between -3 and +3...)
> That doesn't mean we can't change the paradigm, but to do that, we have to stop defending the current model.
Here's the crux of our disagreement. You believe that the web is such a broken application platform that it is possible to convince enough vendors and people to get behind a better solution. However, I (despite your presumptuous implication that I'm a millennial) have been around long enough to know that will never happen. Web standards will continue iterating, and companies will continue building apps on the web; even the most powerful app platforms today, iOS and Android, for all their market power, cannot stop this force. The reason is that it's a platform that works. The man-millennia behind the web cannot be reproduced and focused into a single organized effort. You might as well argue that we replace Linux with Plan 9: it doesn't matter how much passion you have or how sound your technical argument is, Linux, like the web, is entrenched. It has gone beyond the agency of individual humans and organizations to become an emergent effect.
That's not to say that the web might not someday be supplanted by something better, but it won't come from angry engineers wringing their hands about how terrible the web is. It will come from something unexpected that solves a different problem, but in a much simpler and more elegant way, and over time it will be the thin edge of the wedge, evolving and developing into a web killer.
Maybe I'm just cynical and lack vision, perhaps you can go start a movement to prove me wrong. I'll happily eat my hat and rejoice at your accomplishments when that time comes.
"You believe that the web is such a broken application platform that it is possible to convince enough vendors and people to get behind a better solution. However, I...have been around long enough to know that will never happen."
"That's not to say that the web might not someday be supplanted by something better..."
Whoever wrote the first paragraph of your comment should get in touch with the person who wrote the second paragraph.
OK, seriously, though, let's summarize:
1) Person says "web development sucks, here's why: $REASONS"
2) You reply: "it's the only truly cross-platform development environment ever"
3) I (and others) reply: "no, it really isn't. it isn't even a development environment, by any reasonable measure."
Now you're putting words in my mouth about convincing vendors and starting movements. I'm not trying to start a revolution here, just trying to counter the notion that we can't do any better than the pile of junk we've adopted. You don't have to love your captors!
I have no idea if someone will come up with a revolutionary, grand unified solution tomorrow, but I know that this process starts with the acknowledgement that what we have sucks, and that we have lots of examples of better solutions to work from. Hell...just having a well-defined set of 1995-era UI components defined as a standard would be a quantum leap forward in terms of application development.
The irony is I understand your qualitative opinion of the web, and I generally agree with it. What I believe makes you unable to see my argument is an inability to separate technical excellence from the market dynamics that govern adoption.
Declaring the web "not even a development environment" is just absolutist rhetoric that can in no way further the conversation. If you define "development environment" as a traditional GUI toolkit, then you're just creating a tautology to satisfy your own outrage.
This is a great discussion. What is it about the English language that makes it so much easier to oppose someone than to express nuance in a general opinion? I would like to see more discussions like this at the implementation level; surely something valuable and innovative is being grasped at by both sides.
Let's be honest: the web is a developer environment in the same way that a paper aeroplane is a passenger plane. I mean, I'm sure it's possible to create a 747 from paper, but do you really want to?
Why is it that a new JavaScript framework pops up each week? It's because the web as a developer environment is deficient. Despite it being standardised, so much stuff doesn't work without kludges in each browser.
> It will come from something unexpected that solves a different problem, but in a much simpler and more elegant way, and over time it will be the thin edge of the wedge where it evolves and develops into a web killer.
So.. app stores?
It has already begun. The most popular webapps (Facebook, Twitter, etc.) already have native clients on Android and iOS. I believe the majority of people already prefer and use the native FB/Twitter apps more often than accessing the FB/Twitter websites. So it's already obvious that native apps must be more convenient.
Right now however, app stores are a little clumsier to navigate compared to browsers.
For webapps:
• you have to open the browser,
• type in the address OR
• use a web search if you don't know the exact address.
But for apps:
• you have to open the app store,
• search for the app,
• potentially filter through unofficial third-party software,
• download the app, possibly after entering your credentials,
• navigate to the app icon,
• authorize any security permissions on startup (in the case of Android or badly-designed iOS apps.)
We just need the Big Three (Apple/Google/Microsoft) to actively acknowledge that app stores can supplant the-web-as-application-platform, and remove some of those hurdles.
Ideally an app store would be akin to searching for a website on Google.com (or duckduckgo.com) with a maximum of one extra click or tap between you and the app.
Apps should also be incrementally downloadable so they're immediately available for use just like a website, and Apple already has begun taking steps toward that with App Thinning.
Ultimately there's no reason why the OS and native apps shouldn't behave just like a web browser, because if web browsers keep advancing and evolving they WILL eventually become the OS, and the end result will be the same as what I'm suggesting anyway.
Currently though, both the native OS side and the web side exist in a state of neither-here-nor-there, considering how most people actually use their devices.
I'm in the middle of reading through The Unix-Haters Handbook and I must say that the arguments against web-as-application-platform (and the attempts to defend it) are eerily similar to what this book reports about arguments against Unix that circulated in the 80s and early 90s. Sadly, the fact that Unix managed to a) win, and b) fuck up the computing world so badly that people don't even realize how much we've lost doesn't make me hopeful about the future of the web.
Bad stuff seems to win because it's more evolutionarily adapted than well thought out stuff. This happens to hold for programming languages too.
Your takeaway of computers going from niche industry to the single largest driver of global economic activity is that the bad stuff won? What an incredibly myopic conclusion.
The recent cross-communication between JavaScript, Elm, and Clojure has been incredibly fruitful but hasn't been noticed by the bitter die-hards. And really, almost all of it could've happened literally 15 years ago with Lisp if the Lisp community hadn't been dismissive, arrogant douchebags that considered JavaScript a worthless toy language.
What's truly sad is that some people would rather be abstractly right while producing nothing of value than work with the dominant paradigm and introduce useful concepts to it.
> Your takeaway of computers going from niche industry to the single largest driver of global economic activity is that the bad stuff won? What an incredibly myopic conclusion.
This did not happen thanks to Unix; if anything, you'd probably have to be grateful to Microsoft and Apple for introducing OSes that were end-user-usable. There's a reason the "year of Linux on Desktop" never happened and is always one year from now.
The point of The Unix-Haters Handbook, which also applies very much to the modern web, is that the so-called "advancement" didn't really bring anything new. It reinvented old things - things we knew how to do right - but in a broken way, full of half-assed hacks that got fossilized because everything else depends on them.
> And really, almost all of it could've happened literally 15 years ago with Lisp if the Lisp community hadn't been dismissive, arrogant douchebags that considered JavaScript a worthless toy language.
I don't know where you're getting that from, but it's probably a good opportunity to remind you that JavaScript was supposed to be Scheme twice; both times it didn't happen because Netscape wanted a Java-looking solution right fucking now, first to compete with Java and then with Microsoft, and somehow no one thought to pause for a moment and maybe do it right.
(Also, don't blame the Lisp community for the fact that companies reinvented half of Lisp in XML. Rather, ask yourself why most programmers think the history of programming is a linear progression of power from Assembler and C, and why they remain ignorant of anything that happened before ~1985.)
JavaScript got a bad rep because a) it was terribly broken (less so now), and b) because of all the stupid stuff people were writing in it those 15 years ago. But the current problems of the Web are not really the fault of JavaScript, but of the community moving forward at the speed of typing, without stopping for a second and thinking if those layers on layers on layers of complexity are actually needed or useful. Simple landing pages are now built on frameworks that are more complex than what we used to call "Enterprise Edition" 10 years ago.
Unix isn't for end users, it's for developers to build on top of to give things to end users. Linux on desktop already happened years and years ago if you work for a tech company, and that's probably about as far as it needs to go.
This is Steve Yegge's understanding of the Lisp community, and I should clarify that I don't think the XML monstrosities we all work with are "all their fault", but that, on the whole, the Lisp community and enterprise coders were mutually antagonistic.
JavaScript after ES3 really wasn't broken at all; it's just that most people coding in it didn't know how to take advantage of it. No language can prevent someone dedicated to bad code architecture from writing bad code. A Lisp programmer would've found a lot of comfortable features and powerful patterns and been able to share them, but most of their efforts were wasted denouncing everyone outside of their tiny sect. The end result was that most people learning JavaScript were taught to code like it was broken Java, because most of the resources were written by Enterprise Java devs who didn't understand what a fundamentally impoverished language Java is.
Thanks for the link to that post. I'd also advise reading through its comments, though - some people there, especially Pascal Costanza, point out quite a lot of problems that basically reduce it to the ranting of a person who doesn't understand the language and the culture he's writing about ;).
Also, the influx of enterprise patterns into JavaScript is quite a recent phenomenon - personally, I blame Google (who, for a reason I can't understand to this day, embraces enterprise-level Java as their primary platform for everything...), but regardless, the problem with JavaScript culture is mostly that of very fast growth coupled with lack of experience and (probably unwilling) ignorance of the past. Since this community basically controls the Internet, it's hard for voices expressing some restraint and thoughtfulness to get through the noise.
And I really do recommend The Unix-Haters Handbook. Funny thing is, over a decade ago, when I was acquainting myself with the Linux world (after many years of DOS and Windows experience), I noticed and complained about various things that felt wrong or even asinine. Gradually I got convinced by people I considered smarter than me that those things are not bugs but features, that they're how a Good Operating System works, etc. Only now do I realize that my intuition back then was right, but I got Stockholm-syndromed into accepting the insanity. Like most of the world. The sad thing is, there were better solutions in the past, which once again shows how IT is probably the only industry that's totally ignorant of its own history and constantly running in circles.
What part of UHH do you think is not obsolete? I dislike many aspects of Unix, but it seems like 90% of the book is either wrong, meaningless invective, or obsolete. It's hard to view it as of more than historical value -- which is a shame, because Unix is far from perfect.
To clarify, I'm about halfway through the book, and I have found a couple of arguably-correct points. Complaints about Usenet and Unix vendors have mostly gone the way of the dodo. The authors consistently ignore any distinction between the filesystem and the rest of the OS, which may have been accurate at the time of writing, but hasn't been true for decades. Similarly, they don't distinguish between a given shell and Unix as a whole, even though the book makes explicit mention that other shells exist. And why there is a chapter on Usenet passes understanding. Suggesting that shell expansion be handled by an external library is equally bizarre.
So far the valid points are:
* command input syntax is inconsistent. I don't know that this has a feasible solution, but it is true.
* tar specifically sucks
* sendmail configuration sucks
The real crime of UHH is that it merely hates, it does not instruct. When we do find valid criticisms, there is no suggestion for how to fix things, or how other OSes are better at the same role. I've resigned myself to read the entirety, but for all the authors' complaints about not learning anything from history, one can only feel like they have themselves to blame.
If you know C++/Perl/PHP but don't know ES6, websockets, Node, etc, would you not say that your opinion about the web being a bad environment for rich applications might be in some way colored by your personal economic interests?
A common defensive mechanism among people with outdated skills is to try to delegitimize new frameworks and technologies in the hopes of convincing the broader community not to use things they don't know.
I'm inferring this from your arguments being driven by analogies and insinuations rather than concrete critique. It's not my intention to attack you personally, but an aggressively dismissive attitude towards unfamiliar concepts should be properly contextualized.
As for React, isn't it more likely that you don't know React very well, have never looked at its internals, and in general don't feel like you have the time or ability to learn much about modern web development?
If you build a few projects with React and still dislike it, good! Your critiques will be a lot more valid and useful at that point, whereas right now...yeah.
I know C++, used to know Perl, and know JavaScript pretty well. The fact that you actually name WebSockets as a technology sort of sums up the issue we are discussing here. WebSockets is not something that can be compared to C++ or even Node. It's a dumb hack which justifies its existence primarily by allowing web apps to circumvent corporate firewall policies.
The web is a joke of an app platform. Those of us who have wider experience of different kinds of programming see some web devs struggling with this concept and conclude, I think quite reasonably, that the only plausible explanation is lack of experience. This is not due to "outdated skills" - I daresay everyone criticising the web as a platform in this thread has, in fact, written web apps. It's the opposite problem. It has to do with developers who lack experience of older technologies having nothing to compare it to, so "web 2.2" gets compared to "web 2.0" and it looks like progress.
And in case you're tempted to dismiss me too, I recently tried to write an app using Polymer. It sucked. The entire experience was godawful from beginning to end. Luckily the app in question didn't have to be a web app, so I started over with a real widget toolkit and got results that were significantly better in half the time.
I want to disagree with you more here, but Polymer really does suck.
I would be interested in a detailed explanation of why websockets are a "dumb hack." Duplex streams much more closely map to what web apps actually need to do than dealing with an HTTP request-response cycle. In what way is streaming a hack and making requests that were originally designed to serve up new pages not a hack?
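To make that contrast concrete, here's a toy sketch. Nothing in it is a real web API; it just models the two interaction styles: with request/response the server can only speak when asked, while a duplex channel lets either side push at any time.

```javascript
// Toy model only: contrasts request/response with duplex messaging.

// Request/response (HTTP-style): the client must ask before the server replies.
function requestResponse(server, request) {
  return server(request); // exactly one reply, and only when polled
}

// Duplex (WebSocket-style): both ends hold a channel, and either side may
// send at any time, so a server can push events without being polled.
function makeDuplexPair() {
  const a = { handlers: [] };
  const b = { handlers: [] };
  const endpoint = (self, peer) => ({
    onMessage(fn) { self.handlers.push(fn); },
    send(msg) { peer.handlers.forEach(fn => fn(msg)); },
  });
  return [endpoint(a, b), endpoint(b, a)];
}

// With request/response, news arrives only when the client keeps asking:
const reply = requestResponse(req => "no news yet", "GET /updates");

// With a duplex channel, the server pushes the moment something happens:
const [client, server] = makeDuplexPair();
const received = [];
client.onMessage(msg => received.push(msg));
server.send("event: new message"); // no client request involved
```

The point of the sketch is the shape of the API: the duplex version has no notion of a request at all, which is exactly what chat, notifications, and live feeds want.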
> My point was that you can't create a proper application
> development environment that is both an open and de facto
> standard the way the web is.
Why? Is there a technical limitation? Or are you saying that it's technically possible, but that you have no faith that vendors will cooperate and support such a thing?
>My point was that you can't create a proper application development environment that is both an open and de facto standard the way the web is.
In addition to the countless examples posted in this thread, I would argue that if it's nearly impossible to create your own implementation of a platform or standard from scratch, then it's not really open in a practical sense. Who cares if the specs are available if it takes dozens or hundreds of man-years to deliver a passable implementation?
The web makes Java look clean and elegant. Which is saying something. But it's an open question whether anything as popular as the web could have been any less crappy. CPU cycles and memory are way cheaper than social cooperation.
Those aren't cross-platform the way the web is cross-platform, they all depend primarily on one vendor's implementation. None of those ever had any hope of crossing the chasm to ubiquity, they only gained as much traction as their vendor had market clout on a limited set of platforms for a limited time period.
"Those aren't cross-platform the way the web is cross-platform, they all depend primarily on one vendor's implementation"
And web browsers are different how, exactly?
(Other than the fact that "the web" is a mish-mash of hundreds of different "standards" with varying levels of mutual compatibility and standardization, of course.)
Web browsers have standards they're supposed to meet. Where's the standard for Qt? Where's the competing implementation? You can denigrate the web standards, even with some degree of truth, but you yourself have pointed out the difference. Perhaps instead of being dismissive you can elaborate on how this difference is ultimately valueless despite its apparent success.
Both the JVM and the CLR have multiple implementations. Both the JVM and the CLR have a standard, as do Java and C#. The primary implementation of each is also open source. So no, they don't depend primarily on one vendor's implementation.
> But it's only an "environment" in the most ecological sense -- it's a collection of ugly hacks, each building upon the other, with the sort of complexity and incomprehensibility and interdependency of organisms you'd expect to find in a dung heap.
That's what every practical environment is. Only the environments which never get used remain "pristine" in the architectural sense, because you cannot fundamentally re-architect an ecology once you have actual users beyond the development team and a few toy test loads.
Given how fast JS world jumps between frameworks I'm thinking that yes, a strong enough actor could in fact re-architect the Web, or at least clear out some of that dung heap...
I had this discussion the other day with a junior engineer in regards to WebAssembly and the need for more stable compile targets than JavaScript.
I think a big part of the problem is that web developers have forgotten (or never learned) about a lot of the ui innovation that has already been done for native platform development.
I blame the page/HTML/DOM model for this. It has forced generations of web developers to figure out clever (or not-so-clever) workarounds, to the point that they actually think they are innovating when they arrive at the point Qt was at years ago.
> Android/iOS/OSX/Windows should just work on a better standard mechanism to deliver native apps instead
One might imagine that after these competing and incompatible native apps become a headache for cross-platform pursuits, a new platform will emerge that provides a uniform toolset for developing (mostly) platform-independent applications.
Perhaps this toolset will utilize a declarative system for specifying the user interface, and a scripting system that is JIT'd on each platform.
> I think you're on to something...it could be huge... Heh.
Could be. Sad that it isn't. Think how awesome it would be if app developers actually cared about interoperability instead of trying to grab the whole pie for themselves while giving you a hefty dose of ads in return. This is mostly the fault of developers, but the platform itself could help a lot if it was more end-user programmable. You'd have at least a chance to force different apps to talk to each other.
Yes, I know. And I'm trying to subtly point out that it doesn't even work on the web, because it got fucked up by cowboy companies who ignore standards and do whatever they like to get easier $$$.
Android has declarative UI, JIT compiled app logic and a way to link apps together (via intents). It is definitely not the web though.
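For anyone who hasn't worked with it: Android's declarative UI is plain XML that the runtime inflates into native widgets, with no imperative layout code required. A minimal sketch (the file name, text, and id are arbitrary illustrations):

```xml
<!-- res/layout/main.xml: a vertical stack declared entirely in markup;
     the Android runtime inflates this into native widgets. -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Hello" />

    <Button
        android:id="@+id/share"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Share" />
</LinearLayout>
```

The app's compiled logic then binds to these widgets by id, which is roughly the declarative-markup-plus-JIT'd-logic split the parent comment imagines.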
I think people confuse ideas with implementations. The web is a pretty reasonable implementation of the idea "let's build a hypertext platform". It is not at all a reasonable implementation of the idea "let's build an app platform", which is why, in markets where reasonable distribution platforms exist (mobile), HTML5 got its ass kicked by professionally designed platforms like iOS and Android.
Well, why do native apps for all the popular websites exist? Surely nobody would need or download the FB/Twitter/Reddit/etc. apps on mobile or desktop if the website itself was the optimum experience...
My point is that even with the web, developers are still going to make native clients. So either the web has to become good enough that the need for native apps eventually disappears, and the browser becomes the OS, or native apps become convenient enough to completely replace webapps.
Of course if the browser becomes the OS then the end result would be the same as the suggestion in my original post.
Interestingly, this was what Alan Kay was advocating for the web: "A bytecode VM and something like X for graphics" [1].
When Lars Bak and Kasper Lund launched Dart [2], I found it sad that they weren't more bold: they could have left CSS and the DOM alone and created an alternative Content-Type, so you could choose to Accept 'application/magic-bytecode' [3] before text/html if your client supported it. Sadly, we ended up with WebAssembly, which, judging from the few talks I've seen, appears to cater only to graphics/game developers, with no support for dynamic or OO languages.
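That alternative Content-Type idea would ride on the standard Accept mechanism; on the wire it might look something like this ('application/magic-bytecode' and the host are the commenter's hypotheticals, not a registered media type or real endpoint):

```
GET /app HTTP/1.1
Host: example.com
Accept: application/magic-bytecode, text/html;q=0.8

HTTP/1.1 200 OK
Content-Type: application/magic-bytecode
```

A client that doesn't understand the bytecode type would simply get text/html back, so the scheme degrades gracefully for old browsers.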
Yes, I think of Dart as a missed opportunity: it isn't Smalltalk for the web, and neither is it strongly typed... I think this lack of character means that nobody hates it, but also nobody loves it.
Go doesn't have generics, some hate it, some love it. But it took a strong stand on that point.
>Web browsers have more or less become mini operating systems
I wish. No, web browsers have become massively bloated operating systems. And since they didn't intend to, they are terrible at it. You have little to no control over anything.
Have you looked at the size of native apps these days? They make the web look almost unbloated by comparison. (It feels like half of them are shipping an entire HTML rendering engine too.)
Of course webpages would still be around and needed, and desired for more basic content, but by the time you want to offer something as complex or regularly used as Facebook or Twitter or Reddit or Soundcloud and so on, it'd be better as a native app, as the current native clients for each of those websites already prove.
I mean, the UI is undeniably smoother, and they can seamlessly hook into the OS's notification system, better multitasking (for example, I see separate entries for each native app in the OS's task switcher, but have to go through the extra step of switching into a browser and then its tabs for webapps), energy saving, and everything else.
> regularly used as Facebook or Twitter or Reddit or Soundcloud
And there's the problem. I don't use any of those four sites regularly, but I have visited all of them. Hyperlinks provide for that, and they (a) don't exist or at best would be awkward in a native app (not that webapps handle them well to begin with, though all of the above do allow for them at least) and (b) work between apps, platforms, and what-have-you. If I got a link to some image, <160 character sentence, comment thread, or song and was prompted to download an app, I would probably not view that content instead.
That sounds ambitious, interesting, and like a lot more work than keeping a webpage running. My first thought is "cross-platform nightmare". My second thought is that they'd have to rethink their expectations of continuous deployment/release and a/b testing.
We're either looking at making individual pages maintain binaries for the platforms they support (implying support of only those platforms that make sense to the site) or some kind of compilation framework running on the local machine.
Firefox and Chrome don't do "move fast and break things"-style continuous deployment, can't push updates to a user seamlessly (i.e. without restarting the program), and can be downloaded and compiled on platforms that Mozilla and Google don't want to officially support.
Native delivery of a monolithic browser based on an open-source codebase is a solved problem. Trying to do the same with a website, using current techniques, would cause issues both for their current workflow and for my expectations as a user, issues that websites don't currently have.
I'm not saying that it's impossible to do, I'm just saying that it's not a good fit for current trends in web development, and I'm not convinced that it would be great for the users either.
I also agree that the web is full of bloat and a mishmash of compromised features. I think you could design a web that's more powerful yet simpler if you actually solved the problems in a more forward-thinking manner.
For example, there have been so many attempts at powerful layouts, even though everyone knew 10 years ago that we needed proper layout solutions (flexbox or whatever); now we have grid frameworks and years of cruft in older CSS enhancements that still have to be supported. They keep adding features here and there to sort of address lots of problems; individually each feature might be cheaper, but the overall cost of implementing them, both for browsers and for us regular developers, is much higher.
Here are a couple of the things I want from the web, and quite a few of them are there already, if not in their ideal forms: a powerful layout system that's simple enough to use; the concept of a webpage as a bundle (HTTP/2? all your resources together); making partial rendering (the ajaxified page) a natural concept; even making UI/markup delivery separate from content (you can do that with all sorts of libraries, but I think it should be at the core); and security concepts that are easier to implement (CSRF, URL tampering, etc.).
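To make one of those "easier to implement" security concepts concrete, here's a minimal sketch of stateless CSRF protection via HMAC, using only Python's standard library. The function names and key setup are illustrative, not taken from any particular framework:

```python
import hashlib
import hmac
import secrets

# Hypothetical per-server secret; a real deployment would load this from config.
SECRET_KEY = secrets.token_bytes(32)

def issue_csrf_token(session_id: str) -> str:
    # The token is an HMAC over the session id, so the server
    # doesn't need to store issued tokens at all.
    return hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def verify_csrf_token(session_id: str, token: str) -> bool:
    # Recompute and compare in constant time to avoid timing leaks.
    expected = issue_csrf_token(session_id)
    return hmac.compare_digest(expected, token)
```

The primitive itself is tiny; what makes CSRF painful in practice is wiring tokens through forms, templates, and AJAX by hand, which is exactly the kind of thing that could live at the platform's core instead.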
One of the ideas I had is that browsers build a new engine that does the right things from the start, hopefully a much lighter engine: if you serve new pages they're really fast, and if you serve old pages there's an optional transpiler kind of thing that translates them to the new version on the fly. It won't be terribly good to start with, so it's optional, but essentially the old version is frozen, and over time more and more people come to use only the new engine (with the transpiler).
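That fallback-translation idea can be pictured as a small rewriting pass sitting in front of the lean engine. Everything below is invented for illustration (the tag mapping, the function name), and a real transpiler would parse the markup properly rather than do naive string replacement:

```python
# Invented mapping from frozen legacy constructs to their modern equivalents.
LEGACY_TO_MODERN = {
    "<center>": '<div style="text-align:center">',
    "</center>": "</div>",
    "<font>": "<span>",
    "</font>": "</span>",
}

def transpile(legacy_markup: str) -> str:
    """Rewrite old pages on the fly so the hypothetical lean engine
    only ever has to implement the modern forms."""
    for old, new in LEGACY_TO_MODERN.items():
        legacy_markup = legacy_markup.replace(old, new)
    return legacy_markup
```

Old pages pay the translation cost; new pages skip it entirely, which is exactly the incentive gradient that would nudge everyone toward the new engine.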
Not sure about the conclusion, but you're absolutely right about the problem. This is why the social networks are eating such a great percentage of screen time: pure content, outside of web applications, is better suited to a single format than to the superfluous loud magazines that currently pepper the web.
Perhaps rather than native apps what we need is the return of gopher. I think that's what Apple's trying to do with Apple News.
That's a good point. Too much control over the form is left with website owners and too little with web users. Most of the web looks much better when you strip out the styling it comes with.
In a way, this is why I like doing more and more things from inside Emacs. I get a consistent interface that I control, and that is much more powerful than what each app or website I'd otherwise use can offer. Hell, it's a better interface than any popular OS has.
I'm fine with that... however, I've seen MANY instances of websites that load multiple versions of jQuery, and that's just one library. Let alone how poorly constructed most web applications are.
When it comes down to it, for a long time many front-end guys only cared about how it looked, and backend guys didn't care at all, because it was "just the UI, not the real code."
We're finally at a point where real front-end development is starting to matter. I honestly didn't see much of this before about 3-5 years ago, which coincides with node and npm taking over a lot of mindshare. There's still a lot of bad, but as the graph shows, there's room to make it better.
> It's about time to acknowledge that the web is increasingly being used to access full-blown applications more often than "webpages."
I think that is orthogonal to bloat. Sure, a complex app will always have more to load and compute than a static page with one blog post on it, but that doesn't mean an app can't be bloated on top of that, just like pages with just a single blog post on them can be bloated.
That's not really bloat, because anybody who sees your comment will already have that link in the browser cache ^_^
(To make this comment not entirely frivolous, does anyone remember the "bloatware hall of shame", or however it was called? I couldn't find it or anything decent like it, sadly. How about something like it for websites?)
> Thanks to smartphones, people are already familiar with the modern concept of the standalone app; why not just make downloading an OS-native binary as easy as typing in a web address, on every OS?
What constitutes "way too much complexity"?
What if browsers are evolving, with whatever resources are available to them, at just the right pace? Why would you want the browser to be hindered by an arbitrary speed/resource limit? Let it soar to the sound of fans going full speed!
In my opinion, if there is something that the web can't do as well as native it is a bug.
Web technologies can already do most of what you are proposing, including notifications. There are some performance issues, but they are well on their way to being fixed.
Thing is, those performance issues matter. In user interfaces, they mean a lot. And they're also important when you're trying to do actual work. I've yet to see a web app that wouldn't choke on the amount of data a power user may need. Yes, that includes Google's office suite, which is so horribly laggy when you put more than a screenful of content into it that the experience alone justifies switching to Windows and buying Microsoft Office.
What we would need, if the browser is to become a platform for actual productivity tools and not shiny toys, is a decent persistent storage interface: one that would be controlled by users, not by applications, and that could be browsed and monitored. And most importantly, one that would be reliable. And then, on top of that, a stronger split between what's online and what's offline. Because some data and some tasks should really not go through a network.
The problem with the web is not one specific missing feature, or even performance. Man-centuries of effort by browser makers have been able to make performance not-quite-competitive instead of just hopelessly uncompetitive.
The problem with the web is that the developer experience is nightmarish. The fact that native apps don't suffer XSS should be a hint about where to start looking, but it's really just a house of horrors in there.