
Money and JavaScript.

The vast majority of people only visit websites that are other people's jobs. The popular websites now are the ones that provide legal ways of moving money. That's why, no matter how good an open social network is, it will only ever attract a fractional percentage of people (1): KYC prevents any non-incorporated, non-institutional group from moving money. (1: e.g., Mastodon's ~1 million, IRC's ~600k, etc.)

And perhaps more importantly: HTTP has grown into a JavaScript application delivery protocol instead of a way to access HTML pages on websites. With browsers now fully fledged VMs running real applications with plenty of low-level access, the utmost security is required at all times (HTTP/3 can't even make a non-CA TLS connection). That security mindset creates a lot more friction for humans trying to play around and share things with each other.



Had me until the end. I haven't seen the friction. Let's Encrypt and web 1.0 still work, and domains are cheap. OpenBSD's httpd is great for this job. You'll exclude fewer people if you don't require any features. You can still give your site out on big platforms or use a QR code. Users of open social networks can directly transfer value now, with KYC at the on- and off-ramps. Telegram came closest to implementing it in a very large and permissive network with open clients, albeit centralized and closed servers, though we'll see how Durov's French situation develops now.


Try sending a link to your obscure dot-com domain in a large group chat. Now try hosting the same material on a well-known corporation's website. I'm certain a lot more people will visit the corporate-hosted URL. Why? Perceived security. People are terrified of personal websites for the reasons I gave above. Automatic, powerful JavaScript execution has killed casual web surfing. Opening a URL is now like running an application rather than reading a document.

As for the friction on the hosting side, the problem mostly has to do with keeping websites up, not setting them up. The short lifespan and fragility of the required CA-issued TLS certificates mean any HTTPS-only site (and because JS auto-executes, they're all HTTPS-only) will only survive a few years without active human maintenance. Whether the ACME tool breaks, an LE root certificate expires (like what happened this summer), an ACME version is deprecated, the host OS's openssl version lacks support for the required modern ciphers, or something in the 90-day certificate lease cycle just breaks because it's so complex with so many moving parts: CA TLS sites die. Plain HTTP sites can last forever without being touched. HTTP+HTTPS should be the way, but with the security an auto-executing JS browser requires, no one wants to risk HTTP. I've literally had people balk at loading http://example.com/image.jpg because it was HTTP. The fear isn't rationally evaluated case by case; it's just all security, all the time, no matter what.
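The maintenance treadmill described above boils down to a check like this that has to keep working forever. A minimal sketch, assuming openssl is installed; the file paths, the 90-day lifetime, and the 30-day renewal window are illustrative:

```shell
# Stand-in for a 90-day Let's Encrypt certificate: a throwaway self-signed one.
openssl req -x509 -newkey rsa:2048 -keyout /tmp/demo-key.pem \
  -out /tmp/demo-cert.pem -days 90 -nodes -subj "/CN=demo.test" 2>/dev/null

# The check every ACME client effectively runs each cycle: does the cert
# expire within the next 30 days (2592000 seconds)? If this job, the client,
# or the CA chain ever breaks, the site goes dark within ~90 days.
if openssl x509 -checkend 2592000 -in /tmp/demo-cert.pem >/dev/null; then
  echo "cert still valid; no renewal needed"
else
  echo "renewal needed"
fi
```

The point isn't that any single step is hard; it's that every step has to keep succeeding, unattended, for the site to stay up.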


> Let’s Encrypt and web 1.0 still work and domains are cheap.

First the newest version of the protocol stops supporting something. Then, over time, most things switch to the new version, and the old version becomes unsupported.

It's a notable change when the new version doesn't support something that the old one does, because that thing is probably going away.

Let's Encrypt solves a lot of this but not all of it. In particular, it makes it harder for people to screw around starting out because you can't even send the link to your mom until you buy a domain and learn how DNS and Let's Encrypt work etc.

> You can still give your site out on big platforms or use a QR code.

The big platforms don't like to encourage their competitors. Links to off-site content are not likely to be promoted by algorithms.

QR codes imply that you already have access to a large existing network of people in meatspace, which has never been true for most people.

> Users of open social networks can directly transfer value now, with KYC at the on and off-ramps.

How does that work? The internet is global but anyone without a first world bank account is stuffed. Even when everyone is in the US they can't send even trivial amounts of money to each other without an incompetent/predatory corporation acting as an intermediary.

Meanwhile the theory also doesn't work, because "KYC at the on and off-ramps" is pretty meaningless for any system popular enough for its internal credits to be a de facto currency. But KYC has never really worked anyway. It has, however, caused a lot of trouble for innocent people who just want to be able to transfer small amounts of money for ordinary purchases without being subject to warrantless mass surveillance and the caprice of infuriating bureaucracies.

These are real problems that deserve to be solved rather than dismissed.


> Let's Encrypt solves a lot of this but not all of it. In particular, it makes it harder for people to screw around starting out because you can't even send the link to your mom until you buy a domain and learn how DNS and Let's Encrypt work etc.

Were you ever going to send a link to a bare IP address to your mom?

You can get a free subdomain, and programs will automate the certificates as they serve off your desktop. The hardest part of self-hosting is often the port forwarding.
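As a concrete sketch of "programs will automate the certificates": with a server like Caddy, automatic HTTPS is the default. This assumes Caddy is installed and that a free subdomain (mysite.duckdns.org here is a made-up example) already points at your machine:

```
# Caddyfile -- Caddy obtains and renews the certificate on its own;
# no cron job, no separate ACME tooling.
mysite.duckdns.org {
    root * ./public
    file_server
}
```

Run `caddy run` next to this file and the remaining manual step really is the router's port forward for 80/443.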


> Were you ever going to send a link to a bare IP address to your mom?

When you're in the same house and it's the local IP of your machine? Sure. You could also use the local machine name via mDNS, often with no additional configuration.
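For the same-house case the whole "stack" can be one command. A sketch assuming Python 3 is installed and mDNS (Avahi/Bonjour) is running on the LAN; the port is arbitrary:

```shell
# Serve the current directory over plain HTTP on all interfaces. Anyone in
# the house can then open http://<your-hostname>.local:8000/ via mDNS --
# no domain, no DNS setup, no certificate.
python3 -m http.server 8000 --bind 0.0.0.0 &
server_pid=$!
sleep 1

# Sanity check from this machine: fetch the directory listing.
curl -s http://127.0.0.1:8000/ | grep -o DOCTYPE

kill "$server_pid"
```

This is the kind of casual sharing the thread is about: zero setup beyond running the command.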


If it's in the same house you can turn off the warnings. Though wait, aren't the warnings already disabled for local IPs?


That's assuming you can (and know how to) turn off the warnings.

And many browsers do warn for self-signed certificates even on local IPs. They may not warn for unencrypted connections -- which is a weird choice given that TLS with a self-signed certificate is still more secure (e.g. against passive eavesdroppers) than unencrypted HTTP -- but HTTP/3 doesn't support unencrypted connections or self-signed certificates.
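To make the oddity concrete: a self-signed certificate is trivial to produce and still encrypts the wire against passive eavesdroppers, yet it draws a scarier browser warning than plain HTTP does. A sketch assuming openssl is installed (the CN and paths are illustrative):

```shell
# Make a self-signed certificate: the issuer and the subject are the
# same party, which is exactly what browsers warn about.
openssl req -x509 -newkey rsa:2048 -keyout /tmp/ss-key.pem \
  -out /tmp/ss-cert.pem -days 30 -nodes -subj "/CN=localhost" 2>/dev/null

# Show what the browser sees: subject == issuer, i.e. self-signed.
openssl x509 -in /tmp/ss-cert.pem -noout -subject -issuer
```

The connection made with this cert is still confidential on the wire; what's missing is third-party attestation of identity, which is all the warning is about.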


My argument is just that you can still screw around easily. I'm not worried about HTTP/1 going away soon.


Sometimes it's worth considering what the long-term implications of something are.

HTTP/3 is likely to become widely adopted over time, if for no other reason than that people install software updates and it becomes the default once popular browsers and web servers add support. People may even like the new features.

Then we get some new security vulnerabilities that get fixed in HTTP/3 but not older versions, and by then only a minority of sites use the older versions, so they get a warning. That spurs most of the holdouts to switch to the new version because they can't have scary browser warnings driving users away from their site. Which in turn allows the older versions to be fully deprecated and ultimately removed. And that's more likely here because the code for handling TCP and UDP is quite different, and people aren't going to want to maintain the former if hardly anybody is using it.

It'll be years before that happens, but if a problem is foreseeable then maybe we should demand a solution contemporaneously with the creation of the problem instead of foisting it on the kids.


The current trend as far as I can tell is that it's easier than it used to be to host a web server. So that also has long term implications.

It's really tricky to predict exactly what web servers will do, but I don't see any reason to think it will ever be hard to set one up. You're fixating on a very specific warning, and your extrapolation assumes that half the software changes while the other half doesn't. Predictions like that just don't work well. And we already have solutions; they're just used less because HTTP/1 is so easy.

Another thing to consider is that the number of household devices that need access is growing rapidly. We're not going to break them all.


Blaming JavaScript for the state of the web is like blaming money for greed. JavaScript is only the medium, and the driving force behind it would have created a different language if needed. This take seems too narrow.


There's a clear argument to be made that technical capability circumscribes commercial possibility.

Without active arbitrary code running on behalf of websites, control would be substantially tilted in favor of user clients.

There was probably no future where AJAX didn't evolve, but if it hadn't then things would look different.


I think the problem is a different one.

What we needed (still need) is a better sandbox for native code. So you can feel as comfortable running some offline app found in some rando's Reddit post as you do clicking on a link from same. Which, in turn, lets untrusted people create interesting new apps without any gatekeepers or barriers to entry.

Right now the same thing happens with the web, but because web browsers are inherently client-server, that creates an implied server dependency and then all of the misfeatures that come with it.


JavaScript failed to take us to a good place because it was not designed to resist corruption. You might make similar arguments about money.

If we went back and replaced those media with less naive designs, I think it's reasonable to believe that they'd take us somewhere different. We are hackers after all, reshaping technology is what we do.

Of course let's also address the lack of scruples that got us here... In fact, let's each do that primarily in our own ways, I'm just saying that this might not be the forum for that.



