People hated Steam when it launched, but you needed it to play CS 1.6, and it made installing mods easier. Then HL2 and the Orange Box released, and Valve reached critical mass as they provided platform support for other games. Steam got better. It's still not great, but they have so much market share that basically any PC gamer already has it. Epic wants some of that money. The problem is nobody wants to install another store, and Epic isn't doing anything to improve gamers' experience other than giving away games and having some exclusives. They'll never hit the critical mass needed that way.
I still don't like Steam. I resent that I have to have this "Store" middle man on my computer just to have access to games. I want to pay a company for their product on their web site, download the installer, and install it on my operating system directly. I don't want this other layer that I'm dependent on, who could switch off my access to the things I "bought" whenever they want.
Steam has multiplayer integration, so you don't have to connect by IP to play indie games; that is massive. Many people either don't have access to their router or don't have the skills to configure it, so without Steam they can't play multiplayer unless the game provides a server middleman, which most indie games won't have.
Then there are Steam reviews, which are the most accurate predictor there is of how likely you are to be happy with a purchase. I am much more hesitant to spend money on a game where I can't see the Steam reviews, so there is basically no way I'll buy a game on the Epic store that doesn't exist on Steam, since I'd basically be buying it blind.
Looking at what many of your games do, I think it's the better option. I have zero doubt that otherwise there would be countless downloaders, accounts, and poorly written startup menus for each game and each publisher, both big and small. Simply getting an installer would not be an option for most games.
I vaguely recall Half-Life 2's launch being pretty problematic.
I started using Steam in 2007 and it was fine. In 2006 there was still some residual animosity towards it but I think the tide had well and truly turned since the early days (and I think there were a handful of third party games on it by then too which I guess was something of a vote of confidence - a few were Source engine titles so they may have got a discount or kick-back from Valve, but not all were).
> People hated steam when it launched but you needed it to play CS 1.6.
I thought CS:S was the first release on the Steam beta? I remember playing the crap out of it; then the actual Steam release happened, and it somehow turned into a laggy, buggy hunk of crap for months.
Not sure of the exact history, but pretty sure Steam launched as a beta in 2002, or maybe 2003. Around September 2003 the old database was wiped and Steam launched proper, again with probably only Counter-Strike available.
Counter-Strike Source was launched some time in 2004 and then Half-Life 2 came along in November 2004.
I mean, people really didn't hate it. There was some grumbling about digital and not having a cd, but by and large people liked it as soon as they had broadband.
Fleet was a terrible product. I'm a long-time JetBrains user and still use GoLand, RustRover, and CLion. I was really excited when it was announced because I'd had a lot of issues with VS Code, extensions, and LSP at the time, and I was hoping JetBrains would build something competitive, but it never materialized. Rather than building something lightweight and fast on a native UI toolkit with faster analysis engines, they basically built on top of the tech that was the bad part of their existing IDEs. Important features, and the ability for community extensions to fill the gaps, never came to Fleet, which meant it could never replace other tools for real work. JetBrains seemed to focus on building LLM features to capture the hype rather than fixing the issues people have with their existing products. Instead they have mediocre AI features that are essentially commoditized, and under-invested IDEs that are sluggish. In terms of full batteries-included IDEs they're still probably the best, but they're losing a lot of the market.
Up or out generally stops once someone reaches engineer or senior engineer. Most of the time a junior engineer is going to need substantial mentoring and support, so someone never moving beyond that point is likely a net negative if another person always has to be available to provide that support for their entire tenure, once it goes beyond 1-2 years.
> Update (August 2024): GeoPolars is blocked on Polars supporting Arrow extension types, which would allow GeoPolars to persist geometry type information and coordinate reference system (CRS) metadata. It's not feasible to create a geopolars.GeoDataFrame as a subclass of a polars.DataFrame (similar to how the geopandas.GeoDataFrame is a subclass of pandas.DataFrame) because polars explicitly does not support subclassing of core data types.
Unless he’s getting into a truly top tier school, have him go to a state university for much less. Community college and transferring is also an option but the social experience of going to a 4 year university is unique and fun. If you have something local where he can live at home or work enough while in school to cover part of the expenses, great. Spend $30-40k instead. I went to a fine state school in California and have talked to a lot of CS grads. I’ve rarely seen significant value add for paying more for a mid tier school than whatever the cheaper average option is.
For the most part, everything in your last point is something you can’t solve with an app. Virtually all of that would be problematic meeting someone in any other way and for the most part is all within your control to change. Sure, there are limits to what you can do about conventional attractiveness but there is a lot within your control for most people. Most of those issues you brought up are things that the vast majority of people don’t want in a partner no matter how you meet them. If you like those attributes about yourself you might eventually find someone compatible but if you don’t, most/all of those are things you can address by going to therapy and working on it.
Building Based Security (https://www.basedsec.io), a startup in the zero trust/identity space creating immovable, attestable, hardware-backed identities that can be used for strong, continuous authentication but cannot be moved from the device. We can guarantee the user and the device are known, that the user is assigned the device they are using, and that the characteristics of the identity match the defined policy. The initial product offering is tied to GitHub/GitLab, offering secure authentication for git-over-ssh operations, with plans to expand to general ssh access and additional systems with pluggable access control.
We’re leveraging SSH certificates backed by keys stored in a variety of hardware. For YubiKeys we’re using PIV and the standard ssh tooling. We’re still determining whether we can use a PKCS#11 implementation for TPMs and the Secure Enclave or whether we’ll need to build a custom agent.
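For reference, the client side of the YubiKey PIV path works with stock OpenSSH; here's a minimal sketch of the ssh config, where the provider library path and the certificate filename are assumptions that vary by platform and setup:

```
# ~/.ssh/config -- authenticate with the PIV-resident key plus its issued certificate
Host github.com gitlab.com
    PKCS11Provider /usr/lib/x86_64-linux-gnu/opensc-pkcs11.so
    CertificateFile ~/.ssh/id_piv-cert.pub
    IdentitiesOnly yes
```

The private key never leaves the token; OpenSSH asks the PKCS#11 provider to sign the authentication challenge on-device.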
I think it's interesting they're choosing to use certificates this way. If they're already using certs, why not just leverage SSH CA auth? Also, at the end of the day, it's still effectively a bearer token. I founded a company called Based Security last year in this space, and we're looking for design partners currently. We host a CA for you (or you can host it yourself if you want), use SSH certificates, and bind the user identity (OIDC to the IdP) to a physical device (YubiKey, Secure Enclave, TPM, etc.). This ensures that the user is in possession of the physical device and that the credential can't be stolen without stealing the device, unlike the bearer token examples here. Currently we're offering support for GitHub and GitLab authentication, but it works out of the box with standard ssh tooling as well; it just currently requires manually handling user provisioning for standard ssh access.
Because that has two trusted parties: the IdP and the SSH CA. opkssh has just one trusted party: the IdP.
> This ensures that the user is in possession of the physical device and that the credential can't be stolen without stealing the device, unlike the bearer token examples here. Currently we're offering support for GitHub and GitLab authentication, but it works out of the box with standard ssh tooling as well; it just currently requires manually handling user provisioning for standard ssh access.
That sounds valuable.
Have you looked into OpenPubkey? The cosigner protocol supports binding hardware tokens to ID Tokens. It's not as fancy as having the SSH key pair live in the hardware token, but maybe we could figure out a way to get the best of both worlds.
I can understand the concern about having a second trusted party, but I think the value of utilizing the standard SSH CA auth flow is worth the potential risk. If you require keys in attested hardware and verify that before issuing certs, the actual attack becomes very difficult: you need to compromise the hardware itself, or compromise the CA in a pretty substantial way, to issue certs for untrusted private keys. The certificate alone doesn't actually do anything without the key. In addition to being supported out of the box, we can also issue hardware-bound host keys, which lets us offer bi-directional verification. We gain the benefit of all the standard PKI tooling (e.g. revocation lists, ACME, etc.) and can use the same PKI for other scenarios (e.g. mTLS, PIV, etc.) by issuing x509 certificates instead. Our long-term plan is to move past ssh auth and have an attestable, immovable, hardware-backed identity that can be used for continuous authentication in other areas.
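To make the CA flow concrete, here's a sketch of the standard OpenSSH certificate issuance it builds on. In the setup described, both keys would live in attested hardware and the CA would verify attestation before signing; here both are plain files purely for illustration:

```shell
set -eu
tmp=$(mktemp -d)

# 1. CA key pair -- servers are configured to trust ca.pub
ssh-keygen -q -t ed25519 -f "$tmp/ca" -N '' -C 'demo-ca'

# 2. User key pair -- in the product this is generated on a YubiKey/TPM
#    and attested before the CA ever signs it
ssh-keygen -q -t ed25519 -f "$tmp/user" -N '' -C 'alice'

# 3. Issue a short-lived certificate binding the key to principal "alice"
ssh-keygen -q -s "$tmp/ca" -I 'alice@example' -n alice -V +1h "$tmp/user.pub"

# 4. Inspect it -- the certificate alone is useless without the private key
ssh-keygen -L -f "$tmp/user-cert.pub"
```

On the server side, a single `TrustedUserCAKeys /etc/ssh/ca.pub` line in sshd_config then accepts any unexpired certificate signed by that CA, which is where revocation lists and short validity windows come in.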
I have looked into OpenPubKey briefly in the past but haven't spent a ton of time with it. We were going in a very different direction and it didn't seem particularly useful based on our goals or what we wanted to achieve.
edit:
Looking at the documentation (https://docs.bastionzero.com/openpubkey-ssh/openpubkey-ssh/i...), it seems that to use OpenPubKey you also need a fairly modern version of OpenSSH. It also requires that the authenticating user have sudo access on the machine, which doesn't sound great. It's not clear to me whether the existing authorized_keys file can co-exist, or whether that's just to stop access using existing keys; standard ssh certs will co-exist, allowing a non-binary rollout if there are use cases that need to be worked around.
That documentation refers to a much older and closed source version of opkssh.
> It seems like to use OpenPubKey you also need a fairly modern version of OpenSSH.
On versions of OpenSSH older than 8.1 (2019), you may run into issues if you have a huge ID Token. That shouldn't be a problem for standard-sized ID Tokens, but some enterprise OIDC solutions put the phone book in an ID Token, and we have to care about that.
> It also requires that the user authenticating have sudo access on the machine, which doesn't sound great.
The user authenticating does not need sudo access; you only need sudo to install it, and you need sudo to install most software on servers.
> It's not clear to me whether it's possible for the existing authorized_keys file to co-exist or whether that's just to stop access using existing keys
opkssh works just fine in parallel with authorized_keys. We are using the AuthorizedKeysCommand option in sshd_config, so opkssh functions like an additional authorized_keys file. My recommendation is to keep authorized_keys as a break-glass mechanism.
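Roughly, the sshd_config wiring for an AuthorizedKeysCommand hook like this looks as follows (the binary path, arguments, and service user here are assumptions; opkssh's installer may use different names):

```
# /etc/ssh/sshd_config
# %u = login username, %k = offered public key, %t = key type
AuthorizedKeysCommand /usr/local/bin/opkssh verify %u %k %t
AuthorizedKeysCommandUser opksshuser
```

sshd consults the command in addition to the user's authorized_keys file, which is why the two mechanisms co-exist cleanly.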
I was wondering the same. In my corner of IT, everyone is ditching SAML as 'outdated' in favor of OIDC (usually against Azure). It's pretty painless in comparison and seems to require less maintenance, even though I know the SAML2 spec well.