Yes, on the face of it, the plan is workable. Heat radiation scales linearly with area and with the fourth power of absolute temperature (the Stefan-Boltzmann law), so more area and hotter radiators both help.
It really is as simple as just adding kilometers of radiators. That is, if you ignore the incredible cost of transporting all that mass to orbit and assembling it in space. Because there is quite simply no way to fold up kilometer-scale thermal arrays and launch them in a single vehicle. There will be assembly required in space.
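To see why "just add radiators" means kilometers of hardware, here is a rough sizing sketch using the Stefan-Boltzmann law. All the numbers (1 MW load, 300 K radiator temperature, emissivity 0.9, two-sided panels) are illustrative assumptions, not a real design:

```python
# Rough radiator sizing via the Stefan-Boltzmann law: P = ε·σ·A·T⁴.
# All inputs below are illustrative assumptions, not a real design.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m²·K⁴)

def radiator_area(power_w, temp_k, emissivity=0.9, sides=2):
    """Panel area needed to reject `power_w` watts at radiator temperature `temp_k`."""
    flux = emissivity * SIGMA * temp_k**4  # W/m² radiated per side
    return power_w / (sides * flux)

# A hypothetical 1 MW rack cluster with radiators at 300 K:
area = radiator_area(1e6, 300)
print(f"{area:.0f} m²")  # on the order of 1200 m², for a single megawatt
```

Hyperscale datacenters draw tens to hundreds of megawatts, so the panel area scales into the hundreds of thousands of square meters. The fourth-power term does mean hotter radiators shrink dramatically, but the electronics set a ceiling on how hot you can run.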
All in all, if you ignore all practical reality, yes, you can put a datacenter in space!
Once you engage a single brain cell, it becomes obvious that it is actually so impractical as to be literally impossible.
Microsoft has a long and well documented history of resetting user preferences.
Multiple times I've disabled the Cortana taskbar search widget, only to have a Windows update turn it back on and proudly give me a popup telling me they noticed it was disabled and turned it back on for me.
Microsoft will forcibly re-enable AI features eventually. Again, this is an established pattern for them.
Starting when you selected "don't install Windows 10," and it asked you a few more times, and you kept selecting that, and then you woke up one day to find your computer running Windows 10, which blue-screened when you logged in.
I do not mind the cost, honestly, and a bit slower also works. I just use one older Mac Ultra 2 with 192G of RAM and another machine with an RTX 5060/16G and an R9700/32G. Between those I get my models working fine.
That also gives me full privacy. And that is worth way way way more than any cost.
Microsoft, Google, et al very famously spy on everything you do and have no compunctions about handing that data to the US government, regardless of whether the person is a US citizen.
Take this idea one step further. Microsoft, Google, et al also snoop on what foreign governments do with their software and report back to USGov.
> and have no compunctions about handing that data to the US government,
Every government can and will compel companies within their jurisdiction to hand over data for legal cases.
Don’t think that this is a uniquely American property. If your data sits on servers within the control of any company that operates in a country, that country can and will apply legal pressure upon those companies to extract the data.
> Every government can and will compel companies within their jurisdiction to hand over data for legal cases.
I'm not sure of your point. This is an excellent argument for why the French government should run their government videoconferencing and chat on infrastructure in France, as they plan to do, isn't it? Using software that they have vetted. Regardless of whether this is a "uniquely American" thing or not.
Right. I’m not disagreeing with that. A country should run their official business on tools that aren’t trivially liable for extraction by foreign governments.
The point was in response to the above comment. All governments can and will compel companies to turn over data. It’s often framed on HN as a feature of only American companies but it’s actually universal.
It happens that the major tech platforms are all US-based, so US government policy is more relevant to discuss than any other's, even if all governments behave this way.
But, in addition, the US government has recently become more pushy and less friendly than it was before, which is prompting many other nations to re-assess their dependence on the tech of what was until recently a close ally. The headline is an example.
It seems to me more about "this foreign government is most relevant" than "only this foreign government is like that".
Are you sure about that? All the normies use streaming services for music and movies. Techies around here tend to too. The normies don't know about and can't work torrents. They can't even work their own file system. The techies decry it as "inconvenient".
Yes, but the added mass makes it prohibitively expensive. Shielding is heavy, every kilogram of added payload demands many more kilograms of propellant, and the propellant requirement grows exponentially with the delta-v you need.
The rocket equation will kick your ass every time.
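The "kick your ass" part is the Tsiolkovsky rocket equation, Δv = v_e · ln(m₀/m_f): the mass ratio grows exponentially with Δv. A quick sketch, using illustrative numbers (~9.4 km/s of Δv to reach LEO, a vacuum Isp of ~350 s) and ignoring dry structural mass entirely, which makes real rockets even worse:

```python
# Tsiolkovsky rocket equation: Δv = v_e · ln(m0 / mf).
# Numbers here are illustrative; dry structural mass is ignored.
import math

def propellant_per_kg_payload(delta_v, exhaust_velocity):
    """Propellant mass (kg) needed per kilogram delivered, structure ignored."""
    mass_ratio = math.exp(delta_v / exhaust_velocity)  # m0 / mf
    return mass_ratio - 1.0

ve = 350 * 9.81                      # effective exhaust velocity, m/s (Isp ≈ 350 s)
prop = propellant_per_kg_payload(9400, ve)
print(f"{prop:.1f} kg of propellant per kg of payload")
```

Even in this optimistic no-structure sketch, each kilogram of shielding costs roughly an order of magnitude more in propellant, and the exponential means every extra bit of Δv compounds the penalty.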
The problems of datacenters in space and of knowledge preservation/disaster redundancy are entirely disjoint.
Datacenters in space have a lifespan measured in years. Single-digit years. Communicating with such an installation requires relatively advanced technology. In an extinction level crisis, there will be extremely little chance of finding someone with the equipment, expertise, and power to download bulk data. And don't forget that you have less than a decade to access this data before the constellation either fails or deorbits.
Meanwhile, people who actually care about preserving knowledge through a doomsday crisis have created film reels containing a dump of GitHub and enough preamble that civilizations in the far future could reconstruct an x86 machine from scratch. These are stored in Arctic permafrost on Earth (the GitHub Arctic Code Vault in Svalbard).
We've also launched (something like) a microfilm dump of knowledge to the moon which can be recovered and read manually any time within the next several hundred or thousand years.
Datacenters in space don't solve any of the problems posed because they simply will not last long enough.