Hint: every software project at every company runs on this sort of ridiculous popularity contest system; the rules of the game are just not publicized.
Creator of Matchlock here. Mostly for performance and usability. For interacting with external APIs like GCP or GitHub that generally have huge surface area, it's much more token-efficient and easier to set up if you just give the agent gcloud and gh CLI tools and the secrets to use them (in our case fake ones), compared to wiring up a full-blown MCP server. Plus, agents tend to perform better with CLI tools since they've been heavily RL'd on them.
Sometimes people are too lazy to write their own agent loop and decide to run an off-the-shelf coding agent (e.g. Claude Code, or Pi in the case of clawdbot) in the environment.
"Exactly" is impossible: there are multiple Lidar samples that would map to the same camera sample. But what training would do is build a model that could infer the most likely Lidar representation from a camera representation. There would still be cases where the most likely Lidar for a camera input isn't a useful/good representation of reality, e.g. a scene with very high dynamic range.
This is not true; there is just a specific protocol you have to follow to do the building work. Yes, it increases costs, but it doesn't outright prevent it.
Solar modules you can buy for your house usually have power ratings quoted at STC, or Standard Test Conditions, which are based on insolation at Earth's surface.
STC uses an irradiance of 1000 W/m²; in space it seems like you get closer to 1400 W/m². That's definitely better, but also not enormously better.
It also seems they are rated at 25°C. I am certainly not a space engineer, but that seems kind of temperate for space, where cooling is more of a challenge.
Seems like it might balance out to more like 1.1x to 1.3x more power in space?
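A rough sanity check of that estimate. All the numbers here are illustrative assumptions: ~1361 W/m² for the solar constant above the atmosphere, a typical silicon temperature coefficient of about -0.35%/°C, and guessed in-sunlight cell temperatures.

```python
# Back-of-envelope: instantaneous panel output in space relative to STC.
# All inputs are assumptions for illustration, not datasheet values.

STC_IRRADIANCE = 1000.0    # W/m^2, Standard Test Conditions
SPACE_IRRADIANCE = 1361.0  # W/m^2, approximate solar constant above the atmosphere
TEMP_COEFF = -0.0035       # fractional power loss per degree C above 25C (typical silicon)
STC_TEMP_C = 25.0

def relative_power(irradiance_w_m2, cell_temp_c):
    """Output relative to STC, assuming linear irradiance and temperature response."""
    irradiance_gain = irradiance_w_m2 / STC_IRRADIANCE
    temp_derate = 1.0 + TEMP_COEFF * (cell_temp_c - STC_TEMP_C)
    return irradiance_gain * temp_derate

# Guessed cell temperatures under radiative-only cooling in full sun:
hot = relative_power(SPACE_IRRADIANCE, 60.0)
cool = relative_power(SPACE_IRRADIANCE, 40.0)
print(f"space vs STC, cells at 60C: {hot:.2f}x")   # -> 1.19x
print(f"space vs STC, cells at 40C: {cool:.2f}x")  # -> 1.29x
```

Which lands right in the 1.1x-1.3x range, before accounting for weather and the day/night cycle on the ground.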
Satellites can adjust attitude so that the panels are always normal to the incident rays for maximum energy capture. And no weather/dust.
You also don't usually use the same exact kind of panels as terrestrial solar farms. Since you are going to space anyway, you spend the extra money to get the highest possible efficiency in terms of W/kg. Terrestrial installs usually optimize for W/$ of nameplate capacity, or LCOE, which also includes installation and other costs.
For one or a few-off expensive satellites that are intended to last 10-20 years, then yes. But in this case the satellites will be more disposable and the game plan is to launch tons of them at the lowest cost per satellite and let the sheer numbers take care of reliability concerns.
It is similar to the biological tradeoff of having a few offspring and investing heavily in their safety and growth vs having thousands of offspring and investing nothing in their safety and growth.
I think it's because at this scale a significant limit becomes the global production capacity for solar cells, and SpaceX is in the business of cheaper satellites and launch.
You don't even need a particularly large scale, it's efficient resource utilization.
Humanity has a finite (and too small) capacity for building solar panels. AI requires lots of power already. So the question is, do you want AI to consume X (where X is a pretty big chunk of the pie), or five times X, from that total supply?
Using less PV is great, but only if the total cost ends up cheaper than installing 5x the capacity in terrestrial PV farms, along with daily-smoothing batteries.
SpaceX is only skating to where they predict the cost puck will be.
That's still a smaller ratio than the ~4X gain in irradiance over LEO. But if you're doing it at scale you could use orbital tugs with ion drives or something, and use much less fuel per transfer.
It's probably not competitive at all without having fully reusable launch rockets, so the cost to LEO is a lot lower.
8 tons over 22 is a little over a third of the original payload to LEO. If ~4x the solar generation potential (not irradiance: the sun is not 4x brighter in space at Earth's orbital radius) is the reward, that's putting an incredible premium on at least a 3x multiplier on launch cost per kg (and likely worse, since you're also inheriting a harsher radiation environment).
But those two parameters are not equal: 3x the cost per kg is a much bigger factor than 4x the solar power.
Here in Maine in the depths of winter (late December), 1 m² of ground can collect about 4 kWh per day (weird units).
That's why people are trying to build solar here. Our power is expensive due partially to failing to build basically any new generation, and some land is very cheap, and the operational cost of a solar farm is minuscule.
Solar farming is basically an idle game in real life and my addiction is making me itchy.
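Taking that 4 kWh/m²/day figure at face value, here's what it implies in panel output. The ~20% module efficiency is my assumption (typical for modern silicon), not from the comment.

```python
# What does 4 kWh/m^2/day of insolation mean for a panel sitting on that ground?
# Module efficiency is an assumed typical value for modern silicon.

INSOLATION_KWH_M2_DAY = 4.0  # quoted winter figure for Maine
MODULE_EFFICIENCY = 0.20     # assumed

electric_kwh_m2_day = INSOLATION_KWH_M2_DAY * MODULE_EFFICIENCY
peak_sun_hours = INSOLATION_KWH_M2_DAY  # since STC peak is 1 kW/m^2
capacity_factor = peak_sun_hours / 24.0

print(f"{electric_kwh_m2_day:.2f} kWh of electricity per m^2 per day")  # -> 0.80
print(f"~{peak_sun_hours:.0f} peak sun hours -> {capacity_factor:.0%} capacity factor")  # -> ~17%
```

A ~17% capacity factor in the depths of a Maine winter is not bad at all for a generator with near-zero operating cost.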
You can overprovision, and you should with how stupidly cheap solar is.
That we aren't spending billions of Federal dollars building solar anywhere we can, as much as we can, is pathetic and stupid and a national tragedy.
We got so excited about dam building that there's nowhere to build useful dams anymore, and there is significant value to be gained by removing those dams, yet somehow we aren't deploying as much solar as we possibly can?
It's a national security issue. China knows this, and is building appropriately.
The southwest should be generating so much solar power that we sequester carbon from the atmosphere simply because there is nothing else left to do with the power.
> And then there’s that pesky night time and those annoying seasons.
The two options there are cluttering up the dawn-dusk polar orbit even more, or going to high Earth orbit so that you stay out of the shadow of the Earth... and geostationary orbits are also in rather high demand.
In the world where it makes $8-10B in profit on $90-95B in revenue every year. Whatever price investors choose to trade the stock at is irrelevant to those numbers.
The US mandates by law that we grow a fuck ton of corn to mix 10% ethanol into gasoline.
If you replaced just those cornfields with solar/wind, they would power the entire USA and a 100% electric vehicle fleet. That includes the fact that they are in the corn belt, with less than ideal sun conditions.
We aren’t even talking about any farmland that produces actual food or necessary goods, just ethanol as a farm subsidy program.
The US is already horrendously bad at land use. There’s plenty of land. There’s plenty of ability to build more grid capacity.
Solar in space is a very different energy source in terms of required infrastructure: you don't need batteries, the efficiency is much higher, cooling scales with surface area (radiative cooling works better into vacuum than through an atmosphere), and there's no weather or day/night cycle. It's a very elegant idea if someone can get it working.
The panels suffer radiation damage they don't suffer on Earth. If this is e.g. the same altitude orbits as Starlink, then the satellites they're attached to burn up after around a tenth of their ground-rated lifetimes. If they're a little higher, then they're in the Van Allen belts and take a much higher radiation dose. If they're a lot higher, the energy cost to launch is way more.
If you could build any of this on the Moon, that would be great. Right now, I've heard of no detailed plans to do more with moon rock than use it as aggregate for something else, which means everyone is about as far from making either a PV or compute factory out of moon rock as the residents of North Sentinel Island are.
OK, perhaps that's a little unfair: we do actually know what the Moon is made of and they don't. But it's a really big research project just to figure out how to make anything there at all, let alone a factory whose output is cost-competitive with launching from Earth, despite the huge cost of launching from Earth.
> The panels suffer radiation damage they don't suffer on Earth.
I don't think this is true. Starlink satellites have an orbital lifetime of 5-7 years, and GPUs themselves are much more sensitive to radiation damage than solar panels. I'd guess the limiting factor is GPU lifetime, so as long as your energy savings outpace the somewhat faster GPU depreciation (maybe from 5 years down to 3) plus the cost of launch, it would be economical.
I've said this elsewhere, but based on my envelope math, the cost of launch is the main bottleneck and I think considerably more difficult to solve than any of the other negatives. Even shielding from radiation is a weight issue. Unfortunately all the comments here on HN are focused on the wrong, irrelevant issues like talking about convection in space.
> I don't think this is true, Starlink satellites have an orbital lifetime of 5-7 years,
That's better than I thought, but it still means their PV only lasts on the order of 20% of its ground lifespan, so the integrated lifetime energy output per unit mass of PV isn't meaningfully improved by locating it in space, even if it were launched by an efficient electromagnetic system rather than by a rocket.
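A toy version of that lifetime-energy argument, with all inputs assumed for illustration (100 W/kg panels, a 25% terrestrial capacity factor, full-time sun in orbit, and a 6-year orbital life in the middle of the quoted 5-7 range):

```python
# Integrated lifetime energy per kg of PV, ground vs. orbit.
# Every number here is an illustrative assumption, not measured data.

def lifetime_energy_kwh_per_kg(w_per_kg, capacity_factor, years):
    """Total energy a panel produces over its life, per kg of panel mass."""
    hours = years * 365.0 * 24.0
    return w_per_kg * capacity_factor * hours / 1000.0

ground = lifetime_energy_kwh_per_kg(w_per_kg=100.0, capacity_factor=0.25, years=25.0)
space = lifetime_energy_kwh_per_kg(w_per_kg=100.0, capacity_factor=1.0, years=6.0)

print(f"ground: {ground:.0f} kWh per kg over 25 years")  # -> 5475
print(f"space:  {space:.0f} kWh per kg over 6 years")    # -> 5256
```

Under these assumptions the 4x-ish capacity factor advantage in orbit is almost exactly cancelled by the ~4x shorter lifetime, before launch costs are even counted.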
> Also NIOSH-approved with 3MTM Canister CP3N for use against CS, CN and as a P100 filter (TC-14G-0251) in riot conditions, including those with teargas (non-CBRN).