__alexs's comments | Hacker News

Why not just install it from Steam?

It is actually one of the most alive and welcoming spaces in London.

I will never contribute to a project that runs on this sort of ridiculous popularity contest system.

Hint: every software project at every company runs on this sort of ridiculous popularity contest system; the rules of the game are just not publicized.

Yeah but I get paid for that.

Why would secrets ever need to be available to the agent directly rather than hidden inside the tool calling framework?

Creator of Matchlock here. Mostly for performance and usability. For interacting with external APIs like GCP or GitHub that generally have huge surface area, it's much more token-efficient and easier to set up if you just give the agent gcloud and gh CLI tools and the secrets to use them (in our case fake ones), compared to wiring up a full-blown MCP server. Plus, agents tend to perform better with CLI tools since they've been heavily RL'd on them.

That doesn't add up to me at all. Agents are RL'd on tool usage just as hard, and you can provide an "authed API call" tool for whatever you want.
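To make that concrete, here's a minimal sketch of such a tool (hypothetical names, not any particular framework's API): the secret lives inside the tool function, and only response text ever crosses the agent boundary.

  import os
  import requests

  TOKEN = os.environ["GITHUB_TOKEN"]  # loaded by the harness, never shown to the model

  def authed_api_call(method, url, body=None):
      """Tool exposed to the agent: it chooses method/url/body, never sees TOKEN."""
      resp = requests.request(
          method, url, json=body,
          headers={"Authorization": f"Bearer {TOKEN}"},
          timeout=30,
      )
      return resp.text[:4000]  # truncate so responses stay token-cheap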

Token efficiency is a good argument actually.

Sometimes people are too lazy to write their own agent loop and decide to run an off-the-shelf coding agent (e.g. Claude Code, or Pi in the case of clawdbot) in the environment.

Exactly.

> you should theoretically be able to take the output of the Lidar + Cameras model and use it as training data for a Camera only model.

Why should you be able to do that, exactly? Human vision is frequently tricked by its lack of depth data.


"Exactly" is impossible: there are multiple Lidar samples that would map to the same camera sample. But what training would do is build a model that could infer the most likely Lidar representation from a camera representation. There would still be cases where the most likely Lidar for a camera input isn't a useful/good representation of reality, e.g. a scene with very high dynamic range.

This is not true; there is just a specific protocol you have to follow to do the building work. Yes, it increases costs, but it doesn't explicitly prevent them.

Solar cells have exactly the same power rating on Earth as in space, surely? What would change is their capacity factor, and so their energy generation.

Solar modules you can buy for your house usually have power ratings quoted at STC, or Standard Test Conditions, which are based on insolation at Earth's surface.

https://wiki.pvmet.org/index.php?title=Standard_Test_Conditi...

So, a "400W panel" is rated to produce 400W at standard testing conditions.

I'm not sure how relevant that is to the numbers being thrown around in this thread, but thought I'd provide context.


That's super interesting.

STC uses an irradiance of 1000 W/m2; in space it seems like you get closer to 1400 W/m2. That's definitely better, but also not enormously better.

They also seem to be rated at 25°C. I am certainly not a space engineer, but that seems kind of temperate for space, where cooling is more of a challenge.

Seems like it might balance out to more like 1.1x to 1.3x more power in space?
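Rough arithmetic behind that guess (the temperature coefficient is a typical crystalline-silicon figure; the in-orbit cell temperature is a pure assumption):

  stc_irradiance = 1000.0    # W/m2 at Standard Test Conditions
  space_irradiance = 1361.0  # W/m2, the solar constant above the atmosphere
  temp_coeff = -0.0035       # per deg C, typical crystalline silicon
  cell_temp = 60.0           # deg C in orbit, assumed; depends on thermal design

  gain = space_irradiance / stc_irradiance      # ~1.36x from irradiance
  derate = 1 + temp_coeff * (cell_temp - 25.0)  # ~0.88x from running hot
  print(gain * derate)                          # ~1.2x net, inside the 1.1-1.3x band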


Satellites can adjust attitude so that the panels are always normal to the incident rays for maximum energy capture. And no weather/dust.

You also don't usually use the same exact kind of panels as terrestrial solar farms. Since you are going to space, you spend the extra money to get the highest possible efficiency in terms of W/kg. Terrestrial usually optimizes for W/$ of nameplate capacity and for LCOE, which also includes installation and other costs.


For one or a few-off expensive satellites that are intended to last 10-20 years, then yes. But in this case the satellites will be more disposable and the game plan is to launch tons of them at the lowest cost per satellite and let the sheer numbers take care of reliability concerns.

It is similar to the biological tradeoff of having a few offspring and investing heavily in their safety and growth vs having thousands of offspring and investing nothing in their safety and growth.


The atmosphere is in the way, and they get pretty dirty on Earth. Also, it doesn't rain or get cloudy in space.

Sure but like, just use even more solar panels? You can probably buy a lot of them for the cost of a satellite.

The cost of putting them up there is a lot more than the cost of the cells

  >just use even more solar panels
I think it's because at this scale a significant limit becomes the global production capacity for solar cells, and SpaceX is in the business of cheaper satellites and launch.

“This scale” is not realistic in terms of demand or even capability. We may as well talk about mining Sagittarius A* for neutrons.

You don't even need a particularly large scale, it's efficient resource utilization.

Humanity has a finite (and too small) capacity for building solar panels. AI requires lots of power already. So the question is, do you want AI to consume X (where X is a pretty big chunk of the pie), or five times X, from that total supply?

Using less PV is great, but only if the total cost ends up cheaper than installing 5X the capacity as terrestrial PV farms, along with daily-smoothing batteries.

SpaceX is only skating to where they predict the cost puck will be.
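The comparison, as a sketch (hypothetical function, every cost a placeholder):

  def space_pv_cheaper(space_cost_per_w, ground_cost_per_w, battery_cost_per_w):
      # Space needs ~1/5 the nameplate capacity (near-constant sun);
      # terrestrial needs ~5x capacity plus daily-smoothing batteries.
      return space_cost_per_w < 5 * ground_cost_per_w + battery_cost_per_w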


You seem to be ignoring the substantial resource cost of putting them up there.

I'm not ignoring it, I'm just realizing that it's on a substantial cost decline curve.

SpaceX knows that too, obviously.


And in geostationary, the planet hardly ever gets in the way. They get full sun 99.5% of the year.

Boosting to geostationary orbit knocks a big chunk out of your payload capacity. Falcon 9 expendable will do 22 tons to LEO and about 8 tons to GTO.

That's still a smaller ratio than the ~4X gain in irradiance over LEO. But if you're doing it at scale you could use orbital tugs with ion drives or something, and use much less fuel per transfer.

It's probably not competitive at all without having fully reusable launch rockets, so the cost to LEO is a lot lower.


8 tons out of 22 is a little over a third of the original payload to LEO. If 4x the solar generation potential (not irradiance; the sun is not 4x brighter in space at Earth's orbital radius) is the reward, that's putting an incredible premium on a ~3x multiplier on launch costs per kg (at minimum; likely higher, since you're also inheriting a worse radiation environment).
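Putting numbers on that tradeoff (Falcon 9 figures from upthread; the 4x is the claimed average-power gain, not literal irradiance):

  leo_payload = 22.0  # tons, Falcon 9 expendable to LEO
  gto_payload = 8.0   # tons, Falcon 9 expendable to GTO

  cost_multiplier = leo_payload / gto_payload  # ~2.75x launch cost per kg to GTO
  power_gain = 4.0                             # claimed average-power gain over LEO
  print(power_gain / cost_multiplier)          # ~1.45x net, before radiation penalties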

But those two parameters are not equal: 3x the cost per kg is a much higher number than 4x the solar power.


My response is already contained in my comment above, in the sentences after the first.

Even at 10% (say, putting it on some northern pile of snow) it is still cheaper to put it on Earth than to launch it.

Would you prefer big tech to piss their waste heat into your rivers, soils, and atmosphere?

Or would you prefer them to go to the bathroom upstairs?

At some point big tech is in a "damned if you do, damned if you don't" situation...


Here in Maine in the depths of winter (late December), 1 m^2 of ground can collect 4 kWh per day (weird units).
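(That works out to roughly 167 W/m2 average, or about a 17% capacity factor against the 1000 W/m2 STC rating:)

  daily_kwh = 4.0                 # kWh per m2 per day, midwinter Maine
  avg_w = daily_kwh * 1000 / 24   # ~167 W/m2 average
  capacity_factor = avg_w / 1000  # ~0.17 against STC irradiance
  print(avg_w, capacity_factor)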

That's why people are trying to build solar here. Our power is expensive due partially to failing to build basically any new generation, and some land is very cheap, and the operational cost of a solar farm is minuscule.

Solar farming is basically an idle game in real life and my addiction is making me itchy.

You can overprovision, and you should with how stupidly cheap solar is.

That we aren't spending billions of Federal dollars building solar anywhere we can, as much as we can, is pathetic and stupid and a national tragedy.

We got so excited about dam building that there's nowhere to build useful dams anymore, and there is significant value to be gained by removing those dams, yet somehow we aren't deploying as much solar as we possibly can?

It's a national security issue. China knows this, and is building appropriately.

The southwest should be generating so much solar power that we sequester carbon from the atmosphere simply because there is nothing else left to do with the power.


I don't disagree

I'm all for efficiency, but I would think a hailstorm of space junk hits a lot harder than one of ice out on the farm.

Except it doesn't melt like regular hail, so when further storms come up you could end up being hit by the same hail more than once :\


Atmospheric derating brings insolation from about 1.367 kW/m2 to about 1.0 kW/m2.

And then there’s that pesky night time and those annoying seasons.

It’s still not even remotely reasonable, but it’s definitely much higher in space.
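Back-of-envelope on how those effects stack (the ground capacity factor is an assumption):

  space_avg = 1.367 * 0.99       # kW/m2: near-continuous sun in a well-chosen orbit
  ground_avg = 1.0 * 0.20        # kW/m2: ~20% capacity factor after night/seasons/weather
  print(space_avg / ground_avg)  # ~6.8x average power per m2 in space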


> And then there’s that pesky night time and those annoying seasons.

The two options there are cluttering up the dawn-dusk polar orbit more, or going to high Earth orbit so that you stay out of the shadow of the Earth... and geostationary orbits are also in rather high demand.


Put them super super far away and focus all the energy into one very narrow death laser that we trust the tech company to be careful with.

Tesla has a P/E in the hundreds and a profit-to-market-cap ratio of ~0.3%. In what world is this "very" profitable?

In the world where it makes $8-10B in profit on $90-95 billion in revenue every year. Whatever price investors choose to trade the stock at is irrelevant to those numbers.

It's actually down to $3.8B in profit now, and will be losing money within a year at the rate it's been losing profitability.

2% net return on assets is garbage.
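(The two framings are reciprocals of each other; with round numbers:)

  market_cap = 1.3e12                    # dollars, rough
  profit = 3.8e9                         # dollars per year, figure from above
  earnings_yield = profit / market_cap   # ~0.003, i.e. ~0.3% profit-to-market-cap
  print(earnings_yield, 1 / earnings_yield)  # ~0.3%, P/E ~340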

Just put a slightly larger solar array on the same equipment on earth?

> put a slightly larger solar array on the same equipment on earth?

Land and permitting. I'm not saying the math works, just that there are envelopes in which it could.


The math literally works.

The US mandates by law that we grow a fuck ton of corn to mix 10% ethanol into gasoline.

If you replaced just those cornfields with solar/wind, they would power the entire USA and a 100% electric vehicle fleet. That's even accounting for the fact that they're in the corn belt, with less than ideal sun conditions.

We aren’t even talking about any farmland that produces actual food or necessary goods, just ethanol as a farm subsidy program.

The US is already horrendously bad at land use. There’s plenty of land. There’s plenty of ability to build more grid capacity.
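Very rough numbers behind that claim (all three inputs are ballpark assumptions):

  ethanol_acres = 30e6    # ~30M acres of corn grown for ethanol, rough
  mw_per_acre = 0.3       # nameplate MW of utility solar per acre, typical
  capacity_factor = 0.20  # corn-belt sun, conservative

  avg_gw = ethanol_acres * mw_per_acre * capacity_factor / 1000
  print(avg_gw)           # ~1800 GW average vs roughly 500 GW average US demand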


Hello, fellow Technology Connections watcher?

You know it!

There is practically infinite land in which to build a datacenter.

> There is practically infinite land in which to build a datacenter

This is absolutely not true. I've worked on some of this stuff. Permitting takes months, which in dollar terms pays for launch costs ten-fold.


Solar in space is a very different energy source in terms of required infrastructure. You don't need batteries, the efficiency is much higher, cooling scales with surface area (radiative cooling works better into a vacuum than through an atmosphere), and there are no weather or day/night cycles. It's a very elegant idea if someone can get it working.

Only if you also disregard all the negatives.

The panels suffer radiation damage they don't suffer on Earth. If this is e.g. the same altitude orbits as Starlink, then the satellites they're attached to burn up after around a tenth of their ground-rated lifetimes. If they're a little higher, then they're in the Van Allen belts and take a much higher radiation dose. If they're a lot higher, the energy cost to launch is way more.

If you could build any of this on the moon, that would be great; right now, I've heard of no detailed plans to do more with moon rock than use it as aggregate for something else, which means everyone is about as far from making either a PV or compute factory out of moon rock as the residents of North Sentinel Island are.

OK, perhaps that's a little unfair, we do actually know what the moon is made of and they don't, but it's a really big research project just to figure out how to make anything there right now, let alone making a factory that could make them cost-competitive with launching from Earth despite the huge cost of launching from Earth.


> The panels suffer radiation damage they don't suffer on Earth.

I don't think this is true. Starlink satellites have an orbital lifetime of 5-7 years, and GPUs themselves are much more sensitive to rad damage than solar panels. I'd guess the limiting factor is GPU lifetime, so as long as your energy savings outpace the slightly faster GPU depreciation (maybe from 5 -> 3 years) plus the cost of launch, it would be economical.

I've said this elsewhere, but based on my envelope math, the cost of launch is the main bottleneck and I think considerably more difficult to solve than any of the other negatives. Even shielding from radiation is a weight issue. Unfortunately all the comments here on HN are focused on the wrong, irrelevant issues like talking about convection in space.
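The envelope I mean looks roughly like this (every input is a placeholder):

  gpu_cost = 30000.0  # $ per GPU, assumed
  life_ground = 5.0   # years of useful life on the ground
  life_space = 3.0    # years, assuming radiation-shortened life
  kw_per_gpu = 1.0    # kW incl. overhead, assumed
  grid_price = 0.08   # $ per kWh on the ground, assumed

  extra_depreciation = gpu_cost / life_space - gpu_cost / life_ground  # $/yr
  energy_saved = kw_per_gpu * 24 * 365 * grid_price                    # $/yr
  # Positive means there's budget left over to pay for launch:
  print(energy_saved - extra_depreciation)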


> I don't think this is true, Starlink satellites have an orbital lifetime of 5-7 years,

That's better than I thought, but still means their PV is only lasting on the order of 20% of their ground lifespans, so the integrated lifetime energy output per unit mass of PV isn't meaningfully improved by locating them in space, even if they were launched by an efficient electromagnetic system rather than by a rocket.
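For the integrated-lifetime comparison (ground capacity factor and both lifetimes assumed):

  space_avg_w = 1367 * 0.99   # W/m2, near-continuous sun
  ground_avg_w = 1000 * 0.20  # W/m2, ~20% capacity factor

  life_space = 6.0            # years, Starlink-like orbital lifetime
  life_ground = 30.0          # years, typical warranty horizon

  print(space_avg_w * life_space)    # ~8100 W-years per m2
  print(ground_avg_w * life_ground)  # ~6000 W-years per m2: only ~1.35x apart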


A dawn-dusk sun-synchronous orbit means there's no nighttime for satellites in that orbit.

Article is actually wrong and they do provide some advice on this.

https://multimedia.3m.com/mws/media/339742O/3m-full-facepiec...

> Also NIOSH-approved with 3MTM Canister CP3N for use against CS, CN and as a P100 filter (TC-14G-0251) in riot conditions, including those with teargas (non-CBRN).

