
Calling Nvidia niche feels a bit wild given their status quo right now, but from a foundry perspective, it seems true. Apple is the anchor tenant that keeps the lights on across 12 different mature and leading-edge fabs.

Nvidia is the high-frequency trader hammering the newest node until the arb closes. Stability usually trades at a discount during a boom, but Wei knows the smartphone replacement cycle is the only predictable cash flow. Apple is smart. If the AI capex cycle flattens in late '27 as models hit diminishing returns, does Apple regain pricing power simply by being the only customer that can guarantee wafer commits five years out?





So let's say TSMC reciprocated Apple's consistency as a customer by giving them preferential treatment for capacity. It's good business after all.

However, everyone knows that good faith reciprocity at that scale is not rewarded. Apple is ruthless. There are probably thousands of untold stories of how hard Apple has hammered its suppliers over the years.

While Apple has good consumer brand loyalty, they arguably treat their suppliers relatively poorly compared to a gold standard like Costco.


At this scale and volume, it's not really about good faith.

Changing fabs is non-trivial. If they pushed Apple to a point where they had to find an alternative (which is another story) and Apple did switch, they would have to work extra hard to get them back in the future. Apple wouldn't want to invest twice in changing back and forth.

On the other hand, TSMC knows that changing fabs is not really an option and Apple doesn't want to do it anyway, so they have leverage to squeeze.

At this level, everyone knows it's just business and it comes down to optimizing long-term risk/reward for each party.


Apple has used both Samsung and TSMC for its chips in the past. Until the A7 it was Samsung, A8 was TSMC, and the A9 was dual-sourced by both! Apple is used to switching between suppliers fairly often for a tech company; it's not that it's too hard for them to switch fab, it's that TSMC is the only competitive fab right now.

There are rumours that Intel might have won some business from them in two years' time. I could totally see Apple turning to Intel for the Mac chips, since they're much lower volume. I know it sounds crazy, we just got rid of Intel, but I'm talking about using Intel as a fab, not going back to x86. Those days are done.


But wasn't the reason they split with Samsung that, in Jobs's view, they copied the iPhone (to which he reacted with thermonuclear threats)?

They did have the expertise to build it, after all. What would happen if TSMC now built an M1 clone? I doubt this is a way anyone wants to go, but it seems to me an implied threat that is priced in.


Jobs's thermonuclear threats were about Android and Google, not Samsung, because Schmidt was on Apple's board during the development of Android.

> "I will spend my last dying breath if I need to, and I will spend every penny of Apple’s $40 billion in the bank, to right this wrong. I’m going to destroy Android, because it’s a stolen product. I’m willing to go thermonuclear war on this."

The falling out with Samsung was related, but was more about the physical look of the phone.


> because it’s a stolen product

This is funny coming from Jobs.


> good artists copy, great artists steal - pablo picasso

- steve jobs


Oh shit I didn't know he was on the board, I thought the story was just that they decided to be a competitor.

So, Schmidt had inside knowledge before following Apple into the smartphone category? That makes the vengeful fury less unhinged.

Sounds like those $40b did not end up running out.


If Samsung (or any other fab) were to make Apple chips they wouldn’t learn anything that a good microscope couldn’t already tell them.

Samsung still makes the displays and the cameras for most iPhones. They continued to do business even while engaged in legal action. That they are still competitors won't stop them doing business when it suits them. Business doesn't care about pride or loyalty; only money.


I believe just looking at a chip does not enable you to make such a chip, otherwise China would not be behind.

TSMC already makes them in their labs. They could tweak a few things, claim it is novel and just sell to the competition. (Apple would fight back of course with all they have, and TSMC's reputation would take damage.)


Looking at a chip makes it easier, but it still takes millions (or billions in the case of a CPU) of dollars for engineers to figure it all out. That doesn't give you an understanding of what was done or why, so 2-3 years later you can make that chip, but they have now moved on to a faster/better version and you are behind. And of course, if you try this, Apple (or whoever you copy) will have plenty of engineers who can look at your chip and in just a few hours decide there is enough to have lawyers sue you for the copy.

China already has plenty of engineers who can make a chip, and experience with making CPUs. ARM licenses a lot of useful things for making a CPU (I don't know what). They would be better off in the long run making the chips they already understand better. Which is something they are doing. It takes longer and costs more, but because they understand them, they can also customize the next chip for something they think is good - if they are right, they can be ahead of everyone else.

What China is lacking is the fabs to make a CPU. They have made good progress in building them, but there is a lot of technology that isn't in the chip that is needed to make a chip.


It took Cerebras less than a billion to get to where they are now; CPUs are not that hard. You would probably be able to reverse engineer one for ~100 million.

Doesn't seem likely, TBH. Nevermind the legal agreements they would be violating, TSMC fabs Qualcomm's Snapdragon line of ARM processors. The M1 is good, but not that good (it's a couple generations old by this point, for one). Samsung had a phone line of their own to put it in as well. TSMC does not.

>They did have the expertise to build it, after all. What would happen if TSMC now built an M1 clone

What do you mean by cloning? An exact copy of Apple SOC? What would that be useful for?

There are already other ARM SOCs that are as performant as Apple's, according to benchmarks.


I thought Intel was too far behind on their process nodes?

At the end of the month, laptops with Intel's latest processors will start shipping. These use Intel's 18A process for the CPU chiplet. That makes Intel the first fab to ship a process using backside power delivery. There's no third party testing yet to verify if Intel is still far behind TSMC when power, performance and die size are all considered, but Intel is definitely making progress, and their execs have been promising more for the future, such as their 14A process.

I did say in two years. Intel can still fail the validation along the way.

>Apple has used both Samsung and TSMC for its chips in the past. Until the A7 it was Samsung, A8 was TSMC, and the A9 was dual-sourced by both! Apple is used to switching between suppliers fairly often for a tech company; it's not that it's too hard for them to switch fab, it's that TSMC is the only competitive fab right now.

This is false. Samsung competes with Apple on smartphones. Apple even filed a lawsuit against Samsung over smartphones.

Apple moved to TSMC because of a simple question: how can you trust a competitor to make the chips containing your phone's core IP?

>I could totally see Apple turning to Intel for the Mac chips

I could totally see Apple being wary of turning their core IP over to Intel.


Which bit is false? Samsung definitely did manufacture Apple chips.

Common manufacturer: Samsung [2]

https://en.wikipedia.org/wiki/TSMC

Apple A6, which is fabricated with Samsung's 32 nm HKMG (high-k dielectric, metal gate) CMOS process

https://www.ifixit.com/Teardown/Apple+A6+Teardown/10528


TSMC holds the real power. Apple’s stability and Nvidia’s cash both matter but AI demand is distorting the entire semiconductor ecosystem. There are no easy exits. Building fabs, switching suppliers or waiting out the cycle all carry massive risk.

In the long run, competition (whether via Intel, Samsung or geopolitical diversification) is the only path that benefits anyone other than TSMC.


Trust comes first. That's why TSMC is a pure play fab. Unless there's something that can 100% guarantee protection for fabless players like Apple, no one will trust Samsung or Intel.

Fabless players' IPs are their entire business.

It'll be hard to trust Intel given Intel's past behavior, especially against AMD.


Hasn't Apple recently made a deal with Intel?

Trust is not binary — it is a spectrum.

Anyone making a claim that trust will be 0% based on a single thing is obviously oversimplifying the situation. Trust is built on behavior, reputation, time, repeatability, etc.

Trust is subjective and relative. If Alice doesn’t trust Eve, that doesn’t automatically mean that Bob doesn’t trust Eve. That usually requires both Alice and Bob to have similar experiences, or Bob must have a trust relationship with Alice.


Trust also changes over time. One CEO change and a company can change overnight, causing all trust to evaporate. Normally CEOs are aware of this and don't change things, and so trust transfers, but one mistake and you lose trust. It takes a lot to build back trust, but a few years of proving worthy of trust and it starts to come back. If your competitors violate trust in the meantime, customers are more likely to risk you, and if you prove trustworthy, the customers are likely to stay.

There are other factors than trust as well - the US government really wants Intel fabs to take off and they may be applying pressure that we are not aware of. It could well be that Apple is willing to risk Intel because the US government will buy a lot of Macs/iPhones, but only if the CPU is made in the US. (This would be a smart thing for the US to do for geopolitical reasons.)


Then why are they switching from Sony to Samsung for custom camera sensors for the next iPhone?

Why do they keep using Samsung for their customized screens despite LG and Chinese competitors being competitive?


Does Apple spend R&D on iPhone screens like they do Apple Silicon? What's that got to do with what we're talking about regarding iPhone's core IP (Apple's own chip, the most important IP from Apple)?

Apple has run micro LED development for several years

> Does Apple spend R&D on iPhone screens like they do Apple Silicon

yes

> What's that got to do with what we're talking about regarding iPhone's core IP

The iPhone's core IP is iOS.

Collaboration on display and camera development leaks major future milestones. Far more consumers care about cameras and displays than the CPU. Just like the camera and display, the CPU IP is also protected by patents.


wait til you find out who supplies iPhone screens.

Does Apple spend R&D on iPhone screens like they do Apple Silicon? What's that got to do with what we're talking about regarding iPhone's core IP (Apple's own chip, the most important IP from Apple)?

Apple owns a few patents on micro LED displays. Those look like R&D to my untrained eye.

https://www.ledinside.com/node/31822


Apple is the company that just over 10 years ago made a strategic move to remove Intel from their supply chain by purchasing a semiconductor firm and licensing ARM. Managing 'painful' transitions is a core competency of theirs.

I think you’re correct that they’re good at just ripping the band-aid off, but the details seem off. AFAIK, Apple has always had a license with ARM and a very unique one since they were one of the initial investors when it was spun out from Acorn. In fact, my understanding is that Apple is the one that insisted they call themselves Advanced RISC Machines Ltd. because they did not want Acorn (a competitor) in the name of a company they were investing in.

Correct, from the ARM Wikipedia entry:

The new Apple–ARM work would eventually evolve into the ARM6, first released in early 1992. Apple used the ARM6-based ARM610 as the basis for their Apple Newton PDA.


Doesn't Apple have an ARM "Architectural License" arising from being one of the original founding firms behind ARM, which they helped create back in the 90s for the Apple Newton? That license allows them to design their own ARM-compatible chips. The companies they bought more recently gave them the talent to use their existing license, but they always had the right to design their own chips.

Which acquisition are you referring to? Apple bought PA Semi in 2008 and Intrinsity in 2010.

PA Semi; Intrinsity wasn't front of mind for me. My point is, Apple has proven they can buy their way into vertical integration. Let's look at the history:

68K -> PowerPC, practically seamless

Mac OS 9 -> BSD / OS X with excellent backward compatibility

PowerPC -> x86

x86 -> ARM

Each major transition bit off orders of magnitude more integration complexity. Looking at this continuum, the next logical vertical integration step for Apple is fabrication. The only question in my mind is whether Tim has the guts to take that risk.


Not all of Apple's chips need to be fabbed at the smallest size; those could certainly go elsewhere. I’m sure they already do.

Is there anyone who can match TSMC at this point for the top of the line M or A chips? Even if Intel was ready and Apple wanted to would they be able to supply even 10% of what Apple needs for the yearly iPhone supply?


    > Not all of Apple's chips need to be fabbed at the smallest size; those could certainly go elsewhere.
When I saw that TSMC continues to run old fabs, I immediately thought about this idea. I am sure when Apple is designing various chips for their products, they design for a specific node based on available capacity. Not all chips need to be the smallest node size.

Another thing: I am seeing a bunch of comments here alluding to Apple changing fabs. While I am not an expert, it is surely much harder than people understand. The precise process of how transistors are made is different in each fab. I highly doubt it is trivial to change fabs.


My understanding, and I’m a layman, is it basically requires making new masks. And that’s not trivial.

I guess you’d be doing that anyway with a brand new chip. But still probably easier to work with the tools/fab you know well.

I suppose you’d have to do it just switching nodes at TSMC. Which is why the A13 (or whatever) probably never moves to smaller nodes.

Sometimes Apple updates the chip in a product that doesn’t seem to need it, like the AppleTV. I wonder if it’s because the old node is going away and it’s easier to just use a newer chip that was designed for the newer node.


Apple knows first hand how difficult and expensive it is to fire your CEO, I mean chip fab, only to rehire them when it's clear that decision didn't pan out.

I would imagine they could split their orders between different fabricators; they can put in orders for the most cutting edge chips for the latest Macs and iPhones at TSMC and go elsewhere for less cutting edge chips?

Presumably they already do that (since non-cutting-edge chip fab is likely to be more competitive and less expensive). So, given they are already doing that, this problem refers to the cutting-edge allocations, which are getting scarce, as exemplified at least by Nvidia's growth.

I think this misses a key point. TSMC is leading edge. When Apple switched, they were leading edge for pure play, but not far ahead of Samsung and certainly behind Intel. Now not only is TSMC the best, it is also the largest. Which means Apple doesn't have a choice.

In the old days the leverage was that without Apple, no one was willing to pay for leading-edge foundry development, at least not enough money to make it happen compared to Apple. Now it is different. The demand for AI means plenty of money to go around. And Nvidia is the one to beat, not Apple any more. The good thing for Apple is that as long as Nvidia continues to grow, the burden can be split between them. No more relying on a single vendor to push.


It's ridiculous that a trillion dollar company feels beholden to a supplier. With that kind of money, it should be trivial to switch. People forget Nvidia didn't even exist 35 years ago. It would probably take like 3 to 5 years to catch up with the benefit of hindsight and existing talent and tools?

And anyway consumers don't really need beefy devices nowadays. Running local LLM on a smartphone is a terrible idea due to battery life and no graphics card; AI is going to be running on servers for quite some time if not forever.

It's almost as if there is a constant war to suppress engineer wages... That's the only variable being affected here which could benefit from increased competition.

If tech sector is so anti-competitive, the government should just seize it and nationalize it. It's not capitalism when these megacorps put all this superficial pressure but end up making deals all the time. We need more competition, no deals! If they don't have competition, might as well have communism.


There is a big waiting list for fab tools. You can't just spin that up out of nowhere. Modern chip fabs are the most complex things ever created, and by the time you've spun up your own fab, supply and demand will have balanced out.

Also, how is nationalizing something pro-competition? Nationalized companies have a history of using their government connections to squash competition.


It can be interpreted a different way too. Apple is just a channel for TSMC's technology. Also, the cost to build a fab that advanced, on say a 3-year horizon, let alone immediately available, is not one even Apple can commit to without cannibalising its core business.

I know you are maybe joking but I don't think the government nationalizing the tech sector would be a good idea. They can pull down the salaries even more if they want. It can become a dead end job with you stuck on archaic technology from older systems.

Government jobs should only be an option if there are enough social benefits.


I'm joking yes but as an engineer who has seen the bureaucracy in most big tech companies, the joke is getting less funny over time.

I've met many software engineers who call themselves communists. I can kind of understand. This kind of communist-like bureaucracy doesn't work well in a capitalist environment.

It's painful to work in tech. It's like our hands are tied and are forced to do things in a way we know is inefficient. Companies use 'security' as an excuse to restrict options (tools and platforms), treat engineers as replaceable cogs as an alternative to trusting them to do their job properly... And the companies harvest what they sow. They get reliable cogs, well versed in compliance and groupthink and also coincidentally full-blown communists; they're the only engineers remaining who actually enjoy the insane bureaucracy and the social climbing opportunities it represents given the lack of talent.


I understand completely.

I'm going through a computer engineering degree at the moment, but I am thinking about pursuing Law later on.

Looking at other paths: medicine requires expensive schooling and isn't really an option after a certain age. Law, on the other hand, opened its doors too widely and now has a large underclass of people with third-tier law degrees.

Perhaps you can try to accept the realities of the system while trying to live the best life that you can?

Psyching yourself all the way, trying to find some sort of escape towards a good life with freedom later on...


Maybe consider patent law? I have a friend who worked for the patent office, and the patent office paid for their law school. Now they’re a patent attorney and doing quite well.

Nice advice. I was also considering something to do with cybercrimes, leveraging the initial degree, but your advice got me thinking!

Sounds like you should just leave the company if you are that unhappy

Bruh, with some very rare exceptions like Valve, every company is run as a dictatorship or oligarchy. That goes beyond tech; hell, big tech at least gives some agency to their engineers.

The only way you don’t need to be versed in compliance or group think at a US firm as an employee is to either be

1) independently wealthy, so your job is a hobby you can walk away from

2) have some leverage on a currently in demand skill, but the second that leverage evaporates they will demand the compliance

Also I realized I undersold it, they aren’t just run as dictatorships/oligarchies, they are usually run as command economies as well.

The whole capitalist competition style behavior only happens with inter firm interactions, not internal ones


Find a small company with a founder who loves their team and wants them to be happy. They exist, I assure you. They're not even rare.

I spent most of my career working in companies with <50 employees, and only hit a couple of unpleasant founders. The few large companies that I worked in were always bureaucratic nightmares by comparison.

Small companies won't pay FAANG salaries, but they also won't make you feel like a meaningless cog in a vast unsympathetic, unproductive, machine.


> I spent most of my career working in companies with <50 employees

I’ve worked for 3 companies like that. It was really great if your views aligned with the founder. If they didn’t, you got fucked.

I really enjoyed when a bunch of juniors were fired the day before Christmas because the founder heard them discussing the latest movies they watched and decided that they had bad opinions and shouldn’t work at his company since he’d be embarrassed if his peers heard their tastes. Not hyperbole, direct statements. We referred to it as the Red Christmas at the time.

I believe you got lucky, I don’t find your advice actionable.


I've had a couple of experiences like yours, yeah, it can be a matter of finding the right founder.

I'm sorry you don't find it actionable. Please continue doing whatever you're doing now that is working for you.


>Please continue doing whatever you're doing now that is working for you

Lol.

It doesn't work out because I don't have leverage, and tried to stand up for what I believe in. I also don't believe it would work for you unless you had views that aligned with the current oligarchical leadership that the entire US industry is operating under.

If you only have a good time when you found the "right" founder, because they will and are capable of harming your career or income when you disagree with them, and the law does effectively nothing to protect you from their ego driven tantrums, then you are a serf at best.

I'd agree with you if it was relatively common that employees who had differences of opinion with founders of companies weren't forced out, but that is not my experience.

I do not find contentment out of accepting that some assholes are my Betters because they have more money than me.


What is odd to me is hearing people talk as if somehow a job is supposed to be intrinsically enjoyable or enriching. Paid labor is and always has been a subservient role that pays exactly the minimum that the market allows for the circumstances.

Labor is the next option above slavery and indenture, and now that slavery and indenture are frowned upon, labor has absorbed that space as well.

If you want to have some control of your environment and destiny, you must be an independent agent, a contractor, entrepreneur, or consultant. A tradesman. You have special skills and expertise, your own tools, and a portfolio of masterpieces at the least.

There is nothing new in this space of human endeavour, it is as it has been, and I suspect will continue to be, for better or for worse. Sacrificing your agency for subservience is going to make you feel at the mercy of your “betters”. If you don’t want that, don’t do that. Labor law and other conventions have made it a little better, but the fundamental relationship is still master and servant.


> Labor is the next option above slavery and indenture, and now that slavery and indenture are frowned upon, labor has absorbed that space as well.

If we go down this path, what can I say that doesn’t get my account banned and my speech suppressed for what I would suggest doing to people with your opinion?


We don’t have to go down that path, it’s the path we’re already on.

It’s not the way I think it -should be- but it is the way that it is. The incentive alignment keeps it at that local minimum, and every attempt to move it to a new one so far has introduced so many perverse incentives that it ultimately causes the regression or even complete failure of the economies it is implemented in.

I don’t know what the answer is that maximises human happiness and minimises human misery, but I suspect it lies well outside of the paradigm of conventional market economics.

Within the dominant paradigm, it’s all a matter of risk management. With employment, you are paying your employer with your surplus value to handle the risks that you feel powerless to manage. Market risks, capital risks.

In exchange, you accept risks that your opinions and comfort won’t be prioritised, and in some cases even your physical well being.

In effect, you are betting against yourself being able to balance those risks against the risks posed by pursuing profitability.

The ability to manage risks is intersectional with your ability to manage discomfort and privation. When you run out of money, the house wins by default.

That’s why the foundational step for anyone should be to do whatever they must to obtain a safe fallback position. A place to be. A safety net. This is what enables risk accommodation. Without taking risk, there will be no advancement. If you don’t have a fallback plan, a safe spawn point, do everything in your power to create one, at least for your children.


> a bunch of juniors were fired the day before Christmas because the founder heard them discussing the latest movies they watched and decided that they had bad opinions and shouldn’t work at his company since he’d be embarrassed if his peers heard their tastes.

Just out of curiosity, was it something despicable like them liking Marvel movies? Or more akin to disagreeing whether Eyes Wide Shut could be considered a good Kubrick movie?


Serious question, why watch this movie by Kubrick when you have way more interesting guys like Pier Paolo Pasolini?

If you want to see weird sexual pictures, might as well go all the way with "120 days of Sodom".

Or just go and see one of those documentaries of serial killers from the 70's, like Ted Bundy.


> It would probably take like 3 to 5 years to catch up with the benefit of hindsight and existing talent and tools?

Are you talking about TSMC? Because that is a single, albeit primary, node in a supply chain, and it's the whole chain you have to replicate. ASML is another vital node.

So many people with "it's just a factory, how hard can it be". The answer is "VERY", as a few endeavours have found out already - and they will probably find out even at TSMC Arizona.

I shall illustrate with Adrian Thompson's 1996 FPGA experiment at the University of Sussex.

Thompson used a genetic algorithm to evolve a circuit on an FPGA. The task was simple: get it to distinguish between a 1kHz tone and a 10kHz tone using only 100 logic gates and no system clock.

After about 4,000 generations of evolution, the chip could reliably do it, but the final program did not work reliably when it was loaded onto other FPGAs of the same type.

When Thompson looked inside at what evolved, he found something baffling:

The plucky chip was utilizing only thirty-seven of its one hundred logic gates, and most of them were arranged in a curious collection of feedback loops. Five individual logic cells were functionally disconnected from the rest - with no pathways that would allow them to influence the output - yet when he disabled any one of them the chip lost its ability to discriminate the tones.

Welcome to building semiconductors.

https://www.damninteresting.com/on-the-origin-of-circuits/
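
The evolutionary loop itself is easy to sketch; all the weird work happened in the fitness evaluation on the physical chip. A toy sketch in Python - every number and the fitness function here are invented for illustration, since Thompson's real setup scored each genome by loading it onto the actual FPGA and measuring its output:

    import random

    GENOME_BITS = 1800   # invented size for the FPGA config bitstring
    POP_SIZE = 50        # invented population size
    MUT_RATE = 1.0 / GENOME_BITS

    def random_genome():
        return [random.randint(0, 1) for _ in range(GENOME_BITS)]

    def fitness(genome):
        # Stand-in only: the real experiment loaded the genome onto the
        # FPGA and measured how well its output separated 1kHz from 10kHz.
        return sum(genome) / GENOME_BITS

    def mutate(genome):
        return [b ^ 1 if random.random() < MUT_RATE else b for b in genome]

    def crossover(a, b):
        cut = random.randrange(GENOME_BITS)
        return a[:cut] + b[cut:]

    population = [random_genome() for _ in range(POP_SIZE)]
    for generation in range(100):   # Thompson ran ~4,000 generations
        ranked = sorted(population, key=fitness, reverse=True)
        elite = ranked[: POP_SIZE // 5]   # keep the top 20%
        population = elite + [
            mutate(crossover(random.choice(elite), random.choice(elite)))
            for _ in range(POP_SIZE - len(elite))
        ]

    best = max(population, key=fitness)

Nothing in that loop knows about gates, clocks or physics - which is exactly why evolution was free to exploit undocumented analog behaviour of that one specific chip.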


Do you mean ASML?

There was a great video recently on the company + techniques used for cutting-edge lithography.

https://www.youtube.com/watch?v=MiUHjLxm3V0


ha, yes I did. - luckily still in the edit window

I was expecting an Asianometry video from your link

https://www.youtube.com/@Asianometry

Pure silicon crystals for the wafer come from another very specialist supplier you can't just decide to become - your local gravity will probably have an effect you need to tune for.


>If tech sector is so anti-competitive, the government should just seize it and nationalize it.

Trump is using his DOJ to probe Jerome Powell with a bogus lawsuit because the Fed won't lower rates on demand.

An independent Fed is the most important body for the USA. Lowering rates should be based on facts, not dictated by some bankrupt casino CEO. And now you want our government to nationalize the tech sector?


I don't support nationalizing the tech sector, but I believe the reason we have Trump in the first place is because our government refused to nationalize health care.

>On the other hand, TSMC knows that changing fabs is not really an option and Apple doesn't want to do it anyway, so they have leverage to squeeze.

They're Apple. If TSMC fucks around too much, they might just start working towards building their own fab.


Apple loaned TSMC money in order to build manufacturing capacity back around the M1 era. They’ve done that for a number of suppliers and the “interest payments” were priority access to capacity. Everyone was complaining about how Apple got ARM chips while others had to wait in line.

That said, they did that for a sapphire glass supplier for the Apple Watch and when their machines had QC problems they dropped them like a rock and went back to Corning.

But is that really any different from any other supplier? And who tf do you think they’re going to drop TSMC for right now? They are the cock of the walk.


> And who tf do you think they’re going to drop TSMC for right now?

Don't look now: https://www.macrumors.com/2025/11/28/intel-rumored-to-supply...


If Apple cares about their chip IPs, it will be very hard to trust Intel given Intel's past behavior with others like AMD.

If Apple cares about their Softbank investment, the best possible outcome is that Intel copies their IP wholesale. Arm's white whale is Intel buying an architectural license, which they have zero incentive to do unless someone gives them an off-the-shelf core design that doesn't suck.

The modern Cortex and Neoverse designs are so pathetic that RISC-V might mature by the time ARM is the industry standard. And the smaller ARM IP hasn't been profitable since China mass-produced the clones. Courting Intel into buying an architectural license with a free IP bonus is a legitimately smart move for ARM's longevity, from Apple's POV.


According to benchmarks latest ARM Cortex designs and Qualcomm Snapdragon designs are as performant as Apple's.

About 17 years ago I worked at a company that was clamoring to get products into Costco; when we did, I was shocked at the fees they charged us for returns. If they're the gold standard for supplier relations, it's a wonder anyone bothers being a supplier.

You were shocked that they didn't absorb the cost of your shipping mistakes?

Why would you assume that Costco returns are due to supplier mistakes?

Costco are legendarily permissive with returns, to the extent of things like accepting bare stick-like xmas trees back after xmas and giving a full refund, but ultimately this is to their advantage in encouraging mindless consumerism (which is also the general American model - no-question-no-fault returns are generally an American thing, not a worldwide one).

Now, a liberal return policy may work out for Costco, and Costco is obviously a high volume hence desirable customer for a supplier, but if Costco is pushing much of the cost of returns back to the supplier, that does change the picture a bit!


Those returned trees don't get sent back to the supplier; they get deducted from a pre-negotiated spoil allowance, which is something separate. The supplier returns will be things like badly stacked pallets.

Counterargument: is NVIDIA friendly to their supply chain? I have to think that maybe they are, with their massive margins, because they can be - their end buyers are currently willing to absorb costs at any expense. But I don't know, and that will change as their business changes.

Your underlying statement implies that whoever is replacing Apple is a better buyer, which I don't think is necessarily true.


Nvidia is famously a pain to work with. Apple vowed never to use their chips, Microsoft and Sony can't get them to make any GPU for their consoles.

The only complete package integrator that manages to make a relationship work with Nvidia is Nintendo.


> The only complete package integrator that manages to make a relationship work with Nvidia is Nintendo.

And that's probably because Nintendo isn't adding any pressure to either TSMC or Nvidia capacity-wise; IIRC Nintendo uses something like Maxwell or Pascal on really mature processes for Switch chips/SoCs.


And also the Switch 1 was just the hardware for a nvidia shield tablet from nVidia’s perspective, without the downside of managing the customer facing side and with the greater volume from Nintendo’s market reach. (Not that it wasn’t more than that for consumers or Nintendo, just talking nvidia here)

EVGA outright gave up on selling GPUs rather than continue working with NVidia.

> Apple vowed never to use their chips

I thought that was mainly due to bad thermals. I always got the impression that (like Intel) Nvidia only cared about performance, and damn the power consumption.


Nvidia sold defective GPUs that affected every 2007-2008 MacBook Pro. It was a manufacturing defect, and every chip was guaranteed to fail. It was a bad look for Apple that cost them millions having to replace logic boards. The defect wasn't corrected for several years, leading to some people having multiple logic board replacements.

https://blog.greggant.com/posts/2021/10/13/apple-vs-nvidia-w...


They, and everyone at the time, were kind of forced to switch to lead-free solder by RoHS. At that point, there probably hadn't yet been tests showing the results of constant thermal cycling, so the brittling effect was unknown. Apple was particularly affected as an early adopter because of their PR stance on environmental issues.

Refusing to acknowledge anything was wrong was the real problem. But that's just a reminder that companies don't care about you. Brand loyalty is a quagmire.


Nvidia refused to honour a gentleman's agreement that they were on the hook for recall issues with their GPUs. Steve Jobs didn't like that. One bit.

I think that works out tremendously well for Nintendo, especially when you look at the Wii-U vs the Switch.

I shot a video at CNET in probably 2011 which was a single touchscreen display (I think it was the APX 2500 prototype, IIRC?) and it had precisely the dimensions of the Switch 1.

Nintendo was reluctantly a hardware company... they're a game company who can make hardware, but they know they're best when they own the stack.


> EVGA Terminates Relationship With Nvidia, Leaves GPU Business

> According to Han, Nvidia has been difficult to work with for some time now. Like all other GPU board partners, EVGA is only told the price of new products when they're revealed to everyone on stage, making planning difficult when launches occur soon after. Nvidia also has tight control over the pricing of GPUs, limiting what partners can do to differentiate themselves in a competitive market.

https://www.gamespot.com/articles/evga-terminates-relationsh...


So not favorable to Apple as a buyer.

Funnily enough, Apple and Nvidia have old beef with one another; this especially led them to sever ties:

https://www.semiaccurate.com/2010/07/11/investigation-confir...


If your customers are known to be antagonistic to business partners, the correct answer is to diversify them as much as you can, even at reasonable cost to everything else.

That means deprioritizing your largest customer.


Fair. I feel like that also speaks to nation-states' trade policy.

Also there's the devil you know and the devil you don't know.


Yep, you can be close allies with a nation and have many shared interests, and even a trade deficit with them as we in Britain did, and then they stab you in the back with tariffs.

At these scales everyone is antagonistic, it comes with the territory.

That's what MBA schools teach you.

That's also a lie, it's only antagonistic when one of the sides is controlled by a psychopathic asshole, and it being antagonistic is a serious drag for the gains of both sides.


What company at these scales isn’t run by a psychopathic asshole? It also comes with the territory.

Even if Apple isn't very good at reciprocating faithful service from its suppliers, there's also the matter of how it treats suppliers who cause it problems instead.

Costco does not treat their suppliers well.

Do you have a source for this? Most information I’ve seen around this (e.g. Acquired podcast, from the Costco side) claims strong positive relationships.

Agreed, TSMC can do whatever they want. In 2027 no other fabs will match what TSMC has today; anything that requires the latest process node is going to get more expensive - so your Apple silicon and your AMD chips.

As of today Intel is very near the leading node.

I'll believe it when I see it (at scale). I hope 18A is good enough as competition is good, and a weak Intel is bad for us all.

PTL is already released, on shelves in like 2 weeks.

Yes, and it's looking promising, but one mobile processor does not prove a node's success at scale.

It definitely implies it though, I’m hopeful that competition is back.


Damn, you would think it would be priced in…

Yield is more important than the leading node.

Both are important

Yes and no. Sunk cost.

They are always balls deep. Even if it takes them 2 years to get to TSMC yields, with as much demand as exists for high-end fabs, they could easily get financing to build even more capacity.

Now they have literally the US government as an investor.

One would be naive to believe that they wouldn't get at least a few hundred billion dollars to scale it up given the so many risks involved in most of US tech sector being dependent on Taiwan.


They also back-stab their business "partners."

Suppliers really hate working with Costco. They're slow to pay, allow for only small margins, and often need too high a percentage of a business's revenue, all of which is not friendly towards suppliers.

Not true at all. Costco uses the industry-standard Net 60 for supplier payment.

Companies have to be fairly large to be Costco suppliers. What suppliers lose in margin they more than make up for in scale. It's better to sell 10 million at 5% margin than 1 million at 10% margin (at the same unit price, that's five times the profit).

And they don't require a % of supplier's business revenue as that would be illegal in the U.S. Most of the products found at Costco are generally found at other retailers, just in smaller packages or as different SKUs.


No public company will be loyal or nice to their suppliers. That is just not in the playbook for public companies. They have "fiduciary duty", not human duty.

Private companies can be nice to their suppliers. Owners can choose to stay loyal to suppliers they went to high school with, even if it isn't the most cost-efficient.


> they arguably treat their suppliers relatively poorly compared to a gold standard like Costco.

I’m not saying you’re wrong, but your previous paragraph sounded like you were wondering if it was the case vs. here you’re saying it’s known. Is this all true? Do they have a reputation for hammering their suppliers?


Apple is so notoriously ravenous for profit margin that they can’t not be that way.

It felt like a more confident statement and I was legitimately asking. I have little love for Apple. Ditched my Mac Studio earlier this year for a Linux-only build after 20 years of being on Macs. I say this because I think folks think I am trying to sealion/"just ask questions"™ or some nonsense, when I am legitimately asking if this is a documented practice and what the extent is. I am not finding it easy to find info on this.

I imagine it is like becoming a supplier for McDonalds.

The penalties for not delivering on timelines and production goals, and the scale being requested can mean substantial changes to your business. I remember a friend whose company was in talks with Apple telling me that there was some sense of relief when the deal fell through, just because of how much stress and risk and change the deal would entail.

However, a missing component could put tens of billions of dollars of revenue on the line for Apple. It is easier to say that any supplier Apple picks has to then quickly grow to the scale and process needed - and failing to do that successfully could very well be a fatal slip for the supplier.

Even in the iPod days, Apple often would invest in building out the additional capacity (factories) to meet their projected demand, and have a period of exclusivity as well. This meant that as MP3 player demand scaled up, they also wound up locking up production for the micro HDD and flash ram that competitors would need.


Apple dealt exclusively with Chinese labor prices until they were directly threatened by the POTUS. You tell me.

I got a bridge to sell you if you think that Apple is going to bring any of their manufacturing to the US...

I've seen the leaked BOMs, I'm not dumb enough to think that Americans can match it.

Where do you find the leaked BOMs?

RedNote usually, before it's deleted.

https://www.bbc.com/news/articles/c86jx18y9e2o

Apple has responded and has started moving a lot of manufacturing out of China. It just makes sense for risk management.


From your article:

> Meanwhile, Vietnam will be the chief manufacturing hub "for almost all iPad, Mac, Apple Watch and AirPods product sold in the US".

> We do expect the majority of iPhones sold in US will have India as their country of origin," Mr Cook said.

Still not made in the US and no plan to change that. They will be selling products made in India/Vietnam domestically and products made in China internationally.

The tariffs are not bringing these jobs home.


Well, from your article:

> China will remain the country of origin for the vast majority of total products sold outside the US, he added.

And international sales are a solid majority of Apple's revenue.


It would be a $6000 phone if they built it in America.

Would be interesting to know if it really would or not. Especially relative to their margins.

That depends on too many factors. Moving all production to the US would greatly reduce that price, since it costs a lot of money to set up a factory, but you amortize that over everything it produces. I don't know how the iPhone is produced in China, but I have to believe it is highly automated as well. However, moving a factory takes months (at best; China may not allow exporting it at all), and in those months Apple wouldn't be making any iPhones, so to do production in the US requires building an all-new factory, which is going to be expensive.

You can buy modern CPUs made in Iowa - at about $60,000 each. You can buy one from an Intel fab (I'm not sure where they are) for under $1000 that is likely better. The Iowa-made CPU would be a one-off made under license from Intel. The companies that do this made just enough to prove they can, in case Intel fabs are bombed. (I assume this means that you can't actually buy such a CPU if you tried, but they do make them, and that is about the cost they would have to charge to break even.)


I tend to agree with you, feels to me like the root of this is essentially whether foundries will "go all in" on AI like the rest of the S&P 500. But why trade away one trillion-dollar customer for another trillion-dollar customer if the first one is never going away, and the second one might?

I think it is less of a trade and more of a symbiotic capital cycle, if I can call it that?

Nvidia's willingness to pay exorbitant prices for early 2nm wafers subsidizes the R&D and the brutal yield-learning curve for the entire node. But you can't run a sustainable gigafab solely on GPUs...the defect density math is too punishing. You need a high-volume, smaller-die customer (Apple) to come in 18 months later, soak up the remaining 90% of capacity and amortize that depreciation schedule over a decade.
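
To put rough numbers on "too punishing": the classic Poisson die-yield model has yield falling exponentially with die area, Y = e^(-D*A). A quick sketch with invented defect densities and die sizes (not TSMC's real numbers), just to show the shape:

    import math

    def die_yield(defect_density, die_area_cm2):
        # Classic Poisson yield model: Y = e^(-D * A).
        # Fabs use fancier models (Murphy, negative binomial),
        # but the exponential shape is the point here.
        return math.exp(-defect_density * die_area_cm2)

    early_d0, mature_d0 = 0.5, 0.1   # defects/cm^2, invented: early vs mature node
    dies = [("phone SoC", 1.0), ("reticle-limit GPU", 8.0)]   # die areas in cm^2

    for name, area in dies:
        print(f"{name}: early node {die_yield(early_d0, area):.0%}, "
              f"mature node {die_yield(mature_d0, area):.0%}")

With those made-up inputs the phone SoC goes from ~61% to ~90% yield as the node matures, while the reticle-sized die goes from ~2% to ~45% - the small-die customer is the one who can actually fill the fab profitably for the following decade.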


Isn’t the smaller die aspect more valuable early in the node’s maturity, where defects are less punishing?

That is the traditional textbook yield curve logic, if I'm not wrong? Smaller area = higher probability of a surviving die on a dirty wafer. But I wonder if the sheer margin on AI silicon basically breaks that rule? If Nvidia can sell a reticle-sized package for 25k-30k USD, they might be perfectly happy paying for a wafer that only yields 30-40% good dies.

Apple OTOH operates at consumer electronics price points. They need mature yields (>90%) to make the unit economics of an iPhone work. There's also the binning factor I am curious about. Nvidia can disable 10% of the cores on a defective GPU and sell it as a lower SKU. Does Apple have that same flexibility with a mobile SoC where the thermal or power envelope is so tightly coupled to the battery size?
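
And on the binning question, a toy model of the salvage lever (per-core defect probability and SKU rules entirely invented):

    import random

    CORES = 40          # invented: GPU cores on the die
    P_CORE_BAD = 0.03   # invented: chance any single core is defective
    MAX_DISABLED = 4    # invented: cores the cut-down SKU is allowed to lose

    def bin_die():
        bad = sum(random.random() < P_CORE_BAD for _ in range(CORES))
        if bad == 0:
            return "full SKU"
        if bad <= MAX_DISABLED:
            return "cut-down SKU"   # disable the bad cores, sell it anyway
        return "scrap"

    trials = 100_000
    counts = {}
    for _ in range(trials):
        sku = bin_die()
        counts[sku] = counts.get(sku, 0) + 1

    for sku in ("full SKU", "cut-down SKU", "scrap"):
        print(f"{sku}: {counts.get(sku, 0) / trials:.1%}")

With these invented numbers only ~30% of dies come out fully clean, yet well under 1% end up as scrap - binning converts most of the defect loss into a cheaper SKU instead of a write-off.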


I am curious about the binning factor too since in the past, AMD and Intel have both made use of defect binning to still sell usable chips by disabling cores. Perhaps Apple is able to do the same with their SoCs? It's not likely to be as granular as Nvidia who can disable much smaller areas of the silicon for each of their cores. On the other hand, the specifics of the silicon and the layout of the individual cores, not to mention the spread of defects over the die might mitigate that advantage.

They do bin their chips. Across the range (A- and M-series) they have the same chip with fewer / disabled CPU and GPU cores. You pay a premium for ones with more cores. Unsure about the chip frequencies - Apple doesn't disclose those openly from what I know.

> They need mature yields (>90%) to make the unit economics of an iPhone work.

Sauce on the number?

iPhones are luxury goods with margins nowhere near typical for consumer electronics. Apple can easily stomach some short term price hikes / yield drops.


    > They need mature yields (>90%) to make the unit economics of an iPhone work.
Can you share how you know this information? >90% seems very specific.

I thought they binned CPUs for things like AppleTV and lower cost iPads?

Yeah, most of their chips have two or more bins with different core configs, and the lower bins probably use salvaged dies.

For example the regular M4 can have 4 P-cores / 6 E-cores / 10 GPU cores, or 3/6/10 cores, or 4/4/8 cores, depending on the device.

They even do it on the smaller A-series chips - the A15 could be 2/4/5, 2/4/4, or 2/3/5.


With current AI pricing for silicon, I think the math’s gone out the window.

For Apple, they have binning flexibility, with Pro/Max/Ultra, all the way down to iPads - and that’s after the node yields have been improved via the gazillion iPhone SoC dies.

NVIDIA's flexibility came from using some of those binned dies for GeForce cards, but the VRAM situation is clearly making that less important, as they're cutting some of those SKUs for being too VRAM-heavy relative to MSRP.


> For Apple, they have binning flexibility, with Pro/Max/Ultra, all the way down to iPads

The Pro and Max chips are different dies, and the Ultra currently isn't even the same generation as the Max. And the iPads have never used any of those larger dies.

> NVIDIAs flexibility came from using some of those binned dies for GeForce cards

NVIDIA's datacenter chips don't even have display outputs, and have little to no fixed-function graphics hardware (raster and raytracing units), and entirely different memory PHYs (none of NVIDIA's consumer cards have ever used HBM).


They’re binning within those product lines - both NVIDIA and Apple.

Not binning an M4 Max for an iPhone, but an M4 Pro with a few GPU or CPU cores disabled is clearly a thing.

Same for NVIDIA. The 4080 is a 4090 die with some SMs disabled.


> The 4080 is a 4090 die with some SMs disabled.

The desktop 4090 uses the AD102 die, the laptop 4090 and desktop 4080 use the AD103 die, and the laptop 4080 uses the AD104 die. I'm not at all denying that binning is a thing, but you and other commenters are exaggerating the extent of it and underestimating how many separate dies are designed to span a wide product line like GPUs or Apple's computers/tablets/phones.


There are levels inside pro, max, and ultra that might be the product of binning?

"Ultra" isn't even binned - it's just 2x "Max" chips connected together.

Otherwise, yes, if a chip doesn't make M4 Max, it can make M4 Pro. If not, M4. If not, A18 Pro. If not that, A18.

And even all of the above mentioned marketing names come in different core configurations. M4 Max can be 14 CPU Cores / 32 GPU cores, and it can also be 16 CPU cores and 40 GPU cores.

So yeah, I'd agree that Apple has _extreme_ binning flexibility. It's likely also the reason why we got A19 / A19 Pro / M5 first, and we still don't have M5 Pro or M5 Max yet. Yields not high enough for M5 Max yet.

Unfortunately I don't think they bin down even lower (say, to S chips used in Apple Watches), but maybe in the future they will.

In retrospect, Apple ditching Intel was truly a gamechanging move. They didn't even have to troll everyone by putting an Intel i9 into a chassis that couldn't even cool an i7 to boost the comparison figures, but I guess they had to hedge their bet.


> yes, if a chip doesn't make M4 Max, it can make M4 Pro. If not, M4. If not, A18 Pro. If not that, A18.

No, that's entirely wrong. All of those are different dies. The larger chips wouldn't even fit in phones, or most iPad motherboards, and I'm not sure a M4 Max or M4 Pro SoC package could even fit in a MacBook Air.

As a general rule, if you think a company might ever be selling a piece of silicon with more than half of it disabled, you're probably wrong and need to re-check your facts and assumptions.


No, I think you have it wrong.

There are two levels of Max chip, but think of a Max as two Pros on a die (this is a simplification; you can also think of a Pro as being two normal chips tied together), so a bad Max can't be binned into a Pro. But a high-spec Max can be binned into a low-spec Max.


Datacenter GPU dies cannot be binned for Geforce because they lack fixed function graphics features. Raytracing acceleration in particular must be non-trivial area that you wouldn't want to spend on a datacenter die. Not to mention the data fabric is probably pretty different.

I’m not saying they're binning between data center and 3060s, but within gaming, and between gaming and RTX Pro cards, there's binning.

As you cut SMs from a die you move from the 3090 down the stack, for instance. That’s yield management right there.


The A40, L40S and Blackwell 6000 Pro Server have RT cores. 3 datacenter GPUs.

If you want binning in action, the RTX ones other than the top ones, are it. Look for the A30 too, of which I was surprised there was no successor. Either they had better yields on Hopper or they didn't get enough from the A30...


Are Apple's profit margins lower than Nvidia's?

Why are foundries going 'all in' on AI? They fab chips for customers; it doesn't matter what chips they are and who the customer is. 'Who will pay the most for us to make their chips first' is the only question TSMC will be asking. The market of the customer is irrelevant.

AI capex may or may not flatten in the near future (and I don't necessarily see a reason why it would). But smartphone capex already has.

Like smartphones, AI chips also have a replacement cycle. AI chips depreciate quickly -- not because the old ones go bad, but because the new ones are so much better in performance and efficiency than the previous generation. While smartphones aren't making huge leaps every year like they used to, AI chips still are -- meaning there's a stronger incentive to upgrade every cycle for these chips than smartphone processors.


> AI chips depreciate quickly -- not because the old ones go bad

I've heard that it's exactly that, reports of them burning out every 2-3 years. Haven't seen any hard numbers though.


Lifetime curve is something they can control. If they can predict replacement rate, makes sense to make chips go bad on the same schedule, saving on manufacturing costs.

> the smartphone replacement cycle is the only predictable cash flow

people are holding onto their phones for longer: https://www.cnbc.com/2025/11/23/how-device-hoarding-by-ameri...


Still more predictable than GPU buys in the current climate. Power connector melting aside, GPUs in most cases get replaced less frequently than cell phones, unless of course you have lots of capital/profit infusion to, for whatever reason, stay ahead of the game.

Heck, if Apple wanted to be super cheeky, they could probably still pivot on the reserved capacity to do something useful (e.g. a revised older design for whatever node they reserved, where they can get more chips per wafer for cheaper models).

NVDA on the other hand is burning a lot of good-will in their consumer space, and if a competitor somehow is able to outdo them it could be catastrophic.


Yea, it’s anecdata, but I only replaced my 1080 ti about 1.5 years ago.

Graphical fidelity is at the point that unless some new technology comes out to take advantage of GPUs, I don’t see myself ever upgrading the part. Only replacing it whenever it dies.

And that 1080 ti isn’t dead either, I passed the rig onto someone who wanted to get into PC gaming and it’s still going strong. I mostly upgraded because I needed more ram and my motherboards empty slots were full of drywall dust.

The phone I’m more liable to upgrade solely due to battery life degradation.


I replaced my 1080 Ti recently too (early 2025). I had kept it as my daily GPU since 2017. It was still viable and not in urgent need of a replacement, even though my 1080 Ti is an AIO liquid cooled model from EVGA, so I'm surprised it hasn't leaked yet. It's been put through a lot of stress from overclocking too, and now it lives on inside a homelab server.

The 5090 I replaced it with has not been entirely worth it. Expensive GPUs for gaming have had more diminishing returns on improving the gaming experience than ever before, at least in my lifetime.


Nvidia have been using TSMC since the Riva 128. That's before Apple started making any of their own silicon. GPUs are easily as predictable as mobile phones.

> GPUs are easily as predictable as mobile phones

They really, absolutely, are not.

It's not about "will there be a new hardware", it's about "is their order quantity predictable"


"Apple is smart. If the AI capex cycle flattens in late '27 as models hit diminishing returns, does Apple regain pricing power simply by being the only customer that can guarantee wafer commits five years out?"

That's the take I would pursue if I were Apple.

A quiet threat of "We buy wafers on consumer demand curves. You’re selling them on venture capital and hype"


Why should that change TSMC's decision making even a little?

The reality is that TSMC has no competition capable of shipping an equivalent product. If AI fizzles out completely, the only way Apple can choose to not use TSMC is if they decide to ship an inferior product.

A world where TSMC drains all the venture capital out of all the AI startups, using NVidia as an intermediary, and then the bubble pops and they all go under is a perfectly happy place for TSMC. In these market conditions they are asking for cash upfront. The worst that can happen is that they overbuild capacity using other people's money that they don't have to pay back, leaving them in an even more dominant position in the crash that follows.


Because Apple can play hard(er) ball in 12 or 18 or 24 months when this (likely) irrational spending spree dies?

Business is a little more nuanced than this audience thinks, and it’s silly to think Apple has no leverage.


Nvidia is not a venture capital outlet. They are a self-sustaining business with several high-margin customers that will buy out their whole product line faster than any iPhone or Mac.

From TSMC's perspective, Apple is the one that needs financial assistance. If they wanted the wafers more than Nvidia, they'd be paying more. But they don't.


> several high-margin customers

This is the "venture capital and hype" being referred to, not Nvidia themselves.


Thanks. I didn't think my comment was super nuanced.

But Nvidia has had high-profile industry partners for decades. Nintendo isn't "venture capital and hype", nor are PC gaming and HPC datacenter workloads.

That line is purified cope.


But Nvidia wasn't able to compete with Apple for capacity on new process nodes with Nintendo volumes (the concept is laughable; compare Apple's device unit volumes to game console unit volumes). What has changed in the semiconductor industry is overwhelming demand for AI-focused GPUs, and that is paid for largely with speculative VC money (at this point, at least; AI companies are starting to figure out monetization).

Except AMD would be happy to take up any excess capacity TSMC has, to compete with Intel and Nvidia.

Regardless, the fab industry runs on short- and mid-term auction-like planning.

If Nvidia pays more, Apple has to match.


> Regardless, the fab industry runs on short- and mid-term auction-like planning

Not a system that necessarily works all that well if one player has a short-term ability to vastly outspend all the rest.

You can't let all your other customers die just because Nvidia is flush with cash this quarter...


> die

Is the argument that Apple will go out of business? AAPL?

Wait,

> one player has a short-term ability to vastly outspend all the rest.

I assure you, Apple has the long-term and short-term ability to spend like a drunken sailor all day and all night, indefinitely, and still not go out of business. Of course they’d prefer not to. But there is no ‘ability to pay’ gap here between these multi-trillion-dollar companies.

Apple will be forced to match or beat the offer coming from whoever is paying more. It will cost them a little bit of their hilariously-high margins. If they don’t, they’ll have to build less advanced chips or something. But their survival is not in doubt and TSMC knows that.


That's exactly how it's supposed to work, and Apple has outspent competitors for ages to get priority.

TSMC isn't running a charity, it sells capacity to the highest bidder.

Of course, customers as big as Apple have a relationship and insane volumes, so they'll be guaranteed significant quotas regardless.


Why should it be short term, though?

If it takes 4 years to build a new fab and Apple is willing to commit to paying the price of an entire fab for chips to be delivered in 4 years' time, why not take the order and build the capacity?


I mean, these things are likely already contractually locked in, and Apple still gets lots of capacity for the reasons you mention.

But Nvidia has also spent billions a year at TSMC for more than a decade, and that just keeps increasing.


> Not a system that necessarily works all that well if one player has a short-term ability to vastly outspend all the rest.

Well yeah, people were pointing that out back when Apple bought out the entirety of the 5nm node for iPhones and e-tchotchkes. It was sorta implicitly assumed that any business that builds better hardware than Apple would boot them out overnight.


> It was sorta implicitly assumed that any business that builds better hardware than Apple would boot them out overnight

It's not "build better hardware" though, it's "continue to ship said hardware for X number of years". If someone buys out the entire fab capacity and then goes under next year, TSMC is left holding the bag


It's not that, either. Low-margin, high-volume contracts are the worst business you can take. They devalue TSMC's work and create an unnatural downward force on the price of cutting-edge silicon. By ignoring Apple's demands, they're creating natural competition that raises the value of their entire portfolio.

It really is about making better hardware. Apple would be out-bidding Nvidia right now, but only if the iPhone had equivalent value-add to Nvidia hardware. Alas, iPhones are overpriced and underpowered, most people will agree.


> but only if the iPhone had equivalent value-add to Nvidia hardware... iPhones are overpriced and underpowered, most people will agree

I'd argue this from almost the opposite direction - there is no value-add for Apple because high-end smartphones exceeded the performance requirements of their user base generations ago.

Nvidia has a pretty much infinite performance sink here (at least as long as training new LLMs remains a priority for the industry as a whole). On the smartphone side, there just isn't the demand for drastic performance increases - and in practice, many of us would like power and cost reduction to be prioritised instead.


I would also bet significant money that Apple's unique market position will give them the confidence to invest in in-house fabrication before 2030.

The R&D and equipment costs for fabrication keep growing at something close to an exponential rate - which is why so many players have gotten out of the game, why companies with fabs like Samsung and Intel still use TSMC for some parts, and why even Intel is now trying to justify the cost of new processes by becoming a contract fab.

I can certainly see Apple taking a large stake and board position in fabricators, but I can't see them being able to justify the ongoing investment in a closed fab.


Would it be feasible for them to buy Intel instead? Starting your own foundry would likely take over a decade.

Yup; or potentially just purchasing a fab from them, given that Intel has signaled they want to leverage TSMC more, and much of Intel's remaining value is wrapped up in server-grade chips that Apple wouldn't be interested in.

But also: Apple is one of the very few companies at their size that seems to have the political environment to make, and more importantly succeed at, decade-long investments. The iPhone wasn't an obvious success for 5 or 6 years. They started designing their own iPhone chips around the iPhone 4, iirc, and pundits remarked that it wasn't a good idea; today, the M5 in the iPad Pro outperforms every chip made by EVERYONE else in the world, by 25%, at a tenth the power draw and no active cooling (e.g. vs the 9950X3D). Apple Maps (enough said). We're seeing similar investments today, things we could call "failures" that in 10 years we'll think were obviously going to be successful (cough, Vision Pro).


> Apple is one of the very few companies at their size that seems to have the political environment to make, and more importantly succeed at, decade-long investments.

Definitely! But I'd reckon they would want to bootstrap that part of their supply chain as soon as possible? Say China does invade Taiwan: suddenly their main supplier is gone and Intel's capacity mostly goes to military and other high-margin segments. If they instead own Intel, they not only control the narrative but also capitalize on the increase in Intel's value.


> the M5 in the iPad Pro outperforms every chip made by EVERYONE else in the world

No, it does not. The core inside the M5 is faster than every other core design in single-threaded burst performance. That is common for small machines with a low core count and no hyperthreading.

The chip itself does not outperform every other chip in the world, nor is it 10x more efficient than the 9950X3D. That's not even napkin math at that point; you're making up numbers with no relation to the relevant magnitudes.


The 9950X3D has a TDP of 170 watts. M5 has an estimated TDP of around 20 watts.

The comparison point was single-core performance, which certainly makes the TDP comparison unfair if the two are interpreted together. The numbers are ballpark-correct.
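
(For what it's worth, taking both figures at face value: 170 W / 20 W ≈ 8.5x, so the "tenth the power draw" claim upthread is in the right ballpark at the package level, even if a per-core comparison would shift the numbers.)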

No one else is remotely close to Apple. Apple could stop developing chips for four years, and it’s very likely they would still ship the most efficient core architecture, and sit in the top five in performance. If you’re quibbling over the semantics of this particular comparison, you are not mentally ready for what M5 Ultra is going to do to these comparisons in a few months.


> The numbers are ballpark-correct.

The numbers do not exist in isolation. They are "interpreted together" because statistics are more than just advertisement lines. The TDP comparison is mind-bogglingly stupid and you should really feel ashamed for defending it if you care about statistical integrity.

> you are not mentally ready for what M5 Ultra is going to do to these comparisons in a few months

I hope so. The past Ultra chips have been losing to Nvidia laptop GPUs in raster and compute efficiency.


Is Qualcomm's Snapdragon X2 Elite so far behind?

I doubt it, particularly not four years.


Can you buy and independently test a Snapdragon X2 Elite? You can go buy M5 today.

Apple could afford Intel, and could get past antitrust by arguing military security. Whose mobile phone can politicians trust?

Then again, Microsoft should have bought Intel: MS has roughly $102 billion in cash (+ short-term investments). Intel’s market value is approximately $176 billion. Considering Azure, Microsoft has heaps of incentive to buy Intel.

I would guess Google are more likely to greenfield develop their own foundry rather than try and buy Intel.


> and could get past antitrust by arguing military security

Antitrust would certainly block Apple specifically for this reason. Apple is not a credible supplier of DoD hardware and acquiring IFS would complicate their status as a Trusted Foundry.

If Apple had more time to reform their image and invest in MIL-STD processes then maybe it would work. As-is, I'd be shocked if the US let Intel become the victim of a hostile takeover. Even for a company as important as Apple.


They could do it, but I wonder if it makes sense financially. It's probably easier for a neutral foundry like TSMC to recoup the costs by selling the capacity to whomever for years to come. Apple probably isn't interested in getting into the foundry business, so they'd be the ones who'd have to use all of a production line's capacity for as long as it's running.

Very much this.

Apple has to price in the risk of the US government forcing their hand in various ways. They have a negotiating disadvantage.

On the other hand, it's not like Apple can just switch fabs without any cost or difficulty. Sure, TSMC is undoubtedly happy to have a customer with predictable needs, but Apple is also subject to some level of lock-in.

I doubt that we will hit diminishing returns in AI. We keep finding new ways to make models faster or cheaper or better, or even have them train themselves...

The flat-line prediction is now 2 years old...


Many things that look exponential originally turn out to actually be sigmoidal.

I consider the start of this wave of AI to be approximately the 2017 Google transformer paper, and yet transformers didn't really have enough datapoints to look exponential until GPT-3 in 2020.

The following is purely speculation for fun and sparking light-hearted conversation:

My gut feeling is that this generation of models transitioned out of the part of the sigmoid that looks roughly exponential after the introduction of reasoning models.

My prediction is that transformer-based models will start to enter the phase that asymptotes to a flatline in 1-2 years.

I leave open the possibility that a different kind of model emerges that is still on an exponential, but I don't believe transformers are right now.
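
A quick sketch of why that's hard to call from inside the curve (made-up parameters, purely illustrative): a logistic (sigmoid) curve tracks a pure exponential almost exactly until it nears its ceiling.

    import math

    def exponential(t, r=1.0):
        return math.exp(r * t)

    def logistic(t, r=1.0, K=1000.0):
        # Logistic growth starting at 1 with carrying capacity K.
        # While the value is far below K, this is ~e^(r*t).
        return K / (1 + (K - 1) * math.exp(-r * t))

    for t in range(10):
        print(t, round(exponential(t), 1), round(logistic(t), 1))
    # The two columns track closely for the first several steps; the
    # logistic curve only visibly bends away as it approaches K=1000 -
    # exactly the part you can't see from early datapoints.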


Feels like the top of the S-curve lately.

I thought the prediction was that scaling LLMs up would stop making them better, not that all advancement would stop? And that has pretty much happened, as all the advancements over the last year or more have been architectural, not from scaling up.

You say that, but to me they seem roughly the same as they've been for a good while. Wildly impressive technology, very useful, but also clearly and confidently incorrect a lot. Most of the improvement seems to have come from other avenues - search engine integration, image processing (it still blows my mind every time I send a screenshot to an LLM and it gets it) and stuff like that.

Sure, maybe they do better in some benchmarks, but to me the experience of using LLMs is and has been limited by their tendency to be confidently incorrect, which undermines both their illusion of intelligence and their usefulness. And I don't really see any clear path past this hurdle; I think this may just be about as good as they're gonna get in that regard. Would be great if they prove me wrong.


DeepSeek, Nvidia, and Meta are pumping out one paper after another.

New and better things are coming. They will just take time to implement, and I doubt anyone will cancel current training runs. So I'd guess it will take up to a year for the new things to come out.

Can the bubble burst in this time, because people lose patience? Of course. But we are far from the end.


Published papers do not a convincing "AI" make. But there's no point arguing this really; we'll see what happens.

That's a really hilarious take given Nvidia's history with TSMC.

Apple was favored by TSMC because they brought TSMC more money. Now Nvidia is bringing TSMC more money.

[flagged]


Louis Vuitton didn't make 18% of all handbags in 2024.


