Hacker News | ActorNightly's comments

Wow, spend $40k to get the same tokens/second in QWEN as you would on a 3090.

I have a feeling that Mac fans obsess more about being able to run large models at unusably slow speeds than about actually using said models for anything.


There is zero chance of a revolution happening in the United States.

Trump could literally go on TV and r*e a kid on stage in 4K, and most people would go to work and be like "damn, what weird times we're living in."

If shit gets bad enough economically, to the point where there are food shortages and the like, you will see the US split before any revolution happens.


Two things.

First, this "both sides bad" take isn't fooling anyone. Everyone sees through your bullshit that you are pro-Trump. It's easy to tell from just this comment, but if anyone thinks I'm being presumptuous, feel free to look at your comment history and you will see I'm right.

https://news.ycombinator.com/item?id=46890675#46891294

Secondly, the shitty thing for you is that the conservatives in charge have shown themselves to be deeply inept. They could have honestly just ridden out the rest of Trump's term in silence, and Trump would still have been very popular despite the tariffs, but they had to fuck it up in the most grandiose way possible by starting a new war.

Which means that Republicans are going to lose the support of the average person who is clueless about politics and votes one way or another based on vibes.

Which means Dems are likely going to take a lot of the power back. At which point, it will become socially acceptable to "punish" conservatives and pro-Trump people. There is already work going on to process internet comments and extract patterns of speech, cross-correlating them across accounts on social media to ID specific people, and if you're ID'ed, you'd better believe your work, your family, your friends, and whoever else you are connected to are going to get spammed and your life ruined as much as possible.

So I'mma be the nice guy and tell you to tighten up your OPSEC, because you are doing an extremely poor job of it.


>know people who are intelligent and appear to not be hateful SOBs that voted for the clowns, and would do so again.

They are not intelligent.

People seem to think that intelligence can be isolated. It's not. People can fake intelligence by memorizing a bunch of facts, but that's not intelligence. Every part of a person's mind influences every other part.

And it's easy to test as well. Nobody who is a hard conservative can answer this simple question: "What concrete, hard evidence would you need to see to realize you have been wrong and change your stance on which party you support?"

It's along the same lines as how a stupid person doesn't realize they are stupid: if they did, they would know what differentiates smart from stupid and could thus become smart.


>Apple has been tightening that control over time. For a long time on Mac OS X you could simply run apps. Then came notarisation, but you could still disable it. Now, even with a certificate, it still shows a dialog. I wish that apps that went through notarisation would simply run like the ones from the app store, without a dialog showing.

The thing is, Apple has never been about developers; its main thing since its inception has been selling an image. A lot of people were excited about the iPhone when it first came out, then quickly realized how locked down it was, and that it didn't even have basic copy and paste.

Even now, if you look at the ANE (Apple Neural Engine) in the age of LLMs, all of it is locked down, specifically because it's only for Apple to use.


Don't waste time trying to run models locally.

Instead, take advantage of Termux's power, namely the fact that you can install things like OpenClaw or gemini-cli. Google AI Plus or Pro plans are actually really good value, considering they bundle it with storage.

https://www.mobile-hacker.com/2025/07/09/how-to-install-gemi...

There is also Termux:GUI, with bindings for several languages, which you can use to vibe-code your own GUI app that can then serve as an interface to an agent, plus a Termux API which lets you interface with the phone, including USB devices.

Furthermore, Termux has the cloudflared package available, which lets you use Cloudflare's free SSH tunnels (as long as you have a domain name).

All put together, you can do some pretty cool things.


Yeah, let's add more cost and complexity in a cooling system so that instead of 1 token per second we get 2 tokens per second, all at the price of one graphics card that can do 50+ tokens a second.

Apple fans never cease to amaze me.


The problem is that the hardware is still around $3,000. Making anything run on Macs is an exercise in futility, and it's a shame that people get duped into buying Macs for LLM inference.

$3,000 for running a model with 397B total parameters is quite a bargain. The Mac is being used here for its access to fast internal storage, since that's the key bottleneck. You could probably achieve similar outcomes with conventional (even fairly low-end) iGPU/APU hardware plus a fast PCIe 5.0 x4 SSD (which would also allow you to overlap SSD transfers with iGPU/APU compute), but the cost would be in a similar range, unless you carefully chose low-end hardware (e.g. Intel) with proper PCIe 5.0 x4 NVMe support, which is still quite uncommon, especially in laptops.
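A rough back-of-envelope for the SSD-streaming setup described above. All figures here are illustrative assumptions, not measurements: suppose the model is MoE with ~30B parameters active per token, weights are quantized to 4 bits, and the SSD sustains ~14 GB/s sequential reads (roughly PCIe 5.0 x4 NVMe). Then decoding speed is capped by how fast the active weights can be read per token:

```python
# Back-of-envelope: decode speed when weights stream from SSD every token.
# All numbers below are illustrative assumptions, not measured values.

active_params = 30e9     # assumed parameters touched per token (MoE active set)
bytes_per_param = 0.5    # 4-bit quantization -> half a byte per parameter
ssd_bandwidth = 14e9     # bytes/sec, roughly PCIe 5.0 x4 NVMe sequential read

bytes_per_token = active_params * bytes_per_param       # data read per token
tokens_per_sec = ssd_bandwidth / bytes_per_token        # bandwidth-bound ceiling

print(f"{tokens_per_sec:.1f} tokens/sec")  # prints "0.9 tokens/sec"
```

Under these assumptions the ceiling is about one token per second, which is why both sides of this argument can be right: the setup does run a 397B model for ~$3k, and it is also far slower than a GPU that holds a smaller model entirely in VRAM.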

If you want to flex on being able to run 397B-parameter models at unusably slow tokens/second, sure.

You can buy a 3090 for $2k and run QWEN3.5 at 50+ tokens a second, and it will do everything you need, especially if you give it enough context.


I really don't know why people have such nostalgia for old Apple devices. Did people really enjoy clicking on some app, then waiting like five minutes while the cursor does the spinning thing as the app opens?

It used to be that you were looked down on if you used an Apple device, because it meant you were more concerned with aesthetics rather than actual usability.


Experiences vary. Back when my computer was a circa 2000 CRT iMac DV, it was the nicest computer I’d used. Not the fastest, but also not nearly as much trouble as the crashy Win98 boxes I’d been exposed to at the time. It was more than enough for me to explore computing, and when OS X came along acquainted me with the *nix command line and “real” software development with its free bundled dev tools. It’s not an exaggeration to say that I probably wouldn’t be a dev today had it not been for that gumdrop of a computer.

As the sibling comment notes, the distinctive look helps too. I thought it was cool then and still like it today. It wasn’t everybody’s cup of tea but that’s exactly why it’s so appealing to those who like it.


I mean, I certainly remember Win 95/98 BSODs back in the day, but applications were at least usable on a Windows computer. Whereas in middle school, the labs had those green iMac G3s where you had to wait forever for anything to open.

It’s all relative, I suppose. The machine I had been using prior to that iMac had half the clock speed and an eighth as much memory, so the iMac felt speedy in comparison.

Later iterations of the iMac G3 also addressed some bottlenecks in the earlier models which might also factor in.


I also remember school Macs being slow and unreliable at times. I wonder how much of that was related to how they were provisioned, with network accounts and stuff to let the IT guy spy on you and lock your computer.

School computers will always suck, regardless of platform. Along with corporate computers.

The unique aesthetics give it immense nostalgia value. They literally don't make them like they used to.

Let's look at Java in the modern day.

* Most mature Java projects have moved to Kotlin.

* The standard build system uses Gradle, whose build scripts are written in Groovy or Kotlin and compiled for the JVM, which then builds your Java.

* Log4shell, amongst other vulnerabilities.

* Super slow to adopt features like async execution.

* Standard repo usage is terrible.

There is no point in using Java anymore. I don't agree that Rust is a replacement, but between Python, Node, and C/C++ extensions to those, you can do everything you need.


> Python, Node, and C/C++ extensions to those, you can do everything you need.

Or you can use Java and have libraries that cover almost anything provided in those languages, with access to a massive pool of labour when needed.

> * Log4shell, amongst other vulnerabilities.

As if no Python, JS, or C/C++ libraries ever had vulnerabilities? That's a non sequitur; every ecosystem has security issues, and the most important aspect is how quickly they are fixed. Given Java's massive size, a lot of libraries see high usage and are actively developed, so security patches are released quite quickly.

> * Standard repo usage is terrible.

What does this even mean? Standard library?

Java has its place; it's boring technology that gets things done and lets companies hire from an immense pool.

By the way, over a 25-year career I have professionally worked with Java, Scala, Kotlin, Clojure, Obj-C, Go, Python, Ruby, PHP, JS, even ASP 3.0, and some .NET (C# and F#). I'm not a Java purist, but I call your arguments a bit bullshit. All of these languages have their places, strengths, and weaknesses; the sooner you realise they are tools, and that if they are widely used there is probably something valuable about each of them, the sooner you'll stop wasting time arguing why "X sucks, use Y".

Use the best tool for the job, knowing more tools is never bad.


> Java and have libraries that cover almost anything provided in those languages,

This is pretty funny.

For example, the other day I wrote a menu for macOS using rumps. Simply pip install rumps, write code, run, and boom: a macOS menu. Let me know when I can do the equivalent in Java, or any other "performant" language.

>As if no Python, JS, C/C++ libraries ever had vulnerabilities?

Comparing the severity of log4shell to any Python vulnerability is beyond crazy.

You have the Apache foundation pushing its logging library as the industry standard, and multiple people saw no problem not only with the idea of a log statement being able to execute arbitrary code from the internet, but with making it the default behavior.
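To illustrate the class of bug: the danger was that the logger itself interpreted `${...}` lookups found inside the message it was asked to record, so attacker-supplied input got evaluated instead of merely stored. Here is a toy Python model of that anti-pattern (the names `FAKE_ENV`, `expand_lookups`, and `log` are made up for this sketch; real log4j resolved `${jndi:...}` lookups, which could fetch and run remote code, while this toy only expands environment-style lookups to show the mechanism):

```python
import re

# Toy model of a lookup-expanding logger (the log4shell-style anti-pattern).
# This is NOT log4j's code; it only demonstrates why evaluating lookups
# inside logged messages is dangerous.

FAKE_ENV = {"USER": "alice", "SECRET_TOKEN": "hunter2"}  # stand-in secrets

def expand_lookups(message: str) -> str:
    # Replace ${env:NAME} with the value of NAME: evaluation, not storage.
    return re.sub(r"\$\{env:([A-Z_]+)\}",
                  lambda m: FAKE_ENV.get(m.group(1), ""), message)

def log(message: str) -> str:
    # The bug: expansion runs on the *whole* message, including user input.
    return "[INFO] " + expand_lookups(message)

# The attacker controls part of the logged string (say, a User-Agent header):
user_input = "hello ${env:SECRET_TOKEN}"
print(log("request from " + user_input))
# prints "[INFO] request from hello hunter2" -- the secret leaks because
# the logger evaluated a lookup inside attacker-controlled data.
```

The actual vulnerability was worse than this leak: with JNDI lookups enabled by default, the "value" fetched could be a remote class, turning a log line into remote code execution.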

If, at that point, everyone had instantly abandoned all Apache software in Java, I would have more respect for Java devs. But of course they can't, because the ecosystem is so small that there are no replacements, so everyone is forced to cuck out to Apache, and who knows what other idiotic decision they are going to make, and when.

And as a reminder, this used to be a thing https://www.reddit.com/r/java/comments/19s23g/online_counter...

There are plenty of other issues to cover with Java, but log4shell is pretty much indefensible. Even if I'm wrong about everything else, my argument still stands on that alone.


Ok, don't use Java, it's fine :)

It's not about personal use; it's about people getting facts wrong about Java.

Facts wrong about Java, that's quite a nothing sentence across this thread, innit?

> Most mature Java project has moved to Kotlin.

Demonstrably false, not even close

Re Gradle using Groovy/Kotlin: so what? Gradle is no more a standard than Maven, and Java is not primarily used as a scripting language, so it makes sense that it has a different language for its config files. What's the deal here?

Show me a language without vulnerabilities.

It has had virtual threads for quite some time, and they are a much, much better choice than async for most use cases.


>Demonstrably false, not even close

Are you arguing that the Android ecosystem uses Java? Because it has most certainly moved to Kotlin, and will soon move even off of that.

>Show me a language without vulnerabilities.

Vulnerabilities vary on a scale, both in severity and in how they were introduced.

Nearly every language has libraries with bugs that can create vulnerabilities. Log4shell wasn't a bug: it was introduced intentionally, without anyone at Apache looking at it and thinking it was wrong, knowing that log4j is the most widely used logging library for Java.


> Are you arguing that the Android Ecosystem uses Java

Now you are just arguing in bad faith. Who is talking about Android, which historically lagged a decade behind OpenJDK and isn't particularly good at staying up to date even today? People moved to Kotlin versus a Java without goddamn lambdas; is that your argument? Especially since Android is a tiny segment compared to the vastness of web backends. Are Google's, Apple's, and Alibaba's backends, Amazon's cloud, etc. insignificant in your mind?

And you may want to browse the list of vulnerabilities; there are plenty of interesting ones.

