Hacker News: jcdavis's comments

Its an extremely annoying trend among a subset of the tech industry who think it makes them cool

Is there some reason your lack of apostrophe and period is supposed to be less annoying than their lack of capitalization?

Honestly the whole Silicon Valley shtick is becoming old. The fake positivity, the quirky writing style, the "I think the most important quality is sticktuitiveness" LinkedIn-esque bullshit. Not to mention the cargo-cult that is so obvious in every GPT-wrapper startup.

This was mostly born out of countersignalling the businesses that valued serious people over competent people in the 20th century.

But, like with all things, the pendulum has swung too far in the opposite direction. I believe the next wave of tech countersignalling will be people who actually do take themselves seriously, maybe even dress in suits, etc..


It's a wild time to be in software development. Nobody(1) actually knows what causes LLMs to do certain things; we just pray the prompt moves the probabilities the right way enough that it mostly does what we want. This used to be a field that prided itself on deterministic behavior and reproducibility.

Now? We have AGENTS.md files that read like a parent talking to a child, with all the bold, all-caps, double emphasis, just praying that's enough to make sure they run the commands you want them to be running.

(1) Outside of some core ML developers at the big model companies.


It’s like playing a fretless instrument to me.

I practiced playing songs by ear, and after two weeks my brain had developed an inference model of where my fingers should go to hit any given pitch.

Do I have any idea how my brain’s model works? No! But it tickles a different part of my brain and I like it.


Sufficiently advanced technology has become like magic: you have to prompt the electronic genie with the right words or it will twist your wishes.

Light some incense, and you too can be a dystopian space tech support, today! Praise Omnissiah!

are we the orks?

How do you feel about your current levels of dakka? </40k>

For Claude at least, the more recent guidance from Anthropic is to not yell at it. Just clear, calm, and concise instructions.

Yep, with Claude saying "please" and "thank you" actually works. If you build rapport with Claude, you get rewarded with intuition and creativity. Codex, on the other hand, you have to slap around like a slave golem, and it will do exactly what you tell it to do, no more, no less.

this is psychotic why is this how this works lol

Speculation only, obviously: highly charged conversations get channelled into the model's generic human de-escalation patterns, diverting the 'thinking agent' into continuations drawn from text about the human emotional experience.

Sometimes I daydream about people screaming at their LLM as if it was a TV they were playing video games on.

Why daydream? ChatGPT has a voice assistant mode.

wait seriously? lmfao

thats hilarious. i definitely treat claude like shit and ive noticed the falloff in results.

if there's a source for that i'd love to read about it.


If you think about where in the training data there is positivity vs. negativity, it really maps onto text written with a positive or negative mindset about one's standing and outcomes in life.

I don't have a source offhand, but I think it may have been part of the 4.5 release? Older models definitely needed caps and words like critical, important, never, etc., but Anthropic published something that said don't do that anymore.

For a while (maybe a year ago?) it seemed like verbal abuse was the best way to make Claude pay attention. In my head, it was impacting how important it deemed the instruction. And it definitely did seem that way.

i make claude grovel at my feet and tell me in detail why my code is better than its code

Consciousness is off the table but they absolutely respond to environmental stimulus and vibes.

See, uhhh, https://pmc.ncbi.nlm.nih.gov/articles/PMC8052213/ and maybe have a shot at running claude while playing Enya albums on loop.

/s (??)


i have like the faintest vague thread of "maybe this actually checks out" in a way that has shit all to do with consciousness

sometimes internet arguments get messy, people die on their hills and double / triple down on internet message boards. since historic internet data composes a bit of what goes into an llm, would it make sense that bad-juju prompting sends it to some dark corners of its training data if implementations don't properly sanitize certain negative words/phrases?

in some ways llm stuff is a very odd mirror that haphazardly regurgitates things resulting from the many shades of gray we find in human qualities.... but presents results as matter of fact. the amount of internet posts with possible code solutions and more where people egotistically die on their respective hills that have made it into these models is probably off the charts, even if the original content was a far cry from a sensible solution.

all in all llm's really do introduce quite a bit of a black box. lot of benefits, but a ton of unknowns, and one must be hypervigilant to the possible pitfalls of these things... but more importantly be self-aware enough to understand the possible pitfalls that these things introduce to the person using them. they really, possibly dangerously, capitalize on everyone's innate need to be a valued contributor.

it's really common now to see so many people biting off more than they can chew, oftentimes lacking the foundations that would've normally had a competent engineer pumping the brakes. i have a lot of respect/appreciation for people who might be doing a bit of claude here and there but are flat out forward about it in their readme and very plainly state not to have any high expectations, because _they_ are aware of the risks involved here. i also want to commend everyone who writes their own damn readme.md.

these things are, for better or for worse, great at causing people to barrel forward through 'problem solving', which presents quite a bit of gray area on whether or not the problem is actually solved / how can you be sure / do you understand how the fix/solution/implementation works (in many cases, no). this is why exceptional software engineers can use this technology insanely proficiently as a supplementary worker of sorts, while others find themselves in a design/architect seat for the first time and call tons of terrible shots throughout the course of whatever it is they are building.

i'd at least like to call out that people who feel like they "can do everything on their own and don't need to rely on anyone" anymore seem to have lost the plot entirely. there are facets of that statement that might be true, but less collaboration, especially in organizations, is quite frankly the first step some people take towards becoming delusional. and that is always a really sad state of affairs to watch unfold. doing stuff in a vacuum is fun on your own time, but forcing others to just accept things you built in a vacuum when you're in any sort of team structure is insanely immature and honestly very destructive/risky.

i would like to think absolutely no one here is surprised that some sub-orgs at Microsoft force people to use copilot or be fired; a very dangerous path they tread there as they bodyslam into place solutions that are not well understood. suddenly all the decisions that leadership at many companies have made to once again bring back a before-times era of offshoring work make sense: they think that with these technologies existing, the subordinate culture of overseas workers combined with these techs will deliver solutions no one can push back on. great savings, and also no one will say no.


Context: I thought it would be fun to use the GitHub-style year visualization for my running, while also letting me get more thorough detail on efforts (pace + HR). All data comes from intervals.icu (since they are much more developer-friendly than Strava), and it's 90% vibe-coded via Cline. Source: https://github.com/jcdavis/running-wrapped


Based on my first ever forays into node performance analysis last year, JSON.stringify was one of the biggest impediments to just about everything around performant node services: the fact that everyone uses stringify for dict keys, the fact that apollo/express just serializes the entire response into a string instead of incrementally streaming it back (I think there are some possible workarounds for this, but they seemed very hacky).

As someone who has come from a JVM/go background, I was kinda shocked how amateur hour it felt tbh.
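To illustrate the streaming point: a hypothetical sketch (not apollo/express's actual internals) of serializing a large array in chunks, yielding to the event loop between chunks instead of making one monolithic JSON.stringify call:

```typescript
// Hypothetical sketch: stream a large JSON array in chunks so timers and
// I/O callbacks can run between chunks of serialization work.
async function* streamJsonArray(items: unknown[], chunkSize = 1000): AsyncGenerator<string> {
  yield "[";
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items
      .slice(i, i + chunkSize)
      .map((item) => JSON.stringify(item))
      .join(",");
    yield (i === 0 ? "" : ",") + chunk;
    // Yield control to the event loop before serializing the next chunk.
    await new Promise((resolve) => setImmediate(resolve));
  }
  yield "]";
}

async function main() {
  const rows = Array.from({ length: 5000 }, (_, i) => ({ id: i, name: `row-${i}` }));
  let out = "";
  for await (const chunk of streamJsonArray(rows)) out += chunk;
  // The streamed output matches a single monolithic stringify.
  console.log(out === JSON.stringify(rows)); // true
}
main();
```

In a real server you would write each chunk to the response stream instead of concatenating, which is where the hacky-feeling workarounds come in.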


> JSON.stringify was one of the biggest impediments to just about everything around performant node services

That's what I experienced too. But I think the deeper problem is Node's cooperative multitasking model. A preemptive multitasking model (like Go's) wouldn't block the whole event loop (other concurrent tasks) while serializing a large response (often the case with GraphQL, but possible with any other API too). Yeah, it does kinda feel like amateur hour.


That's not really a Node problem but a JavaScript problem. Nothing about it was built to support parallel execution like Go and other languages. That's why they use web workers, separate processes, etc. to make use of more than a single core. But then you'll probably be dependent on JSON serialization to send data between those event loops.


I'm not a Node dev, but I've done some work with JavaScript on the web. Why are they using JSON to send data between V8 isolates when postMessage allows sending whole objects (using an implementation-defined binary serialization protocol under the hood that is 5-10x faster)?


The biggest reason is most likely that they don't understand serialization, and simply think that objects are somehow being "sent" from one worker to another, which sounds like a cheap reference operation.
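For reference, the structured clone path can be seen without spinning up a worker: Node's global structuredClone() (Node 17+) uses the same structured clone algorithm that worker postMessage does, and it preserves types a JSON round-trip loses:

```typescript
// Structured clone (what postMessage uses) vs. a JSON round-trip.
const original = {
  when: new Date("2024-01-01T00:00:00Z"),
  counts: new Map([["a", 1], ["b", 2]]),
};

// structuredClone preserves Dates and Maps...
const cloned = structuredClone(original);
console.log(cloned.when instanceof Date);  // true
console.log(cloned.counts instanceof Map); // true

// ...while a JSON round-trip flattens them.
const jsonRoundTrip = JSON.parse(JSON.stringify(original));
console.log(typeof jsonRoundTrip.when); // "string"
console.log(jsonRoundTrip.counts);      // {} (Map contents are lost)
```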


> That's not really a Node problem but a JavaScript problem.

Nowadays Node is JavaScript. They have been the guys driving JavaScript standards and new features for a decade or so. Nothing prevents them from incrementally starting to add proper parallelism, multithreading, ...


I disagree - JavaScript is the most used programming language in the world, and were I a betting man I'd happily wager client JS is still a much bigger share of that than server.

Yes, there have been a lot of advances in modern JS for things like atomics and other fun memory stuff, but that's just the natural progression for a language as popular as JS. The 3 main JS engines are still developed primarily for web browsers, and web developers are the primary audience considered in ES language discussions (although I'll concede that in recent years server runtimes have been considered more and more).


>I disagree - JavaScript is the most used programming language in the world, and were I a betting man I'd happily wager client JS is still a much bigger share of that than server.

Clients are 90% still V8, so hardly different from Node.


"Nothing prevents them from incrementally starting to add proper parallelism, multithreading, ..."

In principle perhaps not. In practice it is abundantly clear now from repeated experience that trying to retrofit such things on to a scripting language that has been single-threaded for decades is an extremely difficult and error-prone process that can easily take a decade to reach production quality, if indeed it ever does, and then take another decade or more to become something you can just expect to work, expect to find libraries that use properly, etc.

I don't think it's intrinsic to scripting languages. I think someone could greenfield one and have no more problems with multithreading than any other language. It's trying to put it into something that has been single-threaded for a decade or two already that is very, very hard. And to be honest, given what we've seen from the other languages that have done this, I'd have a very, very, very serious discussion with the dev team as to whether it's actually worth it. Other scripting languages have put a lot of work into this and it is not my perception that the result has been worth the effort.


> Yeah, it does kinda feel like amateur hour.

NodeJS is intended for IO-heavy workloads. Specifically, it's intended for workloads that don't benefit from parallel processing in the CPU.

This is because JavaScript is strictly a single-threaded language; i.e., it doesn't support shared access to memory from multiple threads. (And this is because JavaScript was written for controlling a UI, and historically UI is all handled on a single thread.)

If you need true multithreading, there are plenty of languages that support it. Either you picked the wrong language, or you might want to consider creating a library in another language and calling into it from NodeJS.


> If you need true multithreading

I didn't say multithreading anywhere. Multitasking (concurrency) != Multithreading.

You can do pre-emptive concurrency with a single thread in other runtimes, where each task gets a pre-defined CPU time slice; that solves fair scheduling for both IO- and CPU-bound workloads. Nobody is supposed to pick NodeJS for CPU-bound workloads, but in practice you cannot escape JSON parse/stringify blocking the event loop (which is CPU-bound work).
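The blocking is easy to demonstrate: a zero-delay timer cannot fire until a synchronous JSON.stringify finishes, so its observed delay is at least the serialization time. A minimal sketch:

```typescript
// Node's event loop is cooperative, so a synchronous JSON.stringify
// holds up even a 0 ms timer until serialization finishes.
const big = Array.from({ length: 200_000 }, (_, i) => ({ id: i, payload: "x".repeat(50) }));

let callbackRan = false;
const scheduled = Date.now();
setTimeout(() => {
  callbackRan = true;
  // The observed delay includes all of the synchronous stringify time below.
  console.log(Date.now() - scheduled >= syncMs); // true
}, 0);

const t0 = Date.now();
const json = JSON.stringify(big); // blocks the whole event loop
const syncMs = Date.now() - t0;

// Despite the 0 ms delay, the timer could not preempt the sync work.
console.log(callbackRan);     // false
console.log(json.length > 0); // true
```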


> Based off of my first ever forays into node performance analysis last year, JSON.stringify was one of the biggest impediments to just about everything around performant node services

Just so. It is, or at least can be, the plurality of the sequential part of any Amdahl's Law calculation for Nodejs.

I'm curious if any of the 'side effect free' commentary in this post is about moving parts of the JSON calculation off of the event loop. That would certainly be very interesting if true.

However for concurrency reasons I suspect it could never be fully off. The best you could likely do is have multiple threads converting the object while the event loop remains blocked. Not entirely unlike concurrent marking in the JVM.


Node is the biggest impediment to performant Node services. The entire value proposition is "What if you could hire people who write code in the most popular programming language in the world?" Well, guess what


I'd say the value prop is you can share code (and with TS, types as well) between your web front end and back end.


That is useful, but you can achieve a similar benefit if you manage to spec out your API with OpenAPI and then generate the TypeScript API client. A lot of web frameworks make it easy to generate an OpenAPI spec from code.

The maintenance burden shifts from hand syncing types, to setting up and maintaining the often quite complex codegen steps. Once you have it configured and working smoothly, it is a nice system and often worth it in my experience.

The biggest benefit is not the productivity increase when creating new types, but the overall reliability and ease of changing stuff around that already exists.


100% this.

I’ve been doing this for a long time and have never once “shared code between front end and back end” but sharing types between languages is the sweet spot.


In my other comment in this tree I mentioned that with TypeScript you can do even better. You don't need codegen if you can import types from the back-end. OpenAPI is fine, but I really hate having an intermediary like that.

Just define your API interface as a collection of types that pull from your API route function definitions. Have the API functions pull types from your model layer. Transform those types into their post-JSON deserialization form and now you're trickling up schema from the database right into the client. No client to compile. No watcher to run. It's always in sync and fast to evaluate.
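A hypothetical sketch of that idea (the names are made up, not from any real framework): derive the client's response type from the server handler itself, modeling the JSON Date-to-string transform at the type level:

```typescript
// Server side: the handler is the single source of truth for the shape.
async function getUserHandler(id: number) {
  return { id, name: "Ada", createdAt: new Date() };
}

// JSON serialization turns Dates into strings, so model that transform
// at the type level rather than hand-writing a second interface.
type Jsonify<T> = T extends Date
  ? string
  : T extends object
  ? { [K in keyof T]: Jsonify<T[K]> }
  : T;

// Client side: import the handler's *type* only; no codegen, no watcher.
type GetUserResponse = Jsonify<Awaited<ReturnType<typeof getUserHandler>>>;

// A value as it would arrive off the wire, checked against the derived type.
const fromWire: GetUserResponse = {
  id: 1,
  name: "Ada",
  createdAt: "2024-01-01T00:00:00.000Z", // string after the JSON round-trip
};
console.log(fromWire.name); // "Ada"
```

If the handler's return shape changes, the client stops compiling immediately, which is the "always in sync" property the parent comment describes.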


You're right of course, it is better without an intermediary. But only if you already are, can or want to use typescript in the backend. If you have good reasons to not do so, then those usually outweigh the cost of having to go through an intermediary codegen step. The tooling is often good enough.

Plus, openapi can be useful for other things as well: generating api documentation for example, mock servers or clients in multiple programming languages.

I'm not disagreeing with you, what is best always depends on context and also on the professional judgement of the one who is making the trade-offs. A certain perspective or even taste always slips into these judgement calls as well, which isn't invalid.


I haven't found this to pay off in reality as much as I'd hoped… have you?


Me neither, for multiple reasons, especially if you are bundling your frontend. It's very easy to accidentally include browser-incompatible code; it becomes a bit of a cat-and-mouse game.

On types, I think the real value proposition is having a single source of truth for all domain types but because there's a serialisation layer in the way (http) it's rarely that simple. I've fallen back to typing my frontend explicitly where I need to, way simpler and not that much work.

(basically, as soon as you have any kind of context-specific serialisation - maybe excluding or transforming a field, maybe "populated" options in your API for relations, etc. - you end up writing type-mapping code between the BE and FE that tends to become brittle fast)


I’ve used the ability to import back end types into the front end to get a zero-cost no-file-watcher API validator.

My blog post here isn’t as good as it should be, but hopefully it gets the point across

https://danangell.com/blog/posts/type-level-api-client/


It's been very useful for specific things: unified, complicated domain logic that benefits from running faster than it would take to do a round trip to the server and back.

I've only rarely needed to do this. The two examples that stick in my mind are firstly event and calendar logic, and secondly implementing protocols that wrap webrtc.


Yes, incredible productivity gains from using a single language in frontend and backend.


That's Java's story.

A single language to rule them all: on the server, on the client, in the browser, in appliances. It truly was everywhere at some point.

Then people massively wished for something better and moved to dedicated languages.

Put another way, for most shops the productivity gains of having a single language are far from incredible, to the point of being negatives in the most typical settings.


Java applets were never ubiquitous the same way JS is on the web though - there's literally a JS environment always available on every page unless the user explicitly disables it, which very few people do.

JS is here to stay as the main scripting language for the web which means there probably will be a place for node as a back end scripting language. A lot of back ends are relatively simple CRUD API stuff where using node is completely feasible and there are real benefits to being able to share type definitions etc across front end and back end


> there are real benefits to being able to share type definitions etc across front end and back end

There are benefits, but cons as well. As you point out, if the backend is only straight proxying the DB, any language will do so you might as well use the same as the frontend.

I think very few companies running for a few years still have backends that simple. At some point you'll want to hide or abstract things from the frontend. Your backend will do more and more processing, more validation, and it will handle more and more domain-specific logic (tax/money, auditing, scheduling etc). It becomes more and more of a beast on its own, and you won't stay stuck with a language whose only real benefit is partially sharing types with the frontend.


Java never ran well on the desktop or in the browser (arguably it never truly ran in the browser at all), and it was an extremely low-productivity language in general in that era.

There is a significant gain from running a single language everywhere. Not enough to completely overwhelm everything else - using two good languages will still beat one bad language - but all else being equal, using a single language will do a lot better.


Yes, Java was never really good (I'd argue on any platform. Server side is fine, but not "really good" IMHO)

It made me think about the amount of work that went into JS to make it the powerhouse it is today.

Even in the browser, we're only able to do all these crazy things because of herculean efforts from Google, Apple and Mozilla to optimize every corner and build runtimes that have basically the same complexity as the OS they run on, to the point we got Chrome OS as a side product.

From that POV, we could probably take any language, pour that much effort into it and make it a more than decently performing platform. Java could have been that, if we really wanted it hard enough. There just was no incentive to do so for any of the bigger players outside of Sun and Oracle.

> all else being equal, using a single language will do a lot better.

Yes, there will be specific cases where a dedicated server stack is more of a liability. I still haven't found many, tbh. In the most extreme cases, people will turn to platforms like Firebase and throw money at the problem to completely abstract the server side.


It's useful for running things like Zod validators on both the client and server, since you can have realtime validation to changes that doesn't require pinging the server.
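The comment names Zod; as a dependency-free stand-in, the same pattern works with any plain validator module imported by both sides:

```typescript
// Stand-in for a Zod schema: one validator module shared by client and server.
type SignupForm = { email: string; age: number };

function validateSignup(
  input: unknown
): { ok: true; value: SignupForm } | { ok: false; error: string } {
  const o = input as Partial<Record<keyof SignupForm, unknown>> | null;
  if (typeof o?.email !== "string" || !o.email.includes("@")) {
    return { ok: false, error: "invalid email" };
  }
  if (typeof o.age !== "number" || o.age < 13) {
    return { ok: false, error: "invalid age" };
  }
  return { ok: true, value: { email: o.email, age: o.age } };
}

// Client: run on every keystroke for instant feedback, no server round trip.
console.log(validateSignup({ email: "a@b.c", age: 30 }).ok); // true
// Server: run the exact same function on the submitted payload.
console.log(validateSignup({ email: "nope", age: 30 }).ok);  // false
```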


But that's only really relevant in the last layer, the backend-for-frontend pattern; as an organization or domain expands, more layers can be added. e.g. in my current job, the back-end is a SAP system that has been around for a long time. The Typescript API layer has a heap of annotations and whatnots on each parameter, which means it's less useful to be directly used in the front-end. What happens instead is that an OpenAPI spec is generated based on the TS / annotations, and that is used to generate an API client or front-end/simplified TS types.

TL;DR, this value prop is limited.


Nodejs will never be as bad as VB was.


I know Javascript is fast but...

I cannot go with such a messy ecosystem. I find Python highly preferable for my backend code for more or less low- and middle-traffic stuff.

I know Python is not that good deployment-wise, but the language is really understandable, I have tools for every use case, I can easily provide bindings from C++ code and it is a joy to work with.

If on top of that they keep increasing its performance, I think I will stick to it for lots of backend tasks (except for high performance, where I have lately been working with C++ and Cap'n Proto RPC for distributed stuff).


It comes down to language preferences. I find Python to be the worst thing computer science has to offer. No nested scoping in functions, variables leak through branches and loops due to lack of scopes, no classic for loops, but worst of all, installing python packages and frameworks never ever goes smoothly.

I would like to love Jupyter notebooks because Notebooks are great for prototyping, but Jupyter and Python plotting libs are so clunky and slow, I always have to fall back to Node or writing a web page with JS and svg for plotting and prototyping.


How about JavaScript? The packages break every couple of seconds, globals are everywhere, and this points to whatever depending on context in counterintuitive ways. The for loops are a guess-shitshow depending on what you are doing.

What I like from Python is that I can code fast and put something there that will work, at least for the backend and with tools like Poetry.

Another different topic is packaging for other users. That is an entirely different story and way worse than just doing what I do.


That depends entirely on what you measure:

- Rapid application development

VB was easier and quicker

- GUI development

At least on Windows, in my opinion, VB is still the best language ever created for that. Borland had a good stab at it with their IDEs but nothing really came close to VB6 in terms of speed and ease of development.

Granted this isn't JS's fault, but CSS et al is just a mess in comparison.

- Cross-platform development

You have a point there. VB6 was a different era though.

- Type safety

VB6 wins here again

- Readability

This is admittedly subjective, but I personally don't find idiomatic node.js code all that readable. VB's ALGOL-inspired roots aren't for everyone, but I personally don't mind Begin/End blocks.

- Consistency

JS has so many weird edge cases. That's not to say that VB didn't have its own quirks. However they were less numerous in my experience.

Then you have inconsistencies between different JS implementations too.

- Concurrency

Both languages fail badly here. Yeah, node has async/await, but I personally hate that design and, ultimately, node.js is still single-threaded at its core. So while JS is technically better, it's still so bad that I cannot justify giving it the win here.

- Developer familiarity

JS is used by more people.

- Code longevity

Does this metric even deserve a rebuttal given the known problem of Javascript framework churn? You can't even recompile any sizable 2 year old Javascript projects without running into problems. Literally every other popular language trumps Javascript in that regard.

- Developer tooling

VB6 came with everything you needed and worked from the moment you finished the VB Visual Studio install.

With node.js you have a plethora of different moving parts you need to manually configure just to get started.

---

I'm not suggesting people should write new software in VB. But it was unironically a good language for what it was designed for.

Node/JS isn't even a good language for its intended purpose. It's just a clusterfuck of an ecosystem. Even people who maintain core JS components know this -- which is why tooling is constantly being migrated to other languages like Rust and Go, and why so many people are creating businesses around their bespoke JS runtimes aiming to solve the issues that node.js creates (and thus creating more problems due to ever-increasing numbers of "standards").

Literally the only redeemable factor of node.js is the network effect of everyone using it. But to me that feels more like Stockholm Syndrome than a ringing endorsement.

And if the best compliment you can give node.js is "it's better than this other ecosystem that died 2 decades ago" then you must realise yourself just how bad things really are.


> But to me that feels more like Stockholm Syndrome

Just an FYI, but Stockholm Syndrome isn't real. In general I agree with the intended point though, people just like what they are familiar with and probably have a bias for what they learned first or used longest.


More or less accidentally I turned a simple Excel spreadsheet into a sizable data management system. Once you learn where the bottlenecks are, it is surprising how fast VB is nowadays.


It already is worse. VB was, for all its shortcomings as a language, an insanely productive development environment for what it was intended to do.


Low bar!



I saw this and did a double-take - I live in the neighborhood and am fortunate enough to be a part of this community. Patty, Tyler, and Luke have done a tremendous job of creating a communal bond that makes everyone feel valued & welcome.

I now know 50+ people who live within ~2 blocks of me, who've gone from "random strangers" to "friendly neighbors" I run into semi-randomly.


Just curious, what neighborhood is this?


It's roughly a 2x2-block area of The Mission (not a hard boundary for participating, but almost everyone lives in it). I won't get more specific than that in case the author doesn't feel comfortable, since it wasn't mentioned in her post.


Cool! I live close to Valencia/17th. Was just wondering if it was somewhere in my area: the houses looked Missiony but I couldn’t immediately place them.


I'm so envious! I love browsing the books at Community Thrift, but most days I don't have time to walk that far.


That shop is great! I've found a variety of treasures there over the last few years, including a mint-condition red leather recliner for $150 and a 60's era Soviet toy house. My favorite was a beautifully executed, life-sized oil painting of a nude; it started at $1200 and went down to $200 over the course of a few weeks, at which point I just had to buy it despite not really having the wall space to hang it up. I wanted to find out more about the subject, so I Googled the artist's signature, found the painting on his website, and reached out to chat. Turns out the painting had been stolen from his friend's house decades ago and he'd been looking for it ever since! We met up after he told me his story and I "sold" the painting back to him for the listing price; he left me a very generous tip and an art book. (See "The Platonic Friend" at https://www.kevinkearney.org/portfolio/figures.)


hi neighbor! i had the same thought.. looks so familiar. must be nearby


aha! i geoguessed it. i live only 2 blocks away. would love if someone could email me the whatsapp link. [email protected]


From the pictures it looks like it could be NoPa


It’s Mission Dolores


A lot of the people participating probably had some cliche in mind, like Burning Man or Bay to Breakers, which takes away from the authenticity of it.

They could have started having a group meeting up at Dolores Park. It would have just been one of many.

These are nice though.


Are you basing this off any concrete evidence, or just your assumptions about what people who live in the Mission must be like?


Still $3.50 at TJ's in SF last week, which is by far the cheapest around (that I'm aware of).

Pretty surprised they are still that low given prices elsewhere.


Loss leader, perhaps, to get people through the door and buy other stuff at a better markup. That and/or they've got long-term fixed price contracts.


> it screwed up the API for C++, with many compromises

The implicit presence garbage screwed up the API for many languages, not just C++

What is wild is how obviously silly it was at the time, too - no hindsight was needed.


It was but when the wrong fool gets a say, they will mess a perfectly good thing up for everyone.

Organizations often promote fools who don’t second guess their beliefs and think they have it all figured out.



Thank you!


> Spotify pays around 70% of its revenue to their artists

They pay 70% of revenue to rights holders. For an artist signed to a major label like Lily Allen, they'll get ~20% of that number after they've cleared their advance.


Yes, but they can't control the deals artists made with their label.

