* Growth Chief of Staff | SF | Full-time | $125K–$195K + equity
* Applied AI Systems Engineer | SF, Remote (North America) | Full-time | $120K–$180K + equity
We’re mission-driven capitalists: making life-saving drugs more accessible, and building a $200B+ company—on the scale of Palantir or ServiceNow—serving the largest healthcare enterprises.
Traction: profitable and growing fast. Selling 7+ figure contracts to enterprise healthcare customers.
Team: built by exited founders, YC & MIT alumni, and ex-Tesla and ex-Google engineers.
Top investors: funded by elite Silicon Valley VCs who've backed unicorns like DoorDash, Lyft, and Mammoth Biosciences, plus strategic healthcare investors with deep industry connections.
Outsized impact & opportunity: work at the intersection of agentic AI, healthcare transformation, and life-changing patient outcomes.
If you want to work on a team of A-player athletes, doing the best work of your career, and helping get life-saving drugs to the people who need them, apply here: https://jobs.ashbyhq.com/neon-health (and make sure to mention HN!)
Neon Health | AI agents in healthcare | Hiring Eng and BizOps | SF, Remote (North America)
If you want to work on a team of A-player athletes, doing the best work of your career, and helping get life-saving drugs to the people who need them, apply here: https://neonhealth.com/careers#open-positions (and make sure to mention HN!)
If you want to work on a team of A-player athletes, doing the best work of your career, and helping get life-saving drugs to the people who need them, apply here: https://dub.sh/YC250601 (and make sure to mention HN!)
Good question! Our frequent offsites allow us to get most of the benefits of a tight-knit in-person team, while still having flexibility to bring on teammates working outside the SF Bay Area.
TLDR:
- We prefer SF Bay Area-based teammates, but are open to remote – as long as they can work PST hours.
- And we have frequent offsites to bring the team together.
Gross to see toxic attitudes normalized like this. These folks could all benefit from reading The 15 Commitments of Conscious Leadership: https://conscious.is/15-commitments
In particular, the bit about taking responsibility for one's circumstances:
> I commit to taking full responsibility for the circumstances of my life, and my physical, emotional, mental and spiritual wellbeing. I commit to support others to take full responsibility for their lives.
vs
> I commit to blaming others and myself for what is wrong in the world. I commit to being a victim, villain, or a hero and taking more or less than 100% responsibility.
> I commit to taking full responsibility for the circumstances of my life, and my physical, emotional, mental and spiritual wellbeing
No thanks. This is a uniquely neoliberal perspective on the collective human experience. The sentiment is toxic because many of the circumstances of one's life in the 21st century are not under one's control at all.
And even worse: after spending the time to infer what the user wants, the software might guess wrong.
Therefore it probably makes sense to constrain the software's use cases, either (i) to specific functions (e.g., product, engineering) or (ii) to specific roles across functions (e.g., management, support-desk triage).
An added benefit of constraining use cases like this is that it makes the product easier to market to a specific audience.