
The underlying problem here is giving any model direct access to your primary system. The model should be running in a VM or container with limited privileges.
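For concreteness, here's a minimal sketch of what "limited privileges" can look like: a Python helper that executes an agent's tool call inside a throwaway Docker container instead of on the host. The image name, resource limits, and the helper itself are illustrative assumptions, not a prescription.

```python
import subprocess

def run_sandboxed(command: list[str]) -> str:
    """Run a command in a locked-down, disposable container (sketch)."""
    docker_cmd = [
        "docker", "run",
        "--rm",                # delete the container when it exits
        "--network", "none",   # no network access
        "--read-only",         # read-only root filesystem
        "--cap-drop", "ALL",   # drop all Linux capabilities
        "--memory", "512m",    # cap memory use
        "--pids-limit", "64",  # cap process count
        "--user", "1000:1000", # run as an unprivileged user
        "python:3.12-slim",    # hypothetical base image
    ] + command
    result = subprocess.run(
        docker_cmd, capture_output=True, text=True, timeout=60
    )
    return result.stdout

# Example: let the model execute code without touching the host.
print(run_sandboxed(["python", "-c", "print('hello from the sandbox')"]))
```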

This is like saying it's safer to be exposed to carcinogenic fumes than to nerve gas, when the real solution is to wear a respirator.

Also, what are you doing allowing someone else to prompt your local LLM?


