
I mean, it's difficult to 'observe' gradient descent: there are no characteristic properties you can identify without specifying the relevant objective function. But most process theories in computational neuroscience are based on some form of gradient descent. Even if it's only implicit, you'll be able to describe the variables of the system as moving against the gradient of some function.
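To make that concrete, here's a minimal sketch of the idea: a variable descending a toy objective f(x) = (x - 3)^2 (the function and step size are arbitrary choices for illustration, not anything from a specific neuroscience model):

```python
# Gradient descent on a toy objective f(x) = (x - 3)^2,
# whose gradient is f'(x) = 2 * (x - 3).

def grad(x):
    return 2.0 * (x - 3.0)

x = 0.0    # initial state of the system
lr = 0.1   # step size
for _ in range(100):
    x -= lr * grad(x)  # the variable moves against the gradient

print(round(x, 4))  # converges toward the minimum at x = 3
```

The point of the comment is the converse direction: if you watched only the trajectory of `x`, you couldn't call it "gradient descent" without already knowing which objective it descends.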

But yes, it's extremely unlikely that nature implements backpropagation directly, as it relies on non-local gradients.


