Hacker News

I'm wondering whether there is a way to combine optimization of model weights in a neural net with a set of heuristics that limit the search space, as a sort of rules engine/decision tree integrated into ANN backprop training: basically pruning irrelevant and redundant features early and focusing on the more informative ones.


Yes, there are many approaches like that. One well-known example is the "lottery ticket" procedure: train a network, prune its smallest-magnitude weights, then mask the pruned weights and retrain the resulting sparse network from scratch, starting from the original untrained weights.
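A minimal NumPy sketch of that train/prune/rewind loop on a toy linear model (the data, the `train` helper, and the 80% pruning ratio are all illustrative assumptions, not any particular paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends only on the first 2 of 10 features.
X = rng.normal(size=(200, 10))
true_w = np.zeros(10)
true_w[:2] = [3.0, -2.0]
y = X @ true_w + 0.01 * rng.normal(size=200)

def train(w, mask, steps=500, lr=0.05):
    """Gradient descent on MSE; masked (pruned) weights stay at zero."""
    w = w * mask
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = (w - lr * grad) * mask  # pruned weights never update
    return w

w_init = rng.normal(size=10) * 0.1        # save the original init
w_dense = train(w_init, np.ones(10))      # 1) train the dense model
threshold = np.quantile(np.abs(w_dense), 0.8)
mask = (np.abs(w_dense) >= threshold).astype(float)  # 2) prune 80% smallest
w_sparse = train(w_init, mask)            # 3) rewind to init, retrain sparse
```

Here the pruning mask ends up keeping only the two informative features, and the sparse retrain from the original initialization recovers roughly the true weights; in a real network the same idea is applied per layer, often over several prune/retrain rounds.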



