Backpropagation through structure
Backpropagation Through Structure (BPTS) is a gradient-based technique for training recursive neural networks (a generalization of recurrent neural networks), described in detail in a 1996 paper by Christoph Goller and Andreas Küchler.[1] The network is unfolded over the structure of each input (for example, a tree), with the same weights applied at every node, and the error signal is propagated back along that structure to accumulate gradients for the shared weights.
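The following is a minimal sketch of the idea, not the authors' original implementation: a recursive network composes child representations at each node of a binary tree with a shared weight matrix, and gradients are accumulated by recursing back down the tree. The names (rnn_forward, rnn_backward, dim) and the squared-error loss are illustrative assumptions.

```python
import numpy as np

# Tiny recursive neural network trained with backpropagation through structure
# (BPTS): the same weights W, b are applied at every internal tree node, and
# gradients are accumulated by backpropagating along the tree's own structure.

dim = 4                                          # hidden/embedding size (illustrative)
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(dim, 2 * dim))   # shared composition weights
b = np.zeros(dim)

def rnn_forward(tree):
    """Return (hidden state, cache) for a leaf vector or a (left, right) pair."""
    if isinstance(tree, np.ndarray):             # leaf: its embedding is its state
        return tree, ("leaf",)
    h_l, cache_l = rnn_forward(tree[0])
    h_r, cache_r = rnn_forward(tree[1])
    h = np.tanh(W @ np.concatenate([h_l, h_r]) + b)
    return h, ("node", h, h_l, h_r, cache_l, cache_r)

def rnn_backward(cache, grad_h, grads):
    """Backpropagate grad_h through the structure, accumulating dW and db."""
    if cache[0] == "leaf":
        return
    _, h, h_l, h_r, cache_l, cache_r = cache
    grad_z = grad_h * (1.0 - h ** 2)             # derivative of tanh
    grads["W"] += np.outer(grad_z, np.concatenate([h_l, h_r]))
    grads["b"] += grad_z
    grad_children = W.T @ grad_z                 # split gradient between children
    rnn_backward(cache_l, grad_children[:dim], grads)
    rnn_backward(cache_r, grad_children[dim:], grads)

# One gradient step on a toy tree ((x1, x2), x3) with loss L = 0.5*||h_root - t||^2.
x1, x2, x3 = (rng.normal(size=dim) for _ in range(3))
target = np.ones(dim)
root, cache = rnn_forward(((x1, x2), x3))
grads = {"W": np.zeros_like(W), "b": np.zeros_like(b)}
rnn_backward(cache, root - target, grads)        # dL/dh_root = h_root - target
W -= 0.1 * grads["W"]
b -= 0.1 * grads["b"]
```

Because the weights are shared across all nodes, the same backward pass handles trees of any shape; applying it to a purely left- or right-branching "chain" tree recovers ordinary backpropagation through time for a recurrent network.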
References
- Goller, Christoph; Küchler, Andreas (1996). "Learning Task-Dependent Distributed Representations by Backpropagation Through Structure". Proceedings of International Conference on Neural Networks (ICNN'96). Vol. 1. pp. 347–352. CiteSeerX 10.1.1.49.1968. doi:10.1109/ICNN.1996.548916. ISBN 0-7803-3210-5. S2CID 6536466.