Recurrent Neural Networks and Their Applications to RNA Secondary Structure Inference

Devin Willmott
Abstract: Recurrent neural networks (RNNs) are state-of-the-art sequential machine learning tools, but they have difficulty learning sequences with long-range dependencies due to the exponential growth or decay of gradients backpropagated through the RNN. Some methods overcome this problem by modifying the standard RNN architecture to force the recurrent weight matrix W to remain orthogonal throughout training. The first half of this thesis presents a novel orthogonal RNN architecture that enforces orthogonality of W...
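The gradient behavior the abstract describes can be illustrated with a short numerical sketch (an assumption-laden toy, not code from the thesis): in a linear RNN, backpropagation repeatedly multiplies the gradient by the transpose of the recurrent matrix W, so a generic W stretches or shrinks the gradient exponentially in the sequence length, while an orthogonal W preserves its norm exactly. The matrix sizes, scale factor, and sequence length below are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 64, 100  # hidden size and sequence length (arbitrary)

# Generic recurrent matrix with spectral radius away from 1:
# backpropagated gradient norms grow (or decay) exponentially in T.
W_rand = rng.normal(scale=1.5 / np.sqrt(n), size=(n, n))

# Orthogonal recurrent matrix (QR factorization of a Gaussian matrix):
# multiplication by an orthogonal matrix preserves vector norms.
W_orth, _ = np.linalg.qr(rng.normal(size=(n, n)))

g = rng.normal(size=n)
g /= np.linalg.norm(g)  # unit-norm "gradient" vector

g_rand, g_orth = g.copy(), g.copy()
for _ in range(T):
    # In a linear RNN, each backprop step multiplies the gradient by W^T.
    g_rand = W_rand.T @ g_rand
    g_orth = W_orth.T @ g_orth

print(np.linalg.norm(g_rand))  # explodes for this scale of W
print(np.linalg.norm(g_orth))  # stays 1.0 up to floating-point error
```

Shrinking the scale factor below 1/sqrt(n) produces the opposite failure mode, vanishing gradients, which is why orthogonality (all singular values equal to 1) is the property these architectures enforce.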