
On the Convergence Rate of Training Recurrent Neural Networks

Zeyuan Allen-Zhu, Yuanzhi Li, Zhao Song

Despite the huge success of deep learning, our understanding of how non-convex neural networks are trained remains rather limited. Most existing theoretical works tackle only neural networks with one hidden layer, and little is known about multi-layer networks.

Recurrent neural networks (RNNs) are special multi-layer networks used extensively in natural language processing. They are particularly hard to analyze compared to feedforward networks, because the same weight parameters are reused across the entire time horizon.
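Concretely, the paper studies Elman-style RNNs with ReLU activations: writing $\sigma$ for the activation, the hidden state evolves as $h_t = \sigma(W h_{t-1} + A x_t)$ with output $y_t = B h_t$, where the same matrices $W$, $A$, $B$ are applied at every step $t = 1, \dots, L$. This reuse of $W$ across the time horizon is what makes the analysis harder than the feedforward case.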

We provide arguably the first theoretical understanding of the convergence speed of training RNNs (when activation functions are present). Specifically, when the weights are randomly initialized and the number of neurons is sufficiently large (polynomial in the training data size and the time horizon), we show that gradient descent and stochastic gradient descent both minimize the regression loss at a linear convergence rate, that is, $\varepsilon \propto e^{-\Omega(T)}$.
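To make "linear convergence rate" concrete, here is a minimal NumPy sketch, not the paper's construction: it trains the recurrent weights of a tiny ReLU Elman RNN by full-batch gradient descent from random initialization and prints the loss, which should decay by a roughly constant factor per fixed number of steps. All sizes, the step size, and the step count are illustrative assumptions; the hidden width here is far below the polynomial over-parameterization the paper actually requires.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative sizes (assumptions, not values from the paper).
    n, L, d, m = 10, 5, 4, 400    # samples, time horizon, input dim, hidden width
    X = rng.standard_normal((n, L, d)) / np.sqrt(d)   # input sequences
    y_star = rng.standard_normal(n)                   # scalar regression targets

    # Random Gaussian initialization with the usual 1/sqrt(m) scaling.
    W = rng.standard_normal((m, m)) / np.sqrt(m)      # recurrent weights (trained)
    A = rng.standard_normal((m, d)) / np.sqrt(m)      # input weights (frozen)
    B = rng.standard_normal(m) / np.sqrt(m)           # output weights (frozen)

    def forward(W):
        """ReLU RNN h_t = relu(W h_{t-1} + A x_t); returns loss, states, residuals."""
        H = np.zeros((n, L + 1, m))
        for t in range(L):
            H[:, t + 1] = np.maximum(H[:, t] @ W.T + X[:, t] @ A.T, 0.0)
        residual = H[:, L] @ B - y_star
        return 0.5 * np.sum(residual ** 2), H, residual

    eta = 0.01                    # illustrative step size
    loss0 = forward(W)[0]
    for step in range(501):
        loss, H, residual = forward(W)
        # Backpropagation through time: accumulate dL/dW over all time steps.
        grad = np.zeros_like(W)
        delta = residual[:, None] * B[None, :]    # dL/dh_L, shape (n, m)
        for t in range(L, 0, -1):
            delta = delta * (H[:, t] > 0)         # ReLU derivative
            grad += delta.T @ H[:, t - 1]         # sum of outer products over samples
            delta = delta @ W                     # propagate to h_{t-1}
        W = W - eta * grad
        if step % 100 == 0:
            print(f"step {step:4d}  loss/loss0 = {loss / loss0:.3e}")

Geometric decay of the printed ratio, i.e. $\varepsilon_t \le (1 - c)^t \varepsilon_0$ for some constant $c > 0$, is exactly what the abstract's $\varepsilon \propto e^{-\Omega(T)}$ expresses; none of the constants above come from the paper.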

Publisher URL: http://arxiv.org/abs/1810.12065

arXiv: 1810.12065v2
