
# Optimization of ReLU Neural Networks using Quotient Stochastic Gradient Descent

Qiwei Ye, Tie-Yan Liu, Qi Meng, Shuxin Zheng, Wei Chen

It is well known that neural networks with rectified linear units (ReLU) as activation functions are positively scale invariant, which results in severe redundancy in their weight space (i.e., many ReLU networks with different weights are actually equivalent). In this paper, we formally characterize this redundancy/equivalence using the language of \emph{quotient space} and discuss its negative impact on the optimization of ReLU neural networks. Specifically, we show that all equivalent ReLU networks correspond to the same vector in the quotient space, and each such vector can be characterized by the so-called skeleton paths in the ReLU networks. With this, we prove that the dimensionality of the quotient space is $\#$weights$-\#$hidden nodes, indicating that the redundancy of the weight space is huge. Based on these observations, we propose to optimize ReLU neural networks directly in the quotient space instead of the original weight space. We represent the loss function in the quotient space and design a new stochastic gradient descent algorithm to iteratively learn the model, which we call \emph{Quotient stochastic gradient descent} (abbreviated as Quotient SGD). We also develop efficient tricks so that the implementation of Quotient SGD requires almost no extra computation compared to standard SGD. Experiments on benchmark datasets show that our proposed Quotient SGD significantly improves the accuracy of the learned model.
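The positive scale invariance that motivates the paper can be verified in a few lines: because $\mathrm{relu}(cz) = c\,\mathrm{relu}(z)$ for any $c > 0$, multiplying the incoming weights of a hidden unit by $c$ and dividing its outgoing weights by $c$ leaves the network function unchanged. The NumPy sketch below is an illustration of this invariance only (it is not the authors' Quotient SGD algorithm); the network shape and variable names are chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, h = 4, 3                     # input dim, number of hidden units
x = rng.normal(size=d)
W1 = rng.normal(size=(h, d))    # input -> hidden weights
w2 = rng.normal(size=h)         # hidden -> scalar-output weights

def forward(W1, w2, x):
    """One-hidden-layer ReLU network (biases omitted for brevity)."""
    return w2 @ np.maximum(W1 @ x, 0.0)

# Rescale hidden unit 0 by c > 0: incoming weights * c, outgoing weight / c.
# Since relu is positively homogeneous, the output is unchanged, so the two
# weight settings are equivalent points in the quotient space.
c = 2.5
W1s, w2s = W1.copy(), w2.copy()
W1s[0] *= c
w2s[0] /= c

print(np.allclose(forward(W1, w2, x), forward(W1s, w2s, x)))  # True
```

Applying one such rescaling per hidden node removes one degree of freedom each, which matches the paper's dimension count of $\#$weights$-\#$hidden nodes for the quotient space.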

Publisher URL: http://arxiv.org/abs/1802.03713

arXiv ID: 1802.03713v4
