Alternative Title

Stochastic Orthogonalization and Its Application to Machine Learning

Abstract

Orthogonal transformations underpin many achievements in signal processing: they simplify computation and stabilize convergence during parameter training. Researchers have recently introduced orthogonality constraints into machine learning and obtained encouraging results. In this thesis, three new orthogonality-constraint algorithms based on a stochastic version of an SVD-based cost are proposed, suited to training the large-scale weight matrices of convolutional neural networks. In our experiments, these algorithms outperform other orthogonality-enforcing methods for convolutional neural networks.
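The thesis's specific stochastic SVD-based algorithms are not detailed in this record. As a generic illustration only, the sketch below shows the common soft-orthogonality approach: penalize the Frobenius norm of W^T W - I and follow its gradient so the columns of a weight matrix drift toward orthonormality. All names here (`soft_orthogonality_penalty`, the step size, the matrix dimensions) are illustrative assumptions, not the author's method.

```python
import numpy as np

def soft_orthogonality_penalty(W):
    """Frobenius-norm penalty ||W^T W - I||_F^2, a standard surrogate
    cost for driving the columns of W toward orthonormality.
    This is an illustrative stand-in, not the thesis's SVD-based cost."""
    n = W.shape[1]
    G = W.T @ W - np.eye(n)
    return float(np.sum(G ** 2))

def penalty_gradient(W):
    """Gradient of the penalty above: 4 W (W^T W - I)."""
    n = W.shape[1]
    return 4.0 * W @ (W.T @ W - np.eye(n))

# Toy example: pull a random 64x16 matrix toward orthonormal columns.
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((64, 16))

p_init = soft_orthogonality_penalty(W)
lr = 0.01  # illustrative step size
for _ in range(500):
    W -= lr * penalty_gradient(W)
p_final = soft_orthogonality_penalty(W)
```

After the loop, `p_final` is near zero and `W.T @ W` is close to the identity. A stochastic variant would apply such updates using minibatch estimates or random column subsets rather than the full matrix, which is what makes the approach practical at the scale of CNN weight tensors.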

Degree Date

Fall 12-21-2019

Document Type

Thesis

Degree Name

M.S.E.E.

Department

Electrical Engineering

Advisor

Scott C. Douglas

Subject Area

Computer Science

Number of Pages

46

Format

.pdf

Creative Commons License

Creative Commons Attribution-Noncommercial 4.0 License
This work is licensed under a Creative Commons Attribution-Noncommercial 4.0 License.
