Alternative Title
Stochastic Orthogonalization and Its Application to Machine Learning
Subject Area
Computer Science
Abstract
Orthogonal transformations have driven many important advances in signal processing: they simplify computation and stabilize convergence during parameter training. Recently, researchers have introduced orthogonality constraints into machine learning and obtained encouraging results. In this thesis, three new orthogonality-constraint algorithms are proposed, based on a stochastic version of an SVD-based cost, that are well suited to training the large-scale weight matrices of convolutional neural networks. We observe better performance in comparison with other orthogonalization algorithms for convolutional neural networks.
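For readers skimming this page, a rough illustration of the general idea behind a stochastic orthogonality penalty is sketched below. It is not one of the three algorithms proposed in the thesis; the function name stochastic_orthogonality_step, the probe-vector construction, the learning rate, and the matrix sizes are all assumptions made for this example. The sketch drives a weight matrix W toward orthonormal columns by taking gradient steps on ||W^T W z - z||^2 for randomly drawn vectors z, a stochastic surrogate for the orthogonality cost ||W^T W - I||_F^2, which in terms of the singular values of W equals the sum of (sigma_i^2 - 1)^2.

```python
import numpy as np

def stochastic_orthogonality_step(W, lr=1e-3, rng=None):
    """One illustrative stochastic-orthogonalization update (a sketch,
    not one of the algorithms proposed in the thesis).

    Rather than forming the full n x n Gram matrix W^T W, which is costly
    for the large weight matrices of a convolutional layer, a single random
    probe vector z is drawn and one gradient step is taken on the surrogate
    cost ||W^T W z - z||^2; its expectation over z equals ||W^T W - I||_F^2.
    """
    if rng is None:
        rng = np.random.default_rng()
    m, n = W.shape
    z = rng.standard_normal(n)                            # random probe vector
    u = W.T @ (W @ z) - z                                 # orthogonality residual
    grad = 2.0 * W @ (np.outer(z, u) + np.outer(u, z))    # gradient of ||u||^2 w.r.t. W
    return W - lr * grad

# Usage: pull a random 256 x 64 matrix (e.g., a flattened convolution kernel)
# toward having orthonormal columns.
rng = np.random.default_rng(0)
W = rng.standard_normal((256, 64)) / np.sqrt(256)
print(np.linalg.norm(W.T @ W - np.eye(64)))    # deviation from orthonormality before
for _ in range(3000):
    W = stochastic_orthogonality_step(W, lr=1e-3, rng=rng)
print(np.linalg.norm(W.T @ W - np.eye(64)))    # markedly smaller after the updates
```

Each step touches W only through matrix-vector products with the probe vector, which is what makes this style of update attractive for large matrices compared with recomputing an SVD or the full Gram matrix at every iteration.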
Degree Date
Fall 12-21-2019
Document Type
Thesis
Degree Name
M.S.E.E.
Department
Electrical and Computer Engineering
Advisor
Scott C. Douglas
Number of Pages
46
Format
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 License
Recommended Citation
Hong, Yu, "Stochastic Orthogonalization and Its Application to Machine Learning" (2019). Electrical Engineering Theses and Dissertations. 31.
https://scholar.smu.edu/engineering_electrical_etds/31