Convolutional Neural Network with Structured Filters

Series
Applied and Computational Mathematics Seminar
Time
Monday, April 16, 2018 - 1:55pm for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Xiuyuan Cheng – Duke University – xiuyuan.cheng@duke.edu – https://services.math.duke.edu/~xiuyuanc/
Organizer
Wenjing Liao
Filters in a Convolutional Neural Network (CNN) contain model parameters learned from enormous amounts of data. The properties of the convolutional filters in a trained network directly affect the quality of the data representation it produces. In this talk, we introduce a framework for decomposing convolutional filters as a truncated expansion over pre-fixed bases, where the expansion coefficients are learned from data. Such a structure not only reduces the number of trainable parameters and the computational load but also explicitly imposes filter regularity through the basis truncation. Apart from maintaining prediction accuracy across image classification datasets, the decomposed-filter CNN also produces a representation that is stable with respect to input variations, a property proved under generic assumptions on the basis expansion. Joint work with Qiang Qiu, Robert Calderbank, and Guillermo Sapiro.
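
To make the idea of a decomposed-filter layer concrete, here is a minimal sketch of a convolutional layer whose filters are linear combinations of a small, pre-fixed basis, with only the combination coefficients trained. This is an illustrative PyTorch-style assumption, not the speaker's exact construction: the `DecomposedConv2d` class, the cosine-like basis, and all parameter choices below are hypothetical stand-ins for whatever fixed basis and truncation level the framework actually uses.

```python
import math
import torch
import torch.nn as nn

class DecomposedConv2d(nn.Module):
    """Illustrative conv layer: each K x K filter is a linear combination of a
    small fixed basis, and only the expansion coefficients are trainable."""

    def __init__(self, in_ch, out_ch, kernel_size=5, num_basis=3):
        super().__init__()
        # Fixed (non-trainable) basis atoms; a hypothetical smooth cosine basis
        # stands in here for whatever pre-fixed, truncated basis is chosen.
        basis = self._make_basis(kernel_size, num_basis)  # (num_basis, K, K)
        self.register_buffer("basis", basis)
        # Trainable expansion coefficients: one per (out channel, in channel, atom).
        self.coeff = nn.Parameter(0.1 * torch.randn(out_ch, in_ch, num_basis))

    @staticmethod
    def _make_basis(k, m):
        # Separable cosine atoms of increasing frequency, L2-normalized.
        x = torch.linspace(-1.0, 1.0, k)
        atoms = [torch.cos(math.pi * f * x[:, None]) * torch.cos(math.pi * f * x[None, :])
                 for f in range(m)]
        basis = torch.stack(atoms)
        return basis / basis.flatten(1).norm(dim=1)[:, None, None]

    def forward(self, x):
        # Synthesize the full filter bank from coefficients and fixed basis,
        # then apply an ordinary 2D convolution.
        weight = torch.einsum("oim,mkl->oikl", self.coeff, self.basis)
        return nn.functional.conv2d(x, weight, padding=weight.shape[-1] // 2)

# Example usage: far fewer trainable parameters than a standard 5x5 conv layer.
layer = DecomposedConv2d(3, 16, kernel_size=5, num_basis=3)
out = layer(torch.randn(1, 3, 32, 32))  # -> shape (1, 16, 32, 32)
```

With `num_basis=3`, the layer trains 16 x 3 x 3 = 144 coefficients instead of the 16 x 3 x 25 = 1200 weights of an unconstrained 5x5 convolution, which illustrates the parameter reduction and the regularity imposed by truncating the expansion to a few smooth atoms.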