Linear discriminant analysis with worst between-class separation and average within-class compactness  

Authors: Leilei YANG, Songcan CHEN

Affiliation: College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China

Source: Frontiers of Computer Science, 2014, No. 5, pp. 785-792 (8 pages)

Funding: This work was supported in part by the National Natural Science Foundation of China (Grant No. 61170151) and the Natural Science Foundation of Jiangsu Province (BK2011728), and was sponsored by the QingLan Project and the Fundamental Research Funds for the Central Universities (NZ2013306).

Abstract: Linear discriminant analysis (LDA) is one of the most popular supervised dimensionality reduction (DR) techniques; it obtains discriminant projections by maximizing the ratio of average-case between-class scatter to average-case within-class scatter. Two recent discriminant analysis (DA) algorithms, minimal distance maximization (MDM) and worst-case LDA (WLDA), obtain projections by optimizing worst-case scatters. In this paper, we develop a new LDA framework called LDA with worst between-class separation and average within-class compactness (WSAC), which maximizes the ratio of worst-case between-class scatter to average-case within-class scatter. This can be achieved by relaxing the trace ratio optimization to a distance metric learning problem. Comparative experiments demonstrate its effectiveness. In addition, DA counterparts using the local geometry of data and the kernel trick can likewise be embedded into our framework and solved in the same way.
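The two objectives the abstract contrasts can be sketched numerically: classical LDA scores a projection by the average between-class scatter over the average within-class scatter, while WSAC replaces the numerator with the worst (smallest) pairwise between-class scatter. The following NumPy sketch is illustrative only and not the paper's implementation; the function names and the use of pairwise mean-difference scatters are our own assumptions.

```python
import numpy as np
from itertools import combinations

def scatter_matrices(X, y):
    """Average within-class scatter S_w and one between-class scatter
    (outer product of the class-mean difference) per class pair."""
    classes = np.unique(y)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    means = {}
    for c in classes:
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        means[c] = mu
        D = Xc - mu
        Sw += D.T @ D            # accumulate within-class deviations
    Sw /= len(X)                 # average-case within-class scatter
    Sb_pairs = []
    for a, b in combinations(classes, 2):
        diff = (means[a] - means[b])[:, None]
        Sb_pairs.append(diff @ diff.T)
    return Sw, Sb_pairs

def objectives(W, Sw, Sb_pairs):
    """Return (average-case, worst-case) between-class separation of the
    projection W, each normalized by the average within-class scatter."""
    within = np.trace(W.T @ Sw @ W)
    seps = [np.trace(W.T @ Sb @ W) for Sb in Sb_pairs]
    return sum(seps) / len(seps) / within, min(seps) / within

# Toy usage: three well-separated Gaussian classes in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.1, size=(20, 2))
               for m in ([0, 0], [3, 0], [0, 3])])
y = np.repeat([0, 1, 2], 20)
Sw, Sb_pairs = scatter_matrices(X, y)
avg_sep, worst_sep = objectives(np.eye(2), Sw, Sb_pairs)
```

By construction the worst-case score can only be at most the average-case score, which is exactly why a projection tuned for the average ratio (classical LDA) may still leave one class pair poorly separated, the situation WSAC's worst-case numerator is designed to guard against.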

Keywords: dimensionality reduction; linear discriminant analysis; worst separation; average compactness

Classification codes: TP391.41 [Automation and Computer Technology / Computer Application Technology]; O178 [Automation and Computer Technology / Computer Science and Technology]
