For single-view unsupervised feature selection, we propose two novel methods, RUFS and AUFS. RUFS accounts for outliers in both label learning and feature selection, making it more robust than state-of-the-art methods. AUFS is designed to satisfy three desirable properties: (1) it induces sparsity; (2) it penalizes large and small weights equally; (3) it strikes a good balance between small losses on normal data examples and large losses on outliers. For multi-view unsupervised feature selection, we propose to learn pseudo cluster labels directly from the raw features of the main view while requiring those labels to reach maximal consensus with the other views; in turn, the discriminative features identified during feature selection contribute more to the label-learning process. For multi-view topic discovery, we propose a regularized nonnegative-constrained $l_{21}$-norm minimization framework as a systematic solution that integrates information propagation and mutual enhancement among data of different types, without supervision, in a principled way.
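The $l_{21}$-norm referenced above is the workhorse of row-sparse feature selection: summing the $l_2$ norms of a weight matrix's rows drives entire rows to zero jointly, so whole features are discarded at once. A minimal sketch (not the proposed RUFS/AUFS algorithms; the matrix `W` and the function names here are illustrative assumptions):

```python
import numpy as np

def l21_norm(W):
    # l_{2,1} norm: sum over rows of each row's l2 norm.
    # Penalizing it induces row-wise sparsity in W.
    return np.sqrt((W ** 2).sum(axis=1)).sum()

def select_features(W, k):
    # Rank features by the l2 norm of their weight rows:
    # a larger row norm means the feature contributes more
    # across all pseudo cluster labels.
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:k]

# Toy example: 5 features mapped to 3 pseudo cluster labels.
W = np.array([[0.0, 0.0, 0.0],
              [2.0, 1.0, 0.5],
              [0.1, 0.0, 0.1],
              [1.5, 2.0, 1.0],
              [0.0, 0.2, 0.0]])
print(l21_norm(W))           # total row-sparsity penalty
print(select_features(W, 2)) # indices of the top-2 features
```

Features 3 and 1 win out here because their weight rows have the largest $l_2$ norms, mirroring how discriminative features come to dominate the label-learning process.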