Information Theoretic Learning

About The Book

This book is an outgrowth of ten years of research at the University of Florida Computational NeuroEngineering Laboratory (CNEL) in the general area of statistical signal processing and machine learning. One of the goals of writing the book is exactly to bridge the two fields that share so many common problems and techniques but are not yet effectively collaborating. Unlike other books that cover the state of the art in a given field, this book cuts across engineering (signal processing) and statistics (machine learning) with a common theme: learning seen from the point of view of information theory, with an emphasis on Rényi's definition of information. The basic approach is to utilize the information theory descriptors of entropy and divergence as nonparametric cost functions for the design of adaptive systems in unsupervised or supervised training modes. Hence the title: Information-Theoretic Learning (ITL).

In the course of these studies we discovered that the main idea enabling a synergistic view as well as algorithmic implementations does not involve the conventional central moments of the data (mean and covariance). Rather, the core concept is the α-norm of the PDF, in particular its expected value (α = 2), which we call the information potential. This operator and related nonparametric estimators link information theory, optimization of adaptive systems, and reproducing kernel Hilbert spaces in a simple and unconventional way.
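To make the central idea concrete, here is a minimal sketch of the quadratic information potential described above. It uses the standard Parzen-window (Gaussian kernel) estimator, under which the expected value of the PDF, V(X) = E[p(X)], reduces to a double sum of Gaussian kernel evaluations over sample pairs; Rényi's quadratic entropy is then H₂(X) = −log V(X). The function names and the kernel width `sigma` are illustrative choices, not definitions taken from the book's text.

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Parzen-based estimate of the quadratic information potential
    V(X) = E[p(X)] = integral of p(x)^2 dx.

    Convolving the Gaussian kernel with itself doubles its variance,
    giving V = (1/N^2) * sum_{i,j} G_{sigma*sqrt(2)}(x_i - x_j).
    """
    x = np.asarray(x, dtype=float).reshape(-1)
    s2 = 2.0 * sigma ** 2                     # variance of the convolved kernel
    diffs = x[:, None] - x[None, :]           # all pairwise differences
    gauss = np.exp(-diffs ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    return gauss.mean()                       # (1/N^2) * double sum

def renyi_quadratic_entropy(x, sigma=1.0):
    """Renyi's quadratic entropy H2(X) = -log V(X)."""
    return -np.log(information_potential(x, sigma))
```

Because the estimator involves only pairwise sample interactions, it can serve directly as a differentiable cost function for adaptive systems, which is the sense in which the book uses it: more concentrated data yield a larger information potential and hence a smaller quadratic entropy.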