Advances in Neural Networks


Supposing that $p_i(y_i)$ is the marginal probability density function (pdf) of the i-th component of $y = Wx = WAs$, and $p(y)$ is the joint pdf of $y$, we can use the Kullback–Leibler divergence to set up the following Minimum Mutual Information (MMI) criterion [1]: $I(y) = \int p(y) \log \frac{p(y)}{\prod_{i=1}^{n} p_i(y_i)} \, dy$. ... function and proved that the global maximum of the cost function corresponds to a feasible solution of the ICA problem [6].
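As a brief illustrative sketch (not from the book), the MMI criterion above is the KL divergence between the joint pdf and the product of its marginals, which vanishes exactly when the components of y are independent. For a discretized two-dimensional distribution the integral becomes a double sum, which can be evaluated directly:

```python
import numpy as np

def mutual_information(p_joint):
    """Mutual information I(y), in nats, of a discrete 2-D joint distribution.

    Discrete analogue of the MMI criterion:
    I(y) = sum_{ij} p(y1_i, y2_j) * log( p(y1_i, y2_j) / (p1(y1_i) * p2(y2_j)) )
    """
    p1 = p_joint.sum(axis=1, keepdims=True)   # marginal pdf of component y1
    p2 = p_joint.sum(axis=0, keepdims=True)   # marginal pdf of component y2
    mask = p_joint > 0                        # skip zero-probability cells
    return float(np.sum(p_joint[mask] * np.log(p_joint[mask] / (p1 * p2)[mask])))

# Independent components: the joint factorizes, so I(y) = 0,
# the global minimum of the criterion.
p_indep = np.outer([0.5, 0.5], [0.25, 0.75])
print(mutual_information(p_indep))  # ~0.0

# Dependent components: I(y) > 0.
p_dep = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(mutual_information(p_dep))
```

An ICA algorithm built on this criterion searches over the demixing matrix W to drive I(y) toward zero, i.e., toward mutually independent outputs.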

Title: Advances in Neural Networks
Author: Fuchun Sun, Jianwei Zhang, Jinde Cao, Wen Yu
Publisher: Springer - 2008-09-08
