Document Content
Information theory / Data analysis / Statistical theory / Statistical models / Independent component analysis / Principle of maximum entropy / Kullback–Leibler divergence / Mutual information / Normal distribution / Statistics / Probability and statistics / Multivariate statistics
Date: 2007-05-31 18:03:39

Source URL: mplab.ucsd.edu

File Size: 891.39 KB

Similar Documents

Statistical classification / Statistics / Probability and statistics / Mathematics / Support vector machine / Predictive modelling / K-nearest neighbors algorithm / Mathematical model / Data analysis / Spatial analysis / Regression analysis / Artificial neural network

Visualizing statistical models: Removing the blindfold. Hadley Wickham, Dianne Cook and Heike Hofmann, Department of Statistics, Houston, TX.

DocID: 1xVjw

Estimation theory / Econometrics / Statistical inference / Estimator / Probability distribution fitting / M-estimators / Maximum likelihood estimation / Fisher information / Gamma distribution / Maximum spacing estimation

Noise-contrastive estimation: A new estimation principle for unnormalized statistical models. Michael Gutmann, Dept. of Computer Science and HIIT, University of Helsinki.

DocID: 1xUlB

Code Completion with Statistical Language Models. Veselin Raychev, Martin Vechev, Eran Yahav.

DocID: 1xToM

An Introduction to the Statistical Analysis of Agent-Based Models. Giorgio Fagiolo, https://mail.sssup.it/~fagiolo

DocID: 1vhKi

Fall 2014, STA4513: Statistical Models of Networks. Lecture 3, 24 September 2014. Prof. Daniel M. Roy.

DocID: 1vfwF