Topics: Artificial intelligence / Lossless data compression / Feature selection / Lossy compression / Kullback–Leibler divergence / K-nearest neighbor algorithm / Statistics / Data compression / Machine learning
Date: 2010-07-07 01:06:51

Source URL: dml.utdallas.edu

File Size: 264.41 KB

Similar Documents

LASzip: lossless compression of LiDAR data. Martin Isenburg (LAStools), http://laszip.org. Abstract: "Airborne laser scanning technology (LiDAR) makes

DocID: 1vonl

Oracle Data Sheet: Oracle Advanced Compression. Key features and benefits: reduces database storage

DocID: 1uPQO

Lossy data compression: nonasymptotic fundamental limits. Victoria Kostina (dissertation).

DocID: 1usaZ

An Introduction to Neural Networks. Vincent Cheung and Kevin Cannons, Signal & Data Compression Laboratory, Electrical & Computer Engineering, University of Manitoba.

DocID: 1umIv

When Data Compression and Statistics Disagree: Two Frequentist Challenges for the Minimum Description Length Principle. Tim van Erven.

DocID: 1ujeN