Date: 2016-01-14 06:40:31
Information science
Information retrieval
Information retrieval evaluation
Internet search
Natural language processing
Query expansion
Web search query
Precision and recall
Discounted cumulative gain
Ranking
Query likelihood model
Web query classification

Microsoft Word - sigir14_sp_terms_final.docx


Source URL: people.cs.umass.edu


File Size: 335.59 KB


Similar Documents

The Domain-Specific Task of CLEF - Specific Evaluation Strategies in Cross-Language Information Retrieval. Michael Kluck and Fredric C. Gey

DocID: 1ufSZ

Information Retrieval System Evaluation, October 3. Module name: Information Retrieval System Evaluation. 2. Scope: The module introduces evaluation in information retrieval. It focuses on the standard measurem…

DocID: 1tTxf

Modern Information Retrieval, Chapter 4: Retrieval Evaluation. The Cranfield Paradigm; Retrieval Performance Evaluation

DocID: 1tazG

TRDDC @ FIRE 2013: System for Classification of Legal Propositions. Forum for Information Retrieval Evaluation, December 2013. Nitin Ramrakhiyani

DocID: 1rybW

Information science / Information retrieval / Information retrieval evaluation / Recommender system / Evaluation measures / Precision and recall / Relevance / Ranking / Discounted cumulative gain

The Fifth International Workshop on Evaluating Information Access (EVIA), June 18, 2013, Tokyo, Japan. Evaluating Contextual Suggestion. Adriel Dean-Hall, Waterloo

DocID: 1rtTX