Online Evaluation for Information Retrieval

Title: Online Evaluation for Information Retrieval PDF eBook
Author: Katja Hofmann
Publisher:
Pages: 134
Release: 2016-06-07
Genre: Computers
ISBN: 9781680831634

Provides a comprehensive overview of online evaluation for information retrieval. It shows how online evaluation is used for controlled experiments, segmenting these into experiment designs that allow absolute or relative quality assessments. It also includes an extensive discussion of recent work on data re-use and on experiment estimation from historical data.
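
The "absolute" style of assessment can be made concrete with a small example. Below is a minimal sketch (not code from the book) of an A/B test that splits traffic between a control ranker and a treatment ranker and compares their click-through rates with a two-proportion z-test; the function name and the counts are hypothetical.

from math import sqrt

def ab_ctr_z_test(clicks_a, impressions_a, clicks_b, impressions_b):
    """Compare the click-through rates of two rankers with a two-proportion z-test."""
    ctr_a = clicks_a / impressions_a
    ctr_b = clicks_b / impressions_b
    # Pooled click-through rate under the null hypothesis of no difference.
    pooled = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = sqrt(pooled * (1 - pooled) * (1 / impressions_a + 1 / impressions_b))
    z = (ctr_b - ctr_a) / se
    return ctr_b - ctr_a, z

# Hypothetical impression and click counts; |z| > 1.96 would indicate a
# difference that is significant at the 5% level.
diff, z = ab_ctr_z_test(clicks_a=480, impressions_a=10_000,
                        clicks_b=535, impressions_b=10_000)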

Introduction to Information Retrieval

Title: Introduction to Information Retrieval PDF eBook
Author: Christopher D. Manning
Publisher: Cambridge University Press
Pages:
Release: 2008-07-07
Genre: Computers
ISBN: 1139472100

Class-tested and coherent, this textbook teaches classical and web information retrieval, including web search and the related areas of text classification and text clustering from basic concepts. It gives an up-to-date treatment of all aspects of the design and implementation of systems for gathering, indexing, and searching documents; methods for evaluating systems; and an introduction to the use of machine learning methods on text collections. All the important ideas are explained using examples and figures, making it perfect for introductory courses in information retrieval for advanced undergraduates and graduate students in computer science. Based on feedback from extensive classroom experience, the book has been carefully structured in order to make teaching more natural and effective. Slides and additional exercises (with solutions for lecturers) are also available through the book's supporting website to help course instructors prepare their lectures.

Online Evaluation for Information Retrieval

Title: Online Evaluation for Information Retrieval PDF eBook
Author: Katja Hofmann
Publisher:
Pages: 117
Release: 2016
Genre: Information retrieval
ISBN: 9781680831627

Online evaluation is one of the most common approaches to measuring the effectiveness of an information retrieval system. It involves fielding the information retrieval system to real users and observing these users' interactions in situ while they engage with the system. This allows actual users with real-world information needs to play an important part in assessing retrieval quality. As such, online evaluation complements the common alternative of offline evaluation, which may provide more easily interpretable outcomes yet is often less realistic as a measure of quality and actual user experience. In this survey, we provide an overview of online evaluation techniques for information retrieval. We show how online evaluation is used for controlled experiments, segmenting these into experiment designs that allow absolute or relative quality assessments. Our presentation of different metrics further partitions online evaluation by the experimental units commonly of interest: documents, lists, and sessions. Additionally, we include an extensive discussion of recent work on data re-use and on experiment estimation from historical data. A substantial part of this work focuses on practical issues: how to run evaluations in practice, how to select experimental parameters, how to take into account the ethical considerations inherent in online evaluation, and its limitations. While most published work on online experimentation today concerns large-scale systems with millions of users, we emphasize that the same techniques can be applied at small scale. To this end, we highlight recent work that makes online evaluation easier to apply at smaller scales and encourage studying real-world information seeking in a wide range of scenarios. Finally, we summarize the most recent work in the area, describe open problems, and postulate future directions.
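
Relative quality assessments contrast with the A/B style sketched above: instead of assigning each user to a single ranker, the results of two rankers are merged into one list and user clicks are credited to whichever ranker contributed the clicked document. The sketch below is a minimal, illustrative team-draft interleaving routine written for this summary, not code from the survey; the document identifiers, function names, and the simple click-credit rule are assumptions.

import random

def team_draft_interleave(ranking_a, ranking_b):
    """Merge two rankings into one result list (team-draft interleaving).

    Returns the interleaved list and a map from document to the ranker
    ("A" or "B") whose team contributed it.
    """
    interleaved, team_of = [], {}
    seen = set()
    picks_a = picks_b = 0
    all_docs = set(ranking_a) | set(ranking_b)
    while len(seen) < len(all_docs):
        # The ranker with fewer picks so far chooses next; ties are broken
        # by a coin flip, as in the usual round-based formulation.
        a_turn = picks_a < picks_b or (picks_a == picks_b and random.random() < 0.5)
        ranking, team = (ranking_a, "A") if a_turn else (ranking_b, "B")
        doc = next((d for d in ranking if d not in seen), None)
        if doc is None:
            # This ranker has no unused documents left; the other picks instead.
            ranking, team = (ranking_b, "B") if a_turn else (ranking_a, "A")
            doc = next(d for d in ranking if d not in seen)
        interleaved.append(doc)
        team_of[doc] = team
        seen.add(doc)
        if team == "A":
            picks_a += 1
        else:
            picks_b += 1
    return interleaved, team_of

def interleaving_winner(team_of, clicked_docs):
    """Credit each click to the team that contributed the clicked document."""
    credits = {"A": 0, "B": 0}
    for doc in clicked_docs:
        if doc in team_of:
            credits[team_of[doc]] += 1
    if credits["A"] == credits["B"]:
        return "tie"
    return "A" if credits["A"] > credits["B"] else "B"

# Example impression with hypothetical document identifiers: d4 appears only
# in ranker B's list, so a click on it credits B.
mixed, team_of = team_draft_interleave(["d1", "d2", "d3"], ["d2", "d4", "d1"])
print(interleaving_winner(team_of, clicked_docs=["d4"]))  # -> "B"

Aggregating the per-impression winners over many queries yields a relative preference between the two rankers.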

Experiment and Evaluation in Information Retrieval Models

Title: Experiment and Evaluation in Information Retrieval Models PDF eBook
Author: K. Latha
Publisher: CRC Press
Pages: 282
Release: 2017-07-28
Genre: Computers
ISBN: 1315392615

Experiment and Evaluation in Information Retrieval Models explores different algorithms for applying evolutionary computation to the field of information retrieval (IR). As well as examining existing approaches to resolving some of the problems in this field, the book critically evaluates results obtained by researchers in order to give readers a clear view of the topic. In addition, it covers algorithmic solutions to problems in advanced IR concepts, including feature selection for document ranking, web page classification and recommendation, facet generation for document retrieval, duplication detection, and seeker satisfaction in question-answering community portals. Written with students and researchers in the field of information retrieval in mind, this book is also a useful tool for researchers in the natural and social sciences interested in the latest developments in this fast-moving subject area. Focusing on recent topics in information retrieval research, the book explores the following topics in detail:
- Searching in social media
- Using semantic annotations
- Ranking documents based on facets
- Evaluating IR systems offline and online
- The role of evolutionary computation in IR
- Document and term clustering
- Image retrieval
- Design of user profiles for IR
- Web page classification and recommendation
- Relevance feedback approaches for document and image retrieval

Methods for Evaluating Interactive Information Retrieval Systems with Users

Title: Methods for Evaluating Interactive Information Retrieval Systems with Users PDF eBook
Author: Diane Kelly
Publisher: Now Publishers Inc
Pages: 246
Release: 2009
Genre: Database management
ISBN: 1601982240

Provides an overview of, and instruction in, the evaluation of interactive information retrieval systems with users.

Information Retrieval

Title: Information Retrieval PDF eBook
Author: Stefan Büttcher
Publisher: MIT Press
Pages: 633
Release: 2016-02-12
Genre: Computers
ISBN: 0262528878

An introduction to information retrieval, the foundation for modern search engines, that emphasizes implementation and experimentation. This textbook offers an introduction to the core topics underlying modern search technologies, including algorithms, data structures, indexing, retrieval, and evaluation. The emphasis is on implementation and experimentation; each chapter includes exercises and suggestions for student projects. Wumpus—a multiuser open-source information retrieval system developed by one of the authors and available online—provides model implementations and a basis for student work. The modular structure of the book allows instructors to use it in a variety of graduate-level courses, including courses taught from a database systems perspective, traditional information retrieval courses with a focus on IR theory, and courses covering the basics of Web retrieval. In addition to its classroom use, Information Retrieval will be a valuable reference for professionals in computer science, computer engineering, and software engineering.

Evaluating Information Retrieval and Access Tasks

Title: Evaluating Information Retrieval and Access Tasks PDF eBook
Author: Tetsuya Sakai
Publisher: Springer Nature
Pages: 225
Release: 2020
Genre: Electronic books
ISBN: 9811555540

This open access book summarizes the first two decades of the NII Testbeds and Community for Information access Research (NTCIR). NTCIR is a series of evaluation forums run by a global team of researchers and hosted by the National Institute of Informatics (NII), Japan. The book is unique in that it discusses not just what was done at NTCIR, but also how it was done and the impact it has achieved. For example, in some chapters the reader sees the early seeds of what eventually grew to be the search engines that provide access to content on the World Wide Web, today's smartphones that can tailor what they show to the needs of their owners, and the smart speakers that enrich our lives at home and on the move. We also get glimpses into how new search engines can be built for mathematical formulae, or for the digital record of a lived human life. Key to the success of the NTCIR endeavor was early recognition that information access research is an empirical discipline and that evaluation therefore lay at the core of the enterprise. Evaluation is thus at the heart of each chapter in this book. The chapters show, for example, how the recognition that some documents are more important than others has shaped thinking about evaluation design. The thirty-three contributors to this volume speak for the many hundreds of researchers from dozens of countries around the world who together shaped NTCIR as organizers and participants. This book is suitable for researchers, practitioners, and students: anyone who wants to learn about past and present evaluation efforts in information retrieval, information access, and natural language processing, as well as those who want to participate in an evaluation task or even to design and organize one.