Evaluation of Cross-Language Information Retrieval Systems

Evaluation of Cross-Language Information Retrieval Systems is available to download in English in PDF, ePub, and Kindle versions. Read it online anytime, anywhere, directly from your device. This book is definitely worth reading; it is incredibly well written.

Evaluation of Cross-Language Information Retrieval Systems

Author : Martin Braschler, Julio Gonzalo, Michael Kluck
Publisher : Springer
Page : 606 pages
File Size : 46,5 Mb
Release : 2003-08-02
Category : Computers
ISBN : 9783540456919

Evaluation of Cross-Language Information Retrieval Systems by Martin Braschler, Julio Gonzalo, and Michael Kluck

The second evaluation campaign of the Cross-Language Evaluation Forum (CLEF) for European languages was held from January to September 2001. This campaign proved a great success, and showed an increase in participation of around 70% compared with CLEF 2000. It culminated in a two-day workshop in Darmstadt, Germany, 3–4 September, in conjunction with the 5th European Conference on Digital Libraries (ECDL 2001). On the first day of the workshop, the results of the CLEF 2001 evaluation campaign were reported and discussed in paper and poster sessions. The second day focused on the current needs of cross-language systems and how evaluation campaigns in the future can best be designed to stimulate progress. The workshop was attended by nearly 50 researchers and system developers from both academia and industry. It provided an important opportunity for researchers working in the same area to get together and exchange ideas and experiences. Copies of all the presentations are available on the CLEF web site at http://www.clef-campaign.org. This volume contains thoroughly revised and expanded versions of the papers presented at the workshop and provides an exhaustive record of the CLEF 2001 campaign. CLEF 2001 was conducted as an activity of the DELOS Network of Excellence for Digital Libraries, funded by the EC Information Society Technologies program to further research in digital library technologies. The activity was organized in collaboration with the US National Institute of Standards and Technology (NIST).

Evaluation of Cross-Language Information Retrieval Systems

Author : Martin Braschler, Julio Gonzalo
Publisher : Unknown
Page : 616 pages
File Size : 46,5 Mb
Release : 2014-01-15
Category : Electronic
ISBN : 3662211386

Evaluation of Cross-Language Information Retrieval Systems by Martin Braschler and Julio Gonzalo

Cross-Language Information Retrieval

Author : Jian-Yun Nie
Publisher : Springer Nature
Page : 125 pages
File Size : 50,7 Mb
Release : 2022-05-31
Category : Computers
ISBN : 9783031021381

Cross-Language Information Retrieval by Jian-Yun Nie

Search for information is no longer limited exclusively to the native language of the user, but is more and more often extended to other languages. This gives rise to the problem of cross-language information retrieval (CLIR), whose goal is to find relevant information written in a language different from that of the query. In addition to the problems of monolingual information retrieval (IR), translation is the key problem in CLIR: one has to translate either the query or the documents from one language into another. However, this translation problem is not identical to full-text machine translation (MT): the goal is not to produce a human-readable translation, but a translation suitable for finding relevant documents. Specific translation methods are thus required. The goal of this book is to provide a comprehensive description of the specific problems arising in CLIR, the solutions proposed in this area, as well as the remaining open problems. The book starts with a general description of the monolingual IR and CLIR problems. Different classes of approaches to translation are then presented: approaches using an MT system, dictionary-based translation, and approaches based on parallel and comparable corpora. In addition, the typical retrieval effectiveness of the different approaches is compared. It is shown that translation approaches specifically designed for CLIR can rival and outperform high-quality MT systems. Finally, the book offers a look into the future that draws a strong parallel between query expansion in monolingual IR and query translation in CLIR, suggesting that many approaches developed in monolingual IR can be adapted to CLIR. The book can be used as an introduction to CLIR. Advanced readers can also find more technical details and discussions about the remaining research challenges. It is suitable for new researchers who intend to carry out research on CLIR. Table of Contents: Preface / Introduction / Using Manually Constructed Translation Systems and Resources for CLIR / Translation Based on Parallel and Comparable Corpora / Other Methods to Improve CLIR / A Look into the Future: Toward a Unified View of Monolingual IR and CLIR? / References / Author Biography
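As a hedged aside to the description above, the simplest form of dictionary-based query translation can be sketched as follows: each query term is looked up in a bilingual dictionary, and its weight is spread over the candidate translations, producing a weighted target-language query for an ordinary monolingual retrieval engine. The toy dictionary, tokenizer, and uniform weighting below are illustrative assumptions, not the book's own data or method.

```python
# Minimal sketch of dictionary-based query translation for CLIR.
# The bilingual dictionary and weighting scheme are illustrative only.

from collections import defaultdict

# Toy English -> French dictionary; real systems use large bilingual lexicons.
BILINGUAL_DICT = {
    "drug": ["médicament", "drogue"],
    "traffic": ["trafic", "circulation"],
    "control": ["contrôle"],
}

def translate_query(query: str) -> dict[str, float]:
    """Translate a source-language query into a bag of weighted target terms.

    Each source term distributes a weight of 1.0 uniformly over its candidate
    translations (a simplistic stand-in for translation probabilities);
    untranslatable terms are kept as-is.
    """
    weights: dict[str, float] = defaultdict(float)
    for term in query.lower().split():
        candidates = BILINGUAL_DICT.get(term, [term])
        for target in candidates:
            weights[target] += 1.0 / len(candidates)
    return dict(weights)

if __name__ == "__main__":
    print(translate_query("drug traffic control"))
    # {'médicament': 0.5, 'drogue': 0.5, 'trafic': 0.5, 'circulation': 0.5, 'contrôle': 1.0}
```

How the weight is distributed over ambiguous translations is exactly the "translation suitable for finding relevant documents" problem the book discusses; uniform splitting is only a baseline.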

Cross-Language Information Retrieval

Author : Gregory Grefenstette
Publisher : Springer Science & Business Media
Page : 190 pages
File Size : 45,6 Mb
Release : 2012-12-06
Category : Computers
ISBN : 9781461556619

Cross-Language Information Retrieval by Gregory Grefenstette

Most of the papers in this volume were first presented at the Workshop on Cross-Linguistic Information Retrieval that was held on August 22, 1996 during the SIGIR'96 Conference. Alan Smeaton of Dublin University and Paraic Sheridan of the ETH, Zurich, were the two other members of the Scientific Committee for this workshop. SIGIR is the Association for Computing Machinery (ACM) Special Interest Group on Information Retrieval, which has held conferences yearly since 1977. Three additional papers have been added: Chapter 4, Distributed Cross-Lingual Information Retrieval, describes the EMIR retrieval system, one of the first general cross-language systems to be implemented and evaluated; Chapter 6, Mapping Vocabularies Using Latent Semantic Indexing, which originally appeared as a technical report in the Laboratory for Computational Linguistics at Carnegie Mellon University in 1991, is included here because it was one of the earliest, though hard-to-find, publications showing the application of Latent Semantic Indexing to the problem of cross-language retrieval; and Chapter 10, A Weighted Boolean Model for Cross-Language Text Retrieval, describes a recent approach to solving the translation term weighting problem, specific to cross-language information retrieval. Gregory Grefenstette. Contributors: Lisa Ballesteros and W. Bruce Croft, Center for Intelligent Information Retrieval, Computer Science Department, University of Massachusetts; David Hull and Gregory Grefenstette, Xerox Research Centre Europe, Grenoble Laboratory; Thomas K. Landauer, Department of Psychology and Institute of Cognitive Science, University of Colorado, Boulder; Mark W. Davis, Computing Research Lab, New Mexico State University; Michael L. Littman; Bonnie J.
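As a rough illustration of the cross-language Latent Semantic Indexing idea mentioned above, LSI can be trained on documents that concatenate a text with its translation, so that terms from both languages share one latent space. The three-document "parallel corpus", vocabulary, and dimensionality below are invented for the example and are not taken from the book.

```python
# Illustrative sketch of cross-language Latent Semantic Indexing (CL-LSI).
# The tiny "parallel corpus" is made up; real CL-LSI is trained on large
# document-aligned collections.

import numpy as np

# Each training document is an English text concatenated with its French
# translation, so English and French terms co-occur in the same columns.
docs = [
    "drug traffic médicament trafic",
    "border control contrôle frontière",
    "drug control médicament contrôle",
]

vocab = sorted({t for d in docs for t in d.split()})
index = {t: i for i, t in enumerate(vocab)}

# Term-document count matrix (rows = terms, columns = dual-language documents).
A = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for t in d.split():
        A[index[t], j] += 1.0

# Truncated SVD: keep k latent dimensions.
k = 2
U, S, Vt = np.linalg.svd(A, full_matrices=False)
Uk, Sk = U[:, :k], S[:k]

def embed(text: str) -> np.ndarray:
    """Fold a bag of words (in either language) into the latent space."""
    v = np.zeros(len(vocab))
    for t in text.split():
        if t in index:
            v[index[t]] += 1.0
    return v @ Uk / Sk  # standard LSI folding: q^T U_k S_k^{-1}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# An English query can now be compared with French-only documents.
query = embed("drug traffic")
for doc in ["médicament trafic", "contrôle frontière"]:
    print(doc, round(cosine(query, embed(doc)), 3))
```

Because queries and documents are folded into the same latent space regardless of language, an English query can be matched against French text without an explicit translation step.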

Advances in Cross-Language Information Retrieval

Author : Cross-Language Evaluation Forum Workshop, Carol Peters
Publisher : Springer Science & Business Media
Page : 832 pages
File Size : 48,8 Mb
Release : 2003-10-10
Category : Computers
ISBN : 9783540408307

Advances in Cross-Language Information Retrieval by the Cross-Language Evaluation Forum Workshop and Carol Peters

This book presents the thoroughly refereed post-proceedings of the workshop of the Cross-Language Evaluation Forum campaign, CLEF 2002, held in Rome, Italy, in September 2002. The 43 revised full papers, presented together with an introduction and run data in an appendix, were carefully reviewed and revised after presentation at the workshop. The papers are organized in topical sections on system evaluation experiments, cross-language and more, monolingual experiments, mainly domain-specific information retrieval, interactive issues, cross-language spoken document retrieval, and cross-language evaluation issues and initiatives.

Evaluation of Cross-Language Information Retrieval Systems

Author : Martin Braschler, Julio Gonzalo, Michael Kluck
Publisher : Springer
Page : 606 pages
File Size : 41,8 Mb
Release : 2002-08-07
Category : Computers
ISBN : 3540440429

Evaluation of Cross-Language Information Retrieval Systems by Martin Braschler, Julio Gonzalo, and Michael Kluck

The second evaluation campaign of the Cross-Language Evaluation Forum (CLEF) for European languages was held from January to September 2001. This campaign proved a great success, and showed an increase in participation of around 70% compared with CLEF 2000. It culminated in a two-day workshop in Darmstadt, Germany, 3–4 September, in conjunction with the 5th European Conference on Digital Libraries (ECDL 2001). On the first day of the workshop, the results of the CLEF 2001 evaluation campaign were reported and discussed in paper and poster sessions. The second day focused on the current needs of cross-language systems and how evaluation campaigns in the future can best be designed to stimulate progress. The workshop was attended by nearly 50 researchers and system developers from both academia and industry. It provided an important opportunity for researchers working in the same area to get together and exchange ideas and experiences. Copies of all the presentations are available on the CLEF web site at http://www.clef-campaign.org. This volume contains thoroughly revised and expanded versions of the papers presented at the workshop and provides an exhaustive record of the CLEF 2001 campaign. CLEF 2001 was conducted as an activity of the DELOS Network of Excellence for Digital Libraries, funded by the EC Information Society Technologies program to further research in digital library technologies. The activity was organized in collaboration with the US National Institute of Standards and Technology (NIST).

Evaluation of Cross-Language Information Retrieval Systems

Author : Martin Braschler, Julio Gonzalo, Michael Kluck
Publisher : Springer
Page : 0 pages
File Size : 41,8 Mb
Release : 2003-08-02
Category : Computers
ISBN : 3540456910

Evaluation of Cross-Language Information Retrieval Systems by Martin Braschler, Julio Gonzalo, and Michael Kluck

The second evaluation campaign of the Cross-Language Evaluation Forum (CLEF) for European languages was held from January to September 2001. This campaign proved a great success, and showed an increase in participation of around 70% compared with CLEF 2000. It culminated in a two-day workshop in Darmstadt, Germany, 3–4 September, in conjunction with the 5th European Conference on Digital Libraries (ECDL 2001). On the first day of the workshop, the results of the CLEF 2001 evaluation campaign were reported and discussed in paper and poster sessions. The second day focused on the current needs of cross-language systems and how evaluation campaigns in the future can best be designed to stimulate progress. The workshop was attended by nearly 50 researchers and system developers from both academia and industry. It provided an important opportunity for researchers working in the same area to get together and exchange ideas and experiences. Copies of all the presentations are available on the CLEF web site at http://www.clef-campaign.org. This volume contains thoroughly revised and expanded versions of the papers presented at the workshop and provides an exhaustive record of the CLEF 2001 campaign. CLEF 2001 was conducted as an activity of the DELOS Network of Excellence for Digital Libraries, funded by the EC Information Society Technologies program to further research in digital library technologies. The activity was organized in collaboration with the US National Institute of Standards and Technology (NIST).

Cross-Language Information Retrieval and Evaluation

Author : Carol Peters
Publisher : Springer
Page : 396 pages
File Size : 40,7 Mb
Release : 2003-06-29
Category : Computers
ISBN : 9783540446453

Cross-Language Information Retrieval and Evaluation by Carol Peters

The first evaluation campaign of the Cross-Language Evaluation Forum (CLEF) for European languages was held from January to September 2000. The campaign culminated in a two-day workshop in Lisbon, Portugal, 21–22 September, immediately following the fourth European Conference on Digital Libraries (ECDL 2000). The first day of the workshop was open to anyone interested in the area of Cross-Language Information Retrieval (CLIR) and addressed the topic of CLIR system evaluation. The goal was to identify the actual contribution of evaluation to system development and to determine what could be done in the future to stimulate progress. The second day was restricted to participants in the CLEF 2000 evaluation campaign and to their experiments. This volume constitutes the proceedings of the workshop and provides a record of the campaign. CLEF is currently an activity of the DELOS Network of Excellence for Digital Libraries, funded by the EC Information Society Technologies program to further research in digital library technologies. The activity is organized in collaboration with the US National Institute of Standards and Technology (NIST). The support of DELOS and NIST in the running of the evaluation campaign is gratefully acknowledged. I should also like to thank the other members of the Workshop Steering Committee for their assistance in the organization of this event.

Information Retrieval Evaluation

Author : Donna K. Harman
Publisher : Morgan & Claypool Publishers
Page : 122 pages
File Size : 42,8 Mb
Release : 2011
Category : Computers
ISBN : 9781598299717

Information Retrieval Evaluation by Donna K. Harman

Evaluation has always played a major role in information retrieval, with the early pioneers such as Cyril Cleverdon and Gerard Salton laying the foundations for most of the evaluation methodologies in use today. The retrieval community has been extremely fortunate to have such a well-grounded evaluation paradigm during a period when most of the human language technologies were just developing. This lecture has the goal of explaining where these evaluation methodologies came from and how they have continued to adapt to the vastly changed environment in the search engine world today. The lecture starts with a discussion of the early evaluation of information retrieval systems, starting with the Cranfield testing in the early 1960s, continuing with the Lancaster "user" study for MEDLARS, and presenting the various test collection investigations by the SMART project and by groups in Britain. The emphasis in this chapter is on the how and the why of the various methodologies developed. The second chapter covers the more recent "batch" evaluations, examining the methodologies used in the various open evaluation campaigns such as TREC, NTCIR (emphasis on Asian languages), CLEF (emphasis on European languages), INEX (emphasis on semi-structured data), etc. Here again the focus is on the how and why, and in particular on the evolving of the older evaluation methodologies to handle new information access techniques. This includes how the test collection techniques were modified and how the metrics were changed to better reflect operational environments. The final chapters look at evaluation issues in user studies -- the interactive part of information retrieval, including a look at the search log studies mainly done by the commercial search engines. Here the goal is to show, via case studies, how the high-level issues of experimental design affect the final evaluations. Table of Contents: Introduction and Early History / "Batch" Evaluation Since 1992 / Interactive Evaluation / Conclusion
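As a small worked sketch of the batch-evaluation metrics discussed above, the code below computes average precision per topic and mean average precision (MAP) over topics. The ranked run and relevance judgments are invented for the example; this follows the standard textbook definition of the metric and is not code from the lecture.

```python
# Illustrative computation of (mean) average precision, the workhorse metric
# of TREC/CLEF-style batch evaluation. The run and qrels are invented.

def average_precision(ranked_docs, relevant):
    """AP = mean of precision@k at each rank k where a relevant doc appears,
    divided by the total number of relevant documents for the topic."""
    hits, precisions = 0, []
    for k, doc in enumerate(ranked_docs, start=1):
        if doc in relevant:
            hits += 1
            precisions.append(hits / k)
    return sum(precisions) / len(relevant) if relevant else 0.0

def mean_average_precision(runs, qrels):
    """MAP over a set of topics; runs and qrels are keyed by topic id."""
    aps = [average_precision(runs[t], qrels[t]) for t in qrels]
    return sum(aps) / len(aps)

if __name__ == "__main__":
    runs = {"T1": ["d3", "d7", "d1", "d9"], "T2": ["d2", "d5", "d8"]}
    qrels = {"T1": {"d3", "d9"}, "T2": {"d5", "d6"}}
    print(round(mean_average_precision(runs, qrels), 3))  # 0.5
```

Numbers of this kind are the headline results reported in TREC- and CLEF-style campaigns, which is why changes to the metrics directly shape how systems are compared.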

Evaluation of Multilingual and Multi-modal Information Retrieval

Author : Cross-Language Evaluation Forum Workshop, Carol Peters
Publisher : Springer Science & Business Media
Page : 1018 pages
File Size : 54,5 Mb
Release : 2007-09-06
Category : Computers
ISBN : 9783540749981

Evaluation of Multilingual and Multi-modal Information Retrieval by the Cross-Language Evaluation Forum Workshop and Carol Peters

This book constitutes the thoroughly refereed post-proceedings of the 7th Workshop of the Cross-Language Evaluation Forum, CLEF 2006, held in Alicante, Spain, in September 2006. The revised papers, presented together with an introduction, were carefully reviewed and selected for inclusion in the book. The papers are organized in topical sections on Multilingual Textual Document Retrieval, Domain-Specific Information Retrieval, i-CLEF, QA@CLEF, ImageCLEF, CLSR, WebCLEF, and GeoCLEF.

Accessing Multilingual Information Repositories

Author : Fredric Gey, Julio Gonzalo, Henning Mueller, Gareth Jones, Michael Kluck, Bernardo Magnini, Maarten de Rijke
Publisher : Springer
Page : 1013 pages
File Size : 43,5 Mb
Release : 2006-10-15
Category : Computers
ISBN : 9783540457008

Accessing Multilingual Information Repositories by Fredric Gey, Julio Gonzalo, Henning Mueller, Gareth Jones, Michael Kluck, Bernardo Magnini, and Maarten de Rijke

This book constitutes the thoroughly refereed postproceedings of the 6th Workshop of the Cross-Language Evaluation Forum, CLEF 2005. The book presents 111 revised papers together with an introduction. Topical sections include multilingual textual document retrieval, cross-language and more, monolingual experiments, domain-specific information retrieval, interactive cross-language information retrieval, multiple language question answering, cross-language retrieval in image collections, cross-language speech retrieval, multilingual Web track, cross-language geographical retrieval, and evaluation issues.

Advances in Multilingual and Multimodal Information Retrieval

Author : Valentin Jijkoun, Thomas Mandl, Henning Müller, Douglas W. Oard, Vivien Petras, Diana Santos
Publisher : Springer
Page : 922 pages
File Size : 40,5 Mb
Release : 2008-09-26
Category : Computers
ISBN : 9783540857600

Advances in Multilingual and Multimodal Information Retrieval by Valentin Jijkoun, Thomas Mandl, Henning Müller, Douglas W. Oard, Vivien Petras, and Diana Santos

The eighth campaign of the Cross-Language Evaluation Forum (CLEF) for European languages was held from January to September 2007. There were seven distinct evaluation tracks in CLEF 2007, designed to test the performance of a wide range of multilingual information access systems or system components. CLEF is by now an established international evaluation initiative and, in 2007, 81 groups from all over the world submitted results for one or more of the different evaluation tracks. Full details regarding the design of the tracks, the methodologies used for evaluation, and the results obtained by the participants can be found in the different sections of these proceedings. As always, the results of the campaign were reported and discussed at the annual workshop, held in Budapest, Hungary, 19–21 September, immediately following the eleventh European Conference on Digital Libraries. The workshop plays an important role by providing the opportunity for all the groups that have participated in the evaluation campaign to get together to compare approaches and exchange ideas.

Multilingual Information Access for Text, Speech and Images

Author : Paul Clough, Julio Gonzalo, Michael Kluck, Bernardo Magnini
Publisher : Springer
Page : 860 pages
File Size : 54,8 Mb
Release : 2005-08-25
Category : Computers
ISBN : 9783540320517

Multilingual Information Access for Text, Speech and Images by Paul Clough, Julio Gonzalo, Michael Kluck, and Bernardo Magnini

The fifth campaign of the Cross-Language Evaluation Forum (CLEF) for European languages was held from January to September 2004. Participation in the CLEF campaigns has increased each year and CLEF 2004 was no exception: 55 groups submitted results for one or more of the different tracks, compared with 42 groups in the previous year. CLEF 2004 also marked a breaking point with respect to previous campaigns. The focus was no longer concentrated mainly on multilingual document retrieval as in previous years, but was diversified to include different kinds of text retrieval across languages (e.g., exact answers in the question-answering track) and retrieval on different kinds of media (i.e., not just plain text but collections containing images and speech as well). In addition, increasing attention was given to issues regarding system usability and user satisfaction, with tasks to measure the effectiveness of interactive systems or system components being included in both the cross-language question answering and image retrieval tasks, with the collaboration of the coordinators of the interactive track. The campaign culminated in a two-and-a-half-day workshop held in Bath, UK, 15–17 September, immediately following the 8th European Conference on Digital Libraries. The workshop was attended by nearly 100 researchers and system developers.

Comparative Evaluation of Multilingual Information Access Systems

Author : Julio Gonzalo, Martin Braschler, Michael Kluck
Publisher : Springer
Page : 713 pages
File Size : 52,5 Mb
Release : 2004-11-16
Category : Language Arts & Disciplines
ISBN : 9783540302223

Comparative Evaluation of Multilingual Information Access Systems by Julio Gonzalo, Martin Braschler, and Michael Kluck

The fourth campaign of the Cross-language Evaluation Forum (CLEF) for European languages was held from January to August 2003. Participation in this campaign showed a slight rise in the number of participants from the previous year, with 42 groups submitting results for one or more of the different tracks (compared with 37 in 2002), but a steep rise in the number of experiments attempted. A distinctive feature of CLEF 2003 was the number of new tracks and tasks that were offered as pilot experiments. The aim was to try out new ideas and to encourage the development of new evaluation methodologies, suited to the emerging requirements of both system developers and users with respect to today’s digital collections and to encourage work on many European languages rather than just those most widely used. CLEF is thus gradually pushing its participants towards the ultimate goal: the development of truly multilingual systems capable of processing collections in diverse media. The campaign culminated in a two-day workshop held in Trondheim, Norway, 21–22 August, immediately following the 7th European Conference on Digital Libraries (ECDL 2003), and attended by more than 70 researchers and system developers. The objective of the workshop was to bring together the groups that had participated in the CLEF 2003 campaign so that they could report on the results of their experiments.

Evaluating Systems for Multilingual and Multimodal Information Access

Author : Thomas Deselaers, Nicola Ferro, Julio Gonzalo, Mikko Kurimo, Thomas Mandl, Vivien Petras
Publisher : Springer
Page : 1002 pages
File Size : 47,8 Mb
Release : 2009-09-29
Category : Computers
ISBN : 9783642044472

Evaluating Systems for Multilingual and Multimodal Information Access by Thomas Deselaers, Nicola Ferro, Julio Gonzalo, Mikko Kurimo, Thomas Mandl, and Vivien Petras

The ninth campaign of the Cross-Language Evaluation Forum (CLEF) for European languages was held from January to September 2008. There were seven main evaluation tracks in CLEF 2008 plus two pilot tasks. The aim, as usual, was to test the performance of a wide range of multilingual information access (MLIA) systems or system components. This year, 100 groups, mainly but not only from academia, participated in the campaign. Most of the groups were from Europe, but there was also a good contingent from North America and Asia plus a few participants from South America and Africa. Full details regarding the design of the tracks, the methodologies used for evaluation, and the results obtained by the participants can be found in the different sections of these proceedings. The results of the CLEF 2008 campaign were presented at a two-and-a-half-day workshop held in Aarhus, Denmark, September 17–19, and attended by 150 researchers and system developers. The annual workshop, held in conjunction with the European Conference on Digital Libraries, plays an important role by providing the opportunity for all the groups that have participated in the evaluation campaign to get together to compare approaches and exchange ideas. The schedule of the workshop was divided between plenary track overviews and parallel, poster, and breakout sessions presenting this year's experiments and discussing ideas for the future. There were several invited talks.