Wikipedia:Articles for deletion/Cross Language Evaluation Forum


 * The following discussion is an archived debate of the proposed deletion of the article below. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as the article's talk page or in a deletion review).  No further edits should be made to this page.  

The result was Delete. No evidence has been provided to counter the lack of notability claims made by those advocating deletion. If it is indeed notable the article can be recreated with appropriate references. - Yomangani talk 11:05, 14 October 2006 (UTC)

Cross Language Evaluation Forum


Non-notable event. Contested prod. MER-C 02:44, 4 October 2006 (UTC)
 * Comment Looking over the google hits, I'm not so certain about the notability of this event. Without a guideline at hand, I'm tending to lean in the keep direction. Mitaphane talk 12:42, 6 October 2006 (UTC)
 * Obviously the Cross-Language Evaluation Forum is not as notable as the FIFA World Cup, but within the information retrieval community, especially within the European Union, this is a really notable event. I wonder if Mitaphane is an expert in Information Retrieval or Natural Language Processing. I vote for non-deletion.
 * Comment: I'm not. That's why I've called this nomination into question. We need someone with familiarity with the subject to make that call. —Mitaphane talk 05:21, 7 October 2006 (UTC)


 *  AFD relisted to generate a more thorough discussion so that consensus may be reached.  Please add new discussions below this notice. Thanks, Keitei (talk) 09:18, 10 October 2006 (UTC)


 * delete, non-notable, and we're not a directory of press-releases. yandman  09:24, 10 October 2006 (UTC)


 * Keep: CLEF has been a major driving force behind much cross-language research, in Europe and beyond. Increasingly, it is also the platform of choice for investigating innovative retrieval scenarios proposed by the research community and/or industry. It has given rise to hundreds of scientific publications, and is an excellent venue for new research groups and young researchers to get into the field. Finally, more and more IR and NLP research in Europe needs to be supplemented with explicit task-based evaluation efforts: CLEF provides an evaluation platform for a broad variety of information access tasks, and because of that it (and the many retrieval test sets that it has produced since the start of the century) has become an essential building block in the European research landscape. (As a side issue: CLEF is a collaborative effort, very much organized in a bottom-up fashion, by the research community.  The knowledge and resource sharing that goes on in and around CLEF is in the very same spirit as what drives people to contribute to Wikipedia.) Mderijke 13:33, 10 October 2006 (UTC) — Possible single purpose account: Mderijke (talk • contribs)  has made few or no other contributions outside this topic.
 * Weak Delete - While this group may be notable within a select community, the article doesn't really help explain that. If we could get some verifiable, independent sources explaining how this group has done some notable work, then I would say keep.  --cholmes75 (chit chat) 14:39, 10 October 2006 (UTC)
 * Keep - CLEF is a very important evaluation campaign for Information Retrieval systems. It has indeed done some notable work by providing a common setting for researchers to perform experiments and thus evaluate their systems. Every year, dozens of participating groups from several countries take part in the event. Vivianeorengo 15:00, 10 October 2006 (UTC) — Possible single purpose account: Vivianeorengo (talk • contribs) has made few or no other contributions outside this topic.
 * Keep! CLEF has a seven year history and attracts a large and growing number of research groups from around the world. CLEF is the most thoroughly multilingual evaluation workshop in empirical IR/NLP/Computational Linguistics - newspaper IR test sets exist in 12 languages, web IR test sets exist in 25 languages. CLEF is certainly every bit as significant in Europe as TREC is in the U.S. CLEF pioneered evaluation of technologies such as cross-language audio retrieval and multilingual image search that have not been addressed in any other evaluation campaign.--128.244.247.203 16:06, 10 October 2006 (UTC) — Possible single purpose account: 128.244.247.203 (talk • contribs) has made few or no other contributions outside this topic.
 * Keep! CLEF is very important for multilingual information retrieval, especially for tasks that do not involve English directly. The page should be updated, though — Possible single purpose account: 82.37.18.136 (talk • contribs) has made few or no other contributions outside this topic.
 * Keep CLEF is a very important event within the Information Retrieval community and is the main forum for evaluating and discussing multilingual research in Europe. If Wikipedia aims to provide comprehensive coverage of its content then a reference to CLEF should be maintained. — Possible single purpose account: 86.129.26.92 (talk • contribs) has made few or no other contributions outside this topic.
 * Keep; makes cited claims of notability. Vectro 04:01, 11 October 2006 (UTC) (not a sockpuppet account; see my change history.)
 * Comment The article does not have citations; all of its links are in the "external links" section, and external links are not supposed to serve as references in accordance with WP:CITE. I'd like to see at least one article that's specifically about CLEF; none of the external links describe what CLEF is except the official site.  ColourBurst 19:31, 11 October 2006 (UTC)


 * Delete - Doesn't seem to meet WP:V as the links don't meet WP:RS. Wickethewok 21:06, 11 October 2006 (UTC)
 * Keep; CLEF is indeed notable within the IR community in Europe (disclaimer: I'm involved with it). --Explendido Rocha 18:05, 12 October 2006 (UTC)
 * The above discussion is preserved as an archive of the debate. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as the article's talk page or in a deletion review). No further edits should be made to this page.