User:Credibility bot

Credibility bot monitors and collects data on source usage within Wikipedia articles. It generates automated, well-designed reports and alerts. Currently it supports a single project, Vaccine Safety.

This bot is under development. We are currently gauging support for additional development work. Our vision is for any WikiProject to be able to customize its own alerts and reports.

The original development of this bot was funded by Hacks/Hackers (Read the Blog).

It is one component of the emerging Credibility Framework for tracking citations and misinformation.

Credibility Bot is currently under the approval process at Bots/Requests for approval/Credibility bot.

Vision
While we currently support one project, we would like to make this functionality available for any project. We intend to build it as a wiki- and language-agnostic system. The envisioned workflow:
 * You fill out a form to start a new project
 * You define the parameters of that project, based on whatever criteria you want, without needing to create categories or tag articles
 * A list of relevant pages is created and automatically updated over time
 * Reports about the sources used on those articles are generated
 * Known outstanding article issues are listed within the project, with updates
 * A concern or task raised in one corner of Wikipedia is relayed to relevant, interested people and doesn’t just sit in its one corner
 * The bots are built on top of open APIs that can be used by other bots as well
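One of the steps above is generating reports about the sources used across a project's articles. As a rough illustration of what such a report could aggregate, here is a minimal sketch that counts cited domains across a set of articles. All names and data are invented for illustration; the real bot's internals are not documented here.

```python
from collections import Counter
from urllib.parse import urlparse

def domain_report(citations):
    """Count how often each domain is cited across a set of articles.

    `citations` is a list of (article_title, cited_url) pairs, such as a
    bot might extract from the project's article list.
    """
    counts = Counter()
    for _article, url in citations:
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]  # treat www.who.int and who.int as one domain
        counts[domain] += 1
    return counts.most_common()

# Hypothetical example input
citations = [
    ("Vaccine", "https://www.who.int/news/item/example"),
    ("Vaccine", "https://twitter.com/example/status/1"),
    ("MMR vaccine", "https://who.int/publications/example"),
]
print(domain_report(citations))  # → [('who.int', 2), ('twitter.com', 1)]
```

A real report would of course pull citations from live wikitext or the MediaWiki API rather than a hard-coded list, and would cross-reference the domain counts against a project's source ratings.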


Concretely, keeping the system wiki- and language-agnostic means:
 * Not building English Wikipedia-specific assumptions into the scripts
 * Using translatable message strings in scripts instead of hard-coded English copy
 * Building templates to accept localized template parameters in addition to English
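The "translatable message strings" point above can be sketched as follows: alert text lives in per-language message tables rather than being hard-coded in English, with English as the fallback. The message keys and translations here are invented examples, not the bot's actual strings.

```python
# Per-language message tables; keys and strings are illustrative only.
MESSAGES = {
    "en": {"alert-new-source": "New source detected on {page}: {domain}"},
    "es": {"alert-new-source": "Nueva fuente detectada en {page}: {domain}"},
}

def render_message(key, lang, **params):
    """Look up a message by language, falling back to English."""
    table = MESSAGES.get(lang, MESSAGES["en"])
    template = table.get(key, MESSAGES["en"][key])
    return template.format(**params)

print(render_message("alert-new-source", "es", page="Vacuna", domain="example.com"))
# → Nueva fuente detectada en Vacuna: example.com
```

On a production wiki this lookup would more likely use MediaWiki's own localization infrastructure, but the principle is the same: no user-facing English copy embedded in the bot's logic.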

Design (update)
At Wikiconference North America in Toronto, we held our first user design session. We asked a group of 25 editors 8 questions. Here's a summary of their feedback:


 * Citation monitoring is currently done with blunt white/black lists, or checking someone's private notepad or 'text highlighter' of good and bad sources.
 * CREDBOT would show where flagged sources are located and offer the ability to improve them, helping editors across a subject area learn about new sources, both good and bad. Having source monitoring would encourage rating more sources, with more subtlety.
 * CREDBOT is trying to save time and raise reliability, put more automatic eyeballs on peer review processes, help editors understand the distribution of cited domains, aggregate stats across a category, and detect poor sources more quickly.
 * At a macro level, we could create master monitoring capacity by distinguishing universally good or bad sources from those that are only contextually reliable.
 * If this feature were available to individual editors, it could be combined with the Watchlist, customized and personalized.
 * Expanded to Wikidata, we could track citations to URLs in statements, which is sorely lacking. Further, common reliability measures should themselves be Wikidata properties.
 * From a defensive perspective, citation alerts could also enable bad actors or trolls to see when their preferred or most hated work is cited. That should be overwhelmed by good actors, as usually happens on Wikipedia, but it's a concern to note.

Request access
Sign up below if you are interested in having this bot run for your WikiProject. You will be informed when capacity becomes available. To expedite the process, define a set of articles (based on categories, Wikidata queries, etc.) that the report should be based on.


 * 1) WikiProject Women's Health, based on the list of articles under its scope here. — Netha Hussain (talk • contribs)
 * 2) WikiProject Climate change would hugely benefit from organization and this type of support —  MaryMO (AR) (talk)
 * 3) WikiProject Skepticism might be interested and might have valuable information about credible sources — MaryMO (AR) (talk)
 * 4) MDWiki.org would love to have this running. Would help us trim predatory publishers and other poor quality sources. Doc James  (talk · contribs · email) 17:03, 19 July 2023 (UTC)
 * 5) WikiProject Writing would greatly benefit from this tool. It has been difficult to create accurate article lists that are aligned with our priorities through the use of categories and Wikidata queries (See documentation here under "Very few composition/rhetoric scholars exist on Wikidata"). I hope this provides some context into what our community needs. Breadyornot (talk) 16:49, 31 July 2023 (UTC)
 * 6) WikiProject Protista (if not all of WikiProject Tree of Life) is absolutely interested in this tool, as there are many, MANY articles under our scope that are very poorly or questionably referenced. It would be great not to review them manually. The scope is here. Snoteleks (talk) 20:49, 5 August 2023 (UTC)
 * 7) WikiProject Trains; I'd like to use this to review and improve sources used for North American locomotive articles. There are about one thousand articles within Category:Standard gauge locomotives of North America and there should be significant overlap in the sourcing. If you wanted to crack all Trains articles that's close to 140,000 covering a wide variety of topics. Mackensen (talk) 19:03, 5 August 2023 (UTC)
 * 8) WikiProject Anarchism is open to trying this. I've added my own concerns below.  czar  10:46, 7 August 2023 (UTC)
 * 9) WikiProject Indian politics: To review and improve sources used by articles that come in the ambit of the project. -MPGuy2824 (talk) 05:52, 12 August 2023 (UTC)
 * 10) WikiProject Ukraine, where there is a lot of disinformation around the edges. -Mzajac
 * 11) WikiProject Weather; might be useful to sort out what sources are or are not credible. Project has a large amount of dead links being discovered as well as several self-published sources. The Weather Event Writer (Talk Page) 06:21, 3 September 2023 (UTC)
 * 12) Tambayan Philippines; would be useful to track articles that may be citing social media sources. We're also thinking of using it specifically on Top-importance and High-importance articles. Ganmatthew (talk • contribs) 18:39, 4 September 2023 (UTC)

Show support
Support the future of the project by commenting on how it would be useful to you, your work, subjects you care about, on-wiki processes, reliability, fact-checking, disinformation, task management, or any other positive aspect you anticipate. If you endorse, please feel free to give a reason and note any relevant affiliations.


 * 1) I would love to have this to use at the WikiProject level but it would also be great to be able to aggregate information within a particular topic area that is more bounded. I would love to be able to easily find all the articles that relate to a particular area of scientific disinformation or a climate-change-related topic, like renewable energy sources, and then see what needs work or is under-sourced or out of date.  I spent a lot of my time looking at articles and going THIS IS OUT OF DATE. FEATURE REQUEST: I would love a tool to assess the recency of the sources cited (in terms of when they were written, not when they were added) to see how up-to-date it is. MaryMO (AR) (talk) 20:56, 18 July 2023 (UTC)
 * 2) Sources are critical for the reliability of this project. Having tools rapidly flag poorly sourced material will help us maintain accuracy and reliability within our movement. Doc James  (talk · contribs · email) 17:05, 19 July 2023 (UTC)
 * 3) With literally millions of articles to maintain on Wikipedia and a flood of pseudoscientific misinformation on the internet constantly trying to make its way into Wikipedia articles, it's vital that we develop processes and tools like this bot to maximize the usefulness of volunteers dealing with this flood.  Gamaliel  ( talk ) 17:14, 19 July 2023 (UTC)
 * 4) I'd love to be able to create a credbot subpage for any wikiproject that's hosting an editathon, as a way of visualizing changes to citation distribution across the covered articles (and fast feedback for people who clean up citations across the board). Sj
 * 5) I support this and I'd like it to be available for non English projects as well. Naval Scene (talk) 00:37, 21 July 2023 (UTC)
 * 6) The credibility bot is a fantastic tool, and I'm excited to see how it can be used across WikiProjects and topic areas—particularly areas prone to misinformation and poor sourcing. On top of helping editors monitor and improve sourcing, I'd be interested in seeing how the aggregate information about sources can be used for analysis, research, and other tooling. ~ Super  Hamster  Talk Contribs 16:22, 21 July 2023 (UTC)
 * 7) This sounds pretty useful. I am thinking if there are plans to introduce it to small wikis as well? -Satdeep Gill (talk • contribs 02:24, 25 July 2023 (UTC)
 * 8) Seems like a very positive move, both for areas of high controversy, and the less-well-watched backwaters of Wikipedia. I can think of several external organisations that I have worked or liaised with who would be reassured by the proposed reports.  Andy Mabbett ( Pigsonthewing ); Talk to Andy; Andy's edits 16:19, 25 July 2023 (UTC)
 * 9) I love the concept and where this tool is going. A source check/flag is critical for continued improvement of the reliability of the project.  I must echo Mary above: I think a feature that could assess the recency of the sources (in terms of when the sources were written) would be really helpful. This would also coincide with data checks of when the data was collected. I come across outdated data regularly, particularly related to statistics related to food and nutrition. Happy to help test if I can, or offer assistance in other ways. Looking forward to utilizing this tool! (I'm not sure how likely language localization will be, but if I can offer some help, I'd be happy to do that, too.) JamieF (talk) 18:10, 25 July 2023 (UTC)
 * 10) This looks like something that could be very powerful, useful, and insightful. Kudos! Extra kudos for making the code language-agnostic. -- I'm curious if anything could be done/planned to reduce the duplication between pages like Vaccine safety/Perennial sources and Reliable sources/Perennial sources (which will proliferate when this tool becomes widely used), but that might be unavoidable in order to keep the latter manageable. Perhaps just list these new lists in the latter's #Topic-specific pages section and encourage WikiProjects to share a list when possible? and perhaps this project will eventually lead the way to another evolution of that older index! Quiddity (talk) 20:54, 25 July 2023 (UTC)
 * 11) Obviously useful tool for editors to monitor source quality, especially in light of the overall lack of officially maintained infrastructure and tools to make sourcing and reference management more efficient and usable. —DarTar (talk) 11:40, 30 July 2023 (UTC)
 * 12) I am thrilled to see a tool like this being developed. Harej has been working closely with WikiProject organizers to assess an ever-growing need for organized article tasks and priorities. I think this is one critical step to increasing the coverage and relevance of articles in need of edits and creation, as well as those that will make the biggest impact toward knowledge equity goals. Breadyornot (talk) 16:34, 31 July 2023 (UTC)
 * 13) The first time I saw this tool I was in awe of its potential. Definitely a great tool for Wikiprojects to identify the quality of references used on articles tagged under them, and in future it could help small and non-English language Wikipedias improve the quality of referencing. Flixtey (talk) 06:48, 1 August 2023 (UTC)
 * 14) An important tool to help us address the issue of using reliable sources and combat misinformation, disinformation and fake news that are heavily available online. Any project that focuses on helping our community of volunteers be more efficient and strategic in its work is a blessed one in my book. Looking forward to seeing future developments. I'm particularly curious if this could also be tied somehow to the Programs & Events dashboard, helping to track EDUWiki work around the world. AFAIK, this is the largest outreach effort we have, and is responsible for bringing many new editors into the movement (10% globally, 19% in EnWiki specifically), so worth the effort. Best, Esh77 20:16, 2 August 2023 (UTC)
 * 15) I think this would be really helpful - providing tools for editors to monitor what needs their attention on Wikipedia is increasingly important given the breadth of content and edits being made. Sam Walton (talk) 15:50, 3 August 2023 (UTC)
 * 16) I am head over heels with the idea of a bot to help us easily review the reliability of references. Not just for controversial articles but for the ones that get almost 0 attention. I would love to see it work. Snoteleks (talk) 20:52, 5 August 2023 (UTC)
 * 17) Any kind of aggregated reporting on cross-article source usage would be helpful. Mackensen (talk) 19:04, 5 August 2023 (UTC)
 * 18) looking forward to using this in the future. We really need to gauge the reliability of sources in our articles. -- Lenticel  ( talk ) 01:37, 7 August 2023 (UTC)
 * 19) It would be amazing to see this up and running! In addition to improving source reliability generally, I can see this helping editors find articles where greater breadth of sources may be needed to support knowledge equity goals. We would definitely use this on WikiProject Writing. Drkill (talk) 16:42, 7 August 2023 (UTC)
 * 20) I think this will be a great tool to help editors sort through and improve the sources being used on Wikipedia. It should be useful for any WikiProject. WWB (talk) 13:04, 14 August 2023 (UTC)
 * 21) Endorsing, especially in the scope of my work on WikiProject Climate Change -- makes a lot of sense for fields of knowledge where we expect a lot of vandalism or suspect sources. I wish it were a bit easier to configure (i.e. 1 click to add and remove from suspect lists from the reports themselves). I also worry about generating these kinds of reports off-EnWiki, especially where WikiProjects don't exist widely. Sadads (talk) 21:09, 14 August 2023 (UTC)

Product development
Offer us your most useful, targeted, idealistic, or critical feedback on the features you need or want. What's missing, what would make you adopt this tool, what would make it ten times better, what do you see as a subtle or obvious flaw? Now is the time to influence the direction of what we build and how we build CREDBOT.
 * 1) I'm still just as confused now as I was at the Signpost submission pages. I see talk of frameworks and scalability, but I still have no idea how you use this project to do anything. I've looked at WP:VSAFE, and see (or at least can't recognize) no human-editable/configurable page or subpage. So again, I ask, what do I need to do to have something like this set up for WP:PHYS? I feel a step-by-step guide is needed. Do we need a list of bad/good sources? How do we build it? Are there templates or a specific bot-readable subpage/syntax, or is it just drawn from WP:RSP? How do we build a list of physics articles if we don't use categories or wikiproject tagging? Can we still use them anyway, or are they completely unsupported? Headbomb {t · c · p · b} 19:28, 6 August 2023 (UTC)
 * Headbomb, I think it's important to clarify that this is all still under development. We have a prototype for one project, Vaccine safety, and we are trying to gauge interest to see if projects would like to have similar tools. If we can demonstrate that this support exists, it will make it easier for User:Ocaasi and me to get funding to work on it. This will allow us to support more projects than just the current one. This is why we are requesting feedback. Harej (talk) 16:22, 8 August 2023 (UTC)
 * If you don't want to go directly to the CIA or DARPA, then talk to the US State Department, the EC censorship team, and major political parties of your choosing. Of course you can get funding for centralized control of "acceptable" sourcing for controlling Wikipedia coverage of major controversial topics. You don't need to even ask. It's less clear why anyone outside of those circles would think this was in any way a good idea. — Llywelyn II   06:38, 13 August 2023 (UTC)
 * 2) Further to these useful queries, I've looked at the reports and alerts. Surprisingly, the only flagged alert is the result of something added by the bot's operator himself. The displays look fine but it would be useful to have more evidence of how this would operate in practice, even in a restricted domain. Would the bot be installed by individual users or would it be operated at the level of a wikiproject? Has any consideration been given to a feature which would allow those interested to add sources which have been found to be unreliable, or even to remove "unreliable sources" which have been found to be generally credible? Has any provision been made to report on the progress of work? WikiProjects Women in Red and Women were the first of over a hundred projects you invited to provide support or leave comments. Have you any indication that these projects have a particular problem with the credibility of the sources they use? My greatest concern is that the bot could be used by article reviewers to call for article deletions if the sources used, for example in a woman's biography, are flagged as insufficiently credible.--Ipigott (talk) 06:40, 7 August 2023 (UTC)
 * 3) It's not very clear from the scope and example Vaccine Safety report how this would work. The example Vaccine Safety alerts do not appear useful in that they are not necessarily flagging traits that require action (e.g., articles can use Twitter.com as a primary source, so a link to the site does not necessarily require action). It would take specific scoping and a lot of configuration to make this effective.  czar  10:46, 7 August 2023 (UTC)