Wikipedia, a paradigmatic example of an online knowledge space, is organized in a collaborative, bottom-up way by voluntary contributions, yet it maintains a level of reliability comparable to that of traditional encyclopedias. The absence of professionally selected writers and editors makes judging the quality and trustworthiness of articles a real challenge. Here we show that a self-consistent metric for the network defined by the edit records captures well the character of editors' activity and the articles' level of complexity. Using our metric, one can better identify human-labeled high-quality articles, e.g., "featured" ones, and differentiate them from popular and controversial articles. Furthermore, the dynamics of the editor-article system is also well captured by the metric, revealing the evolutionary pathways of articles and the diverse roles of editors. We demonstrate that the collective effort of the editors indeed drives articles toward improvement.
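To illustrate what a self-consistent, mutually reinforcing score on a bipartite editor-article network can look like, here is a minimal sketch using a HITS-style iteration. The toy adjacency matrix, the iteration scheme, and the normalization are all illustrative assumptions, not the specific metric defined in the paper.

```python
import numpy as np

# Illustrative sketch only: a HITS-style self-consistent score on a
# bipartite editor-article network. This is an assumption for
# demonstration, not the authors' exact metric.

# M[i, j] = 1 if editor i has edited article j (toy adjacency matrix).
M = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
], dtype=float)

n_editors, n_articles = M.shape
editor_score = np.ones(n_editors)    # editor "activity" score
article_score = np.ones(n_articles)  # article "complexity" score

for _ in range(100):  # iterate toward the self-consistent fixed point
    # An article's score aggregates the scores of its editors ...
    article_score = M.T @ editor_score
    # ... and an editor's score aggregates the scores of edited articles.
    editor_score = M @ article_score
    # Normalize each step so the iteration converges to the
    # leading singular vectors of M rather than diverging.
    article_score /= np.linalg.norm(article_score)
    editor_score /= np.linalg.norm(editor_score)

print("editor scores: ", np.round(editor_score, 3))
print("article scores:", np.round(article_score, 3))
```

At the fixed point, each editor's score is proportional to the summed scores of the articles they edit, and vice versa; this mutual dependence is the sense in which such scores are "self-consistent."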
Download full-text PDF | Source
---|---
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8443573 | PMC
http://dx.doi.org/10.1038/s41598-021-97755-w | DOI Listing