Objective: We have developed an automated knowledge base peer feedback system as part of an effort to facilitate the creation and refinement of sound clinical knowledge content within an enterprise-wide knowledge base. The program collects clinical data stored in our Clinical Data Repository during use of a physician order entry (POE) program. It analyzes usage patterns of order sets relative to their templates and generates a report summarizing those patterns for each order set template, including a set of suggested modifications to the template.
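The analysis described above — comparing how often clinicians actually order each item against what the order set template offers, and emitting suggested modifications — can be sketched as follows. All names, thresholds, and the data layout here are illustrative assumptions, not the study's implementation:

```python
# Hedged sketch of usage-pattern-based template feedback. Thresholds
# (add_at, preselect_at, deselect_at) and data structures are assumed
# for illustration only.

def suggest_modifications(template, usage_counts, total_sessions,
                          add_at=0.5, preselect_at=0.8, deselect_at=0.2):
    """template: dict mapping item -> {'preselected': bool} for items
    currently in the order set template.
    usage_counts: dict mapping item -> number of POE sessions in which
    the item was actually ordered.
    Returns a list of (suggestion_type, item, usage_rate) tuples."""
    suggestions = []
    for item, count in usage_counts.items():
        rate = count / total_sessions
        spec = template.get(item)
        if spec is None:
            # Item ordered ad hoc but absent from the template.
            if rate >= add_at:
                suggestions.append(('add', item, rate))
        elif not spec['preselected'] and rate >= preselect_at:
            # Almost always ordered: suggest pre-selecting it.
            suggestions.append(('pre-select', item, rate))
        elif spec['preselected'] and rate <= deselect_at:
            # Pre-selected but rarely kept: suggest de-selecting it.
            suggestions.append(('de-select', item, rate))
    return suggestions


template = {'cbc': {'preselected': False},
            'ekg': {'preselected': True}}
usage = {'cbc': 9, 'ekg': 1, 'troponin': 6}
suggestions = suggest_modifications(template, usage, total_sessions=10)
# cbc ordered in 90% of sessions -> pre-select; ekg kept in only 10%
# -> de-select; troponin ordered in 60% despite not being listed -> add.
```

A real system would also rank combo-box entries by selection frequency (the fourth suggestion type in the Results section), but the thresholded-rate pattern above captures the core idea.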
Design: A quantitative analysis was performed to assess the validity of the program's suggested order set template modifications.
Measurements: We collected and de-identified 2951 instances of POE-based orders. Our program then identified and generated feedback reports for 30 different order set templates from this data set. These reports contained 500 suggested modifications. Five order set authors were then asked to 'accept' or 'reject' each suggestion contained in their respective order set templates. They were also asked to categorize their rationale for doing so ('clinical relevance' or 'user convenience').
Results: In total, 62% (309/500) of the suggestions were accepted by the clinical content authors. By category, authors accepted 32% (36/114) of the suggested additions, 74% (123/167) of the suggested pre-selections, 76% (19/25) of the suggested de-selections, and 68% (131/194) of the suggested changes in combo box ordering.
Conclusion: Overall, the feedback system generated suggestions that were deemed highly acceptable by order set authors. Future refinements to the software should add to its utility.
DOI: http://dx.doi.org/10.1016/j.jbi.2007.05.006