The latest national science framework has formally stated the need to develop assessments that test both students' content knowledge and scientific practices. In response to this call, a science assessment has been developed that consists of (a) content items that measure students' understanding of a grade-eight physics topic and (b) argumentation items that measure students' argumentation competency. This paper investigated how these content and argumentation items function within a multidimensional measurement framework from two perspectives. First, we performed a dimensionality analysis to investigate whether the relationship between the content and argumentation items conformed to the test design. Second, we conducted a differential item functioning analysis within the multidimensional framework to examine whether any content or argumentation item unfairly favored students with an advanced level of English literacy. The methods and findings of this study could inform future research on the validation of assessments measuring higher-order and complex abilities.
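For readers unfamiliar with differential item functioning (DIF) analysis, the sketch below illustrates the underlying idea using a common, simpler alternative to the multidimensional approach described in the abstract: a logistic-regression DIF screen for a single dichotomous item, with English-literacy level as the grouping variable. The data, variable names, and choice of method are illustrative assumptions only, not taken from the paper.

```python
# Hypothetical sketch of a logistic-regression DIF screen for one item.
# This is a simpler stand-in for the multidimensional-IRT DIF analysis the
# abstract describes; the data below are simulated purely for illustration.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(0)
n = 500
ability = rng.normal(size=n)          # proxy for overall proficiency (e.g., rest score)
group = rng.integers(0, 2, size=n)    # 0 = typical, 1 = advanced English literacy
# Simulate an item that favors the advanced-literacy group (uniform DIF built in).
p_correct = 1.0 / (1.0 + np.exp(-(0.8 * ability + 0.6 * group - 0.2)))
y = rng.binomial(1, p_correct)

# Baseline model: item success predicted by ability alone.
m0 = sm.Logit(y, sm.add_constant(ability)).fit(disp=0)

# DIF model: adds group membership and its interaction with ability
# (group term -> uniform DIF, interaction term -> non-uniform DIF).
X1 = sm.add_constant(np.column_stack([ability, group, ability * group]))
m1 = sm.Logit(y, X1).fit(disp=0)

# Likelihood-ratio test with 2 df; a small p-value flags the item for review.
lr = 2.0 * (m1.llf - m0.llf)
print(f"LR = {lr:.2f}, p = {chi2.sf(lr, df=2):.4f}")
```

In practice, an item flagged by such a screen would be reviewed qualitatively before any decision about revision or removal; the flag itself does not establish unfairness.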