This study explored some practical issues for single-case researchers who rely on visual analysis of graphed data, but who also may consider supplemental use of promising statistical analysis techniques. The study sought to answer three major questions: (a) What is a typical range of effect sizes from these analytic techniques for data from "effective interventions"? (b) How closely do results from these same analytic techniques concur with visual-analysis-based judgments of effective interventions? and (c) What role does autocorrelation play in interpretation of these analytic results? To answer these questions, five analytic techniques were compared with the judgments of 45 doctoral students and faculty, who rated intervention effectiveness from visual analysis of 35 fabricated AB design graphs. Implications for researchers and practitioners using single-case designs are discussed.
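As a hypothetical illustration of the statistic behind question (c), the sketch below computes the lag-1 autocorrelation of a fabricated AB-design data series (the data values and function name are assumptions for illustration, not taken from the study):

```python
def lag1_autocorrelation(series):
    """Return the lag-1 autocorrelation of a list of observations.

    Uses the standard sample formula: the sum of products of
    successive deviations from the mean, divided by the total
    sum of squared deviations.
    """
    n = len(series)
    mean = sum(series) / n
    num = sum((series[t] - mean) * (series[t + 1] - mean) for t in range(n - 1))
    den = sum((x - mean) ** 2 for x in series)
    return num / den

# Fabricated AB data: a low, stable baseline (A) phase followed by
# an elevated intervention (B) phase, mimicking the graphed series
# raters judged visually.
data = [3, 4, 3, 5, 4, 8, 9, 10, 9, 11]
print(round(lag1_autocorrelation(data), 3))  # → 0.639
```

A markedly positive value like this one, common when phase-level shifts are present, is exactly the condition under which serial dependence can distort the statistical techniques the study compared against visual analysis.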
DOI: http://dx.doi.org/10.1177/0145445503261167