We explored the nature of control during error correction using a modified saccadic double-step task in which subjects cancelled the initial saccade to the first target and redirected gaze to a second target. Failure to inhibit the initial saccade was followed by a quick corrective saccade, suggesting that errors and corrections may be planned concurrently. However, because saccade programming comprises a visual and a motor stage of preparation, the extent to which parallel processing occurs in anticipation of the error is not known. To estimate the time course of error correction, we introduced a triple-step condition that displaced the second target during the error. In these trials, corrective saccades directed at the location the target occupied before the third step indicate that the corrective saccade was prepared, at the motor level, in parallel with the error. To further estimate the time course of this motor preparation, we fit an accumulator model (LATER) to the reaction times of corrective saccades in the triple-step trials; the best fits indicated that motor preparation of the correction could begin even before the onset of the error. The estimated start of the motor correction was also delayed as the target step delay decreased, suggesting a form of interference between concurrent motor programs. Taken together, we interpret these results to indicate that predictive error correction can proceed while the oculomotor system is still trying to inhibit an unwanted movement, and we suggest how inhibitory control and error correction may interact to enable goal-directed behavior.
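The abstract does not spell out the fitting procedure, but the LATER model rests on the recinormal assumption that promptness (the reciprocal of reaction time) is approximately Gaussian, so the rate of rise to threshold can be estimated from latency distributions. A minimal, hypothetical Python sketch of that idea follows; the function name `fit_later`, the synthetic latencies, and all parameter values are illustrative assumptions, not the authors' analysis.

```python
import numpy as np
from scipy import stats

def fit_later(rts_ms):
    """Fit the recinormal form of the LATER model: treat promptness
    (1/RT) as Gaussian and estimate its mean (mu) and SD (sigma)."""
    promptness = 1000.0 / np.asarray(rts_ms, dtype=float)  # 1/RT in s^-1
    mu, sigma = stats.norm.fit(promptness)
    return mu, sigma

# Illustrative use with synthetic corrective-saccade latencies (ms);
# the numbers are made up and not taken from the paper.
rng = np.random.default_rng(1)
promptness = rng.normal(loc=5.0, scale=1.0, size=400)   # s^-1
rts_ms = 1000.0 / promptness[promptness > 0]            # back to ms
mu, sigma = fit_later(rts_ms)
print(f"mu = {mu:.2f} s^-1, sigma = {sigma:.2f} s^-1, "
      f"median predicted RT = {1000.0 / mu:.0f} ms")
```

In the study's setting, the fitted distribution would then be used to back-extrapolate when accumulation toward the corrective saccade began relative to error onset; that inference step is specific to the authors' analysis and is not shown here.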
DOI: http://dx.doi.org/10.1152/jn.90238.2008