Approximate quantities and exact number words: dissociable systems.

Neuropsychologia

INSERM U562, Neuroimagerie Cognitive, Service Hospitalier Frédéric Joliot, CEA/DRM/DSV, 4 Place du Général Leclerc, 91401 Orsay Cedex, France.

Published: January 2004

Numerical abilities are thought to rest on the integration of two distinct systems, a verbal system of number words and a non-symbolic representation of approximate quantities. This view has led to the classification of acalculias into two broad categories depending on whether the deficit affects the verbal or the quantity system. Here, we test the association of deficits predicted by this theory, and particularly the presence or absence of impairments in non-symbolic quantity processing. We describe two acalculic patients, one with a focal lesion of the left parietal lobe and Gerstmann's syndrome and another with semantic dementia with predominantly left temporal hypometabolism. As predicted by a quantity deficit, the first patient was more impaired in subtraction than in multiplication, showed severe slowness in approximation, and exhibited associated impairments in subitizing and numerical comparison tasks, both with Arabic digits and with arrays of dots. As predicted by a verbal deficit, the second patient was more impaired in multiplication than in subtraction, had intact approximation abilities, and showed preserved processing of non-symbolic numerosities.

Source
http://dx.doi.org/10.1016/s0028-3932(03)00123-4


Similar Publications

Bioequivalence Design With Sampling Distribution Segments.

Stat Med

February 2025

Department of Statistics and Actuarial Science, University of Waterloo, Waterloo, Ontario, Canada.

In bioequivalence design, power analyses dictate how much data must be collected to detect the absence of clinically important effects. Power is computed as a tail probability in the sampling distribution of the pertinent test statistics. When these test statistics cannot be constructed from pivotal quantities, their sampling distributions are approximated via repetitive, time-intensive computer simulation.
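
As a rough illustration of the simulation-based power computation described above (a generic sketch, not the cited paper's method), the following Python snippet estimates power for a two one-sided tests (TOST) bioequivalence comparison by repeated simulation; the sample size, variability, true effect, and equivalence margin are all assumed values.

```python
# Hedged sketch: Monte Carlo power estimate for a two one-sided tests (TOST)
# bioequivalence comparison. All design values (n, sigma, true_diff, margin)
# are illustrative assumptions, not taken from the cited paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n = 24                   # subjects per arm (assumed)
sigma = 0.30             # log-scale standard deviation of the endpoint (assumed)
true_diff = 0.05         # true log-ratio of test vs. reference (assumed)
margin = np.log(1.25)    # conventional bioequivalence margin on the log scale
alpha = 0.05
n_sim = 10_000

rejections = 0
for _ in range(n_sim):
    test = rng.normal(true_diff, sigma, n)
    ref = rng.normal(0.0, sigma, n)
    diff = test.mean() - ref.mean()
    pooled_var = (test.var(ddof=1) + ref.var(ddof=1)) / 2
    se = np.sqrt(2 * pooled_var / n)
    df = 2 * n - 2
    # Reject both one-sided nulls: diff <= -margin and diff >= +margin.
    p_lower = stats.t.sf((diff + margin) / se, df)
    p_upper = stats.t.cdf((diff - margin) / se, df)
    if p_lower < alpha and p_upper < alpha:
        rejections += 1

print(f"Estimated power: {rejections / n_sim:.3f}")
```

The tail probability mentioned in the abstract corresponds to the rejection frequency estimated by this loop; the paper's contribution concerns avoiding exactly this kind of repetitive simulation.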


With the development of intelligent technology, data in practical applications grow exponentially in quantity and scale. Extracting the most discriminative attributes from complex datasets becomes a crucial problem. Existing attribute reduction approaches focus on the correlation between attributes and labels without considering redundancy among the attributes themselves.
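
To make the redundancy point concrete, here is a minimal, hypothetical sketch of a greedy attribute selector that scores relevance to the label minus average redundancy with already-selected attributes (in the spirit of minimum-redundancy maximum-relevance scoring); it is not the method of the cited work, and the toy dataset, discretisation, and selection size are assumptions.

```python
# Hedged sketch: greedy attribute selection trading label relevance against
# redundancy with previously selected attributes (mRMR-style score).
# The dataset, discretisation, and number of kept attributes are assumptions.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import mutual_info_score

X, y = load_iris(return_X_y=True)

# Discretise each attribute into quartile bins so pairwise mutual information
# between attributes is well defined.
X_binned = np.column_stack([
    np.digitize(X[:, j], np.quantile(X[:, j], [0.25, 0.5, 0.75]))
    for j in range(X.shape[1])
])

relevance = mutual_info_classif(X, y, random_state=0)  # I(attribute; label)

selected, remaining = [], list(range(X.shape[1]))
k = 2  # number of attributes to keep (assumed)
while remaining and len(selected) < k:
    def score(j):
        if not selected:
            return relevance[j]
        redundancy = np.mean([mutual_info_score(X_binned[:, j], X_binned[:, s])
                              for s in selected])
        return relevance[j] - redundancy
    best = max(remaining, key=score)
    selected.append(best)
    remaining.remove(best)

print("Selected attribute indices:", selected)
```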


Shannon Entropy Computations in Navier-Stokes Flow Problems Using the Stochastic Finite Volume Method.

Entropy (Basel)

January 2025

Faculty of Civil Engineering, Architecture and Environmental Engineering, Lodz University of Technology, 90-924 Łódź, Poland.

The main aim of this study is to obtain a numerical solution of the Navier-Stokes equations for incompressible, non-turbulent, subsonic fluid flows with Gaussian physical uncertainties. A higher-order stochastic finite volume method (SFVM), implemented according to the iterative generalized stochastic perturbation technique and the Monte Carlo scheme, is employed for this purpose. It is built on polynomial bases for the pressure-velocity-temperature (PVT) solutions, fitted with a weighted least squares method (WLSM) algorithm.
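
At toy scale, the three ingredients named above (Monte Carlo sampling of a Gaussian input, a polynomial response fitted by weighted least squares, and a Shannon entropy computation) can be illustrated as follows; the scalar response function standing in for the PVT solution and all numerical settings are assumptions, not the SFVM of the paper.

```python
# Hedged sketch: propagate a Gaussian input uncertainty through a stand-in scalar
# response, fit a polynomial response surface by weighted least squares, and
# estimate the Shannon (differential) entropy of the output from a histogram.
# The response function and all numerical settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

mu, sigma = 1.0, 0.1          # Gaussian uncertainty of an input parameter (assumed)
samples = rng.normal(mu, sigma, 20_000)

def response(b):
    # Stand-in for one scalar of the pressure-velocity-temperature solution.
    return np.sin(b) + 0.5 * b**2

q = response(samples)

# Weighted least-squares fit of a cubic response surface; samples are weighted
# by the Gaussian density of the input (an assumption, not the paper's WLSM).
weights = np.exp(-0.5 * ((samples - mu) / sigma) ** 2)
coeffs = np.polyfit(samples, q, deg=3, w=np.sqrt(weights))

# Histogram-based estimate of the differential Shannon entropy of the output.
hist, edges = np.histogram(q, bins=100, density=True)
widths = np.diff(edges)
nonzero = hist > 0
entropy = -np.sum(hist[nonzero] * np.log(hist[nonzero]) * widths[nonzero])

print("Polynomial coefficients:", np.round(coeffs, 4))
print(f"Estimated Shannon entropy of the response: {entropy:.3f} nats")
```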


With contemporary anesthetic drugs, the efficacy of general anesthesia is assured. Health-economic and clinical objectives are related to reductions in the variability in dosing, variability in recovery, etc. Consequently, meta-analyses for anesthesiology research would benefit from quantification of ratios of standard deviations of log-normally distributed variables.
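
For context on the log-normal quantity mentioned above: a log-normal variable with log-scale parameters mu and sigma has standard deviation exp(mu + sigma^2/2) * sqrt(exp(sigma^2) - 1), so a ratio of standard deviations between two groups follows directly from their log-scale parameters. The short sketch below checks that closed form against simulation with arbitrary, assumed parameters; it is not the estimator proposed in the cited work.

```python
# Hedged sketch: closed-form vs. simulated standard-deviation ratio for two
# log-normally distributed variables. The parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(2)

def lognormal_sd(mu, sigma):
    # Standard deviation of exp(N(mu, sigma^2)).
    return np.exp(mu + sigma**2 / 2) * np.sqrt(np.exp(sigma**2) - 1)

mu1, sigma1 = 2.0, 0.40   # e.g., a recovery-time variable in group 1 (assumed)
mu2, sigma2 = 2.0, 0.25   # e.g., the same variable in group 2 (assumed)

ratio_exact = lognormal_sd(mu1, sigma1) / lognormal_sd(mu2, sigma2)

x1 = rng.lognormal(mu1, sigma1, 200_000)
x2 = rng.lognormal(mu2, sigma2, 200_000)
ratio_sim = x1.std(ddof=1) / x2.std(ddof=1)

print(f"Closed-form SD ratio: {ratio_exact:.3f}")
print(f"Simulated SD ratio:   {ratio_sim:.3f}")
```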


The finite-element method (FEM) is a well-established procedure for computing approximate solutions to deterministic engineering problems described by partial differential equations. FEM produces discrete approximations of the solution with a discretisation error that can be quantified with a posteriori error estimates. The practical relevance of error estimates for biomechanics problems, especially for soft tissue where the response is governed by large strains, is rarely addressed.
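
As a self-contained illustration of discretisation error and a residual-type a posteriori indicator (on a one-dimensional linear model problem, not the large-strain soft-tissue setting of the cited work), the sketch below solves -u'' = f with linear finite elements and reports an element-wise error indicator; the load, mesh size, and estimator form are assumptions.

```python
# Hedged sketch: linear finite elements for -u'' = f on (0,1), u(0) = u(1) = 0,
# with a simple residual-based a posteriori error indicator per element.
# The model problem, load, and estimator form are illustrative assumptions.
import numpy as np

n_el = 16                                  # number of elements (assumed)
nodes = np.linspace(0.0, 1.0, n_el + 1)
h = np.diff(nodes)

f = lambda x: np.pi**2 * np.sin(np.pi * x)  # manufactured load, exact u = sin(pi x)

# Assemble stiffness matrix and load vector for piecewise-linear elements.
A = np.zeros((n_el + 1, n_el + 1))
b = np.zeros(n_el + 1)
for k in range(n_el):
    i, j = k, k + 1
    A[i, i] += 1 / h[k]; A[j, j] += 1 / h[k]
    A[i, j] -= 1 / h[k]; A[j, i] -= 1 / h[k]
    # Trapezoidal-rule load contributions (chosen for brevity).
    b[i] += f(nodes[i]) * h[k] / 2
    b[j] += f(nodes[j]) * h[k] / 2

# Impose homogeneous Dirichlet boundary conditions and solve.
free = np.arange(1, n_el)
u = np.zeros(n_el + 1)
u[free] = np.linalg.solve(A[np.ix_(free, free)], b[free])

# Element-wise residual indicator eta_K ~ h_K * ||f||_L2(K): for linear elements
# the interior residual of -u_h'' = f reduces to f (edge jump terms omitted here).
mid = 0.5 * (nodes[:-1] + nodes[1:])
eta = h * np.abs(f(mid)) * np.sqrt(h)       # midpoint-rule approximation

print(f"Max nodal error vs. exact solution: {np.abs(u - np.sin(np.pi * nodes)).max():.2e}")
print("Largest element indicators:", np.round(np.sort(eta)[-3:], 4))
```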

