Suppression of noise from recorded signals is a critically important data-processing step for biomechanical analyses. While a wide variety of filtering and smoothing-spline methods are available, most are not well suited to signals with rapidly changing derivatives, such as motion data from impact-like events. This is because commonly used low-pass filters and smoothing splines typically assume a single fixed cut-off frequency or regularization penalty, which cannot track rapid changes in the underlying function.
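To make the limitation concrete, here is a minimal sketch of the conventional approach the passage critiques: a zero-lag Butterworth low-pass with a single fixed cutoff applied to a synthetic impact-like signal. The sampling rate, cutoff, and signal are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass(signal, cutoff_hz, fs_hz, order=4):
    """Zero-lag Butterworth low-pass with a single fixed cutoff frequency."""
    b, a = butter(order, cutoff_hz / (fs_hz / 2))  # cutoff normalized to Nyquist
    return filtfilt(b, a, signal)                  # forward-backward pass: no phase lag

# Hypothetical marker coordinate sampled at 200 Hz, with an impact at t = 0.5 s
# (an abrupt change in slope, i.e., a discontinuous first derivative).
fs = 200.0
t = np.arange(0.0, 1.0, 1.0 / fs)
y = np.where(t < 0.5, t, 0.5 - 5.0 * (t - 0.5))
y_noisy = y + np.random.default_rng(0).normal(0.0, 0.01, t.size)

# A single 10 Hz cutoff suppresses the noise but also rounds off the impact,
# biasing velocity and acceleration estimates near the event.
y_smooth = lowpass(y_noisy, cutoff_hz=10.0, fs_hz=fs)
```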
An improved estimator of genomic relatedness using low-depth high-throughput sequencing data for autopolyploids is developed. Its outputs strongly correlate with SNP array-based estimates and are available in the package GUSrelate. High-throughput sequencing (HTS) methods have reduced sequencing costs and resources compared to array-based tools, facilitating the investigation of many non-model polyploid species.
Background: The functional and metabolic properties of skeletal muscles are partly a function of the spatial arrangement of fibers across the muscle belly. Many muscles feature a non-uniform spatial pattern of fiber types, and alterations to this arrangement can reflect age or disease and correlate with changes in muscle mass and strength. Despite the significance of these patterns, descriptions of spatial fiber-type distributions across a muscle section are mainly provided qualitatively, by eye.
We consider estimator and model choice when estimating abundance from capture-recapture data. Our work is motivated by a mark-recapture distance sampling example, where model and estimator choice led to unexpectedly large disparities in the estimates. To understand these differences, we look at three estimation strategies (maximum likelihood estimation, conditional maximum likelihood estimation, and Bayesian estimation) for both binomial and Poisson models.
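To make the estimator comparison concrete, in generic capture-recapture notation (assumed here, not taken from the paper) the binomial model treats the number of distinct animals detected, $n$, as

$$ n \sim \mathrm{Binomial}\big(N,\, p(\theta)\big), $$

with abundance $N$ a parameter of the full likelihood, whereas conditional maximum likelihood first estimates the detection probability from the capture histories and then recovers abundance with a Horvitz-Thompson-style estimator, $\hat{N} = n/\hat{p}$ in the constant-$p$ case.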
Bayesian methods have recently been proposed to solve inverse kinematics problems for marker-based motion capture. The objective is to find the posterior distribution, a probabilistic summary of our knowledge and corresponding uncertainty about model parameters such as joint angles, segment angles, segment translations, and marker positions. To date, Bayesian inverse kinematics models have focused on a frame-by-frame solution, which, if repeatedly applied, gives estimates that are discontinuous in time.
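In generic notation (assumed here), the frame-by-frame approach computes, independently at each frame $t$, the posterior over pose parameters $\theta_t$ given that frame's marker observations $y_t$,

$$ p(\theta_t \mid y_t) \propto p(y_t \mid \theta_t)\, p(\theta_t), $$

so no information is shared between neighbouring frames, which is why repeated frame-wise estimates can be discontinuous in time.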
Bayesian inference has recently been identified as an approach for estimating a subject's pose from noisy marker position data. Previous research suggests that Bayesian inference markedly reduces error for inverse kinematic problems relative to traditional least-squares approaches, with the Bayesian estimators having reduced variance even though both estimators are unbiased. This result is surprising, as Bayesian estimators are typically similar to least-squares approaches unless highly informative prior distributions are used.
It is difficult to estimate sensitivity and specificity of diagnostic tests when there is no gold standard. Latent class models have been proposed as a potential solution as they provide estimates without the need for a gold standard. Using a motivating example of the evaluation of point of care tests for leptospirosis in Tanzania, we show how a realistic violation of assumptions underpinning the latent class model can lead directly to substantial bias in the estimates of the parameters of interest.
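As a sketch in generic notation (assumed here, not the paper's), the standard two-class latent class model for two binary tests under the usual conditional-independence assumption, with prevalence $\pi$, sensitivities $Se_j$, and specificities $Sp_j$, is

$$ P(T_1 = t_1, T_2 = t_2) = \pi \prod_{j=1}^{2} Se_j^{\,t_j}(1 - Se_j)^{1 - t_j} + (1 - \pi) \prod_{j=1}^{2} (1 - Sp_j)^{\,t_j} Sp_j^{\,1 - t_j}, $$

and it is assumptions of exactly this kind, such as conditional independence of the tests given true disease status, whose violation can bias the resulting estimates.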
Infants' avoidance of drop-offs has been described as a form of affordance learning that is not transferable between different locomotor postures. In addition, there is evidence that infants perceive and act similarly around real and water cliffs. This cross-sectional study investigated the effects of specific locomotor experiences on infants' avoidance behaviour using the Real Cliff/Water Cliff paradigm.
A spatial open-population capture-recapture model is described that extends both the non-spatial open-population model of Schwarz and Arnason and the spatially explicit closed-population model of Borchers and Efford. The superpopulation of animals available for detection at some time during a study is conceived as a two-dimensional Poisson point process. Individual probabilities of birth and death follow the conventional open-population model.
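In generic notation (assumed here), a two-dimensional Poisson point process for the superpopulation means that the number of activity centres falling in any region $A$ is

$$ N(A) \sim \mathrm{Poisson}\!\left(\int_A D(\mathbf{s})\,d\mathbf{s}\right), $$

where $D(\mathbf{s})$ is the density (intensity) surface of the point process.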
Rationale: Stable isotope ratios can provide a 'fingerprint' to enable differentiation of sources of monofluoroacetate (MFA), hence providing a means to eliminate potential sources of MFA in a blackmail case involving the contamination of milk.
Methods: The stable isotopic compositions (δ²H, δ¹³C and δ¹⁸O values) of a library of 43 samples of MFA were determined and multivariate models constructed to differentiate samples of different composition. The data from the MFA library were compared with those obtained from MFA extracted from contaminated milk powder (the case samples).
N-mixture models provide an appealing alternative to mark-recapture models, in that they allow estimation of detection probability and population size from count data without requiring that individual animals be identified. There is, however, a cost to using N-mixture models: inference is very sensitive to the model's assumptions. We consider the effects of three violations of assumptions that might reasonably be expected in practice: double counting, unmodeled variation in population size over time, and unmodeled variation in detection probability over time.
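For reference, the basic N-mixture model in generic notation, for a count $y_{it}$ at site $i$ on visit $t$, is

$$ N_i \sim \mathrm{Poisson}(\lambda), \qquad y_{it} \mid N_i \sim \mathrm{Binomial}(N_i,\, p), $$

so double counting breaks the binomial observation model, while unmodeled variation in population size or detection probability over time breaks the closure and constant-$p$ assumptions, respectively.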
Next-generation sequencing is an efficient method that allows for substantially more markers than previous technologies, providing opportunities for building high-density genetic linkage maps, which facilitate the development of nonmodel species' genomic assemblies and the investigation of their genes. However, constructing genetic maps using data generated via high-throughput sequencing technology (e.g., genotyping-by-sequencing) is complicated by the presence of sequencing errors and genotyping errors resulting from missing parental alleles due to low sequencing depth. If unaccounted for, these errors lead to inflated genetic maps.
The standard approach to fitting capture-recapture data collected in continuous time involves arbitrarily forcing the data into a series of distinct discrete capture sessions. We show how continuous-time models can be fitted as easily as discrete-time alternatives. The likelihood is factored so that efficient Markov chain Monte Carlo algorithms can be implemented for Bayesian estimation, available online in the R package ctime.
N-mixture models describe count data replicated in time and across sites in terms of abundance N and detectability p. They are popular because they allow inference about N while controlling for factors that influence p without the need for marking animals. Using a capture-recapture perspective, we show that the loss of information that results from not marking animals is critical, making reliable statistical modeling of N and p problematic using just count data.
We investigated whether distance estimation accuracy over open water is influenced by the viewing direction of the observer. Twenty-two healthy students (9 male, 13 female) each made 10 distance estimates, with actual distances ranging between 50 and 950 m, under two viewing conditions: (1) from shore to boat and (2) from boat to shore. There were no consistent differences in estimation accuracy between viewing directions.
Link et al. (2010, Biometrics 66, 178-185) define a general framework for analyzing capture-recapture data with potential misidentifications. In this framework, the observed vector of counts, y, is considered as a linear function of a vector of latent counts, x, such that y=Ax, with x assumed to follow a multinomial distribution conditional on the model parameters, θ.
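Written out, the latent-count structure is

$$ \mathbf{y} = \mathbf{A}\mathbf{x}, \qquad \mathbf{x} \mid \theta \sim \mathrm{Multinomial}\big(n,\, \boldsymbol{\pi}(\theta)\big), $$

where (in notation assumed here rather than given in the abstract) $n$ is the total number of latent capture histories, $\boldsymbol{\pi}(\theta)$ their cell probabilities, and $\mathbf{A}$ the known matrix mapping each latent history to the record that would actually be observed.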
Motivated by field sampling of DNA fragments, we describe a general model for capture-recapture modeling of samples drawn one at a time in continuous time. Our model is based on Poisson sampling, where the sampling time may be unobserved. We show that previously described models correspond to partial likelihoods from our Poisson model, and their use may be justified through arguments concerning S- and Bayes-ancillarity of the discarded information.
A major goal of gut-content analysis is to quantify predation rates by predators in the field, which could provide insights into the mechanisms behind ecosystem structure and function, as well as quantification of the ecosystem services provided. However, percentage-positive results from molecular assays are strongly influenced by factors other than predation rate, and thus can only be reliably used to quantify predation rates under very restrictive conditions. Here, we develop two statistical approaches, one using a parametric bootstrap and the other Bayesian inference, that build on previous techniques using DNA decay rates to rank predators by their rate of prey consumption, allowing a statistical assessment of confidence in the inferred ranking.
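As a minimal sketch of the parametric-bootstrap idea in a deliberately simplified two-predator setting: every input below (assay counts, the decay-corrected detectabilities, the rate index itself) is a hypothetical illustration, not the authors' data or implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical inputs: positives and totals from molecular gut-content assays
# for two predators, plus decay-corrected probabilities that a consumed prey
# item still tests positive (e.g., estimated from feeding trials).
pos = np.array([42, 35])          # positive assays: predator A, predator B
n = np.array([100, 100])          # total assays
detect = np.array([0.60, 0.40])   # decay-corrected detectability

p_hat = pos / n                   # observed percentage-positive
rate_hat = p_hat / detect         # decay-adjusted consumption-rate index
a_ranked_first = rate_hat[0] > rate_hat[1]

# Parametric bootstrap: redraw assay outcomes from the fitted binomial model
# and record how often the observed ranking is reproduced.
B = 10_000
p_boot = rng.binomial(n, p_hat, size=(B, 2)) / n
rate_boot = p_boot / detect
support = np.mean((rate_boot[:, 0] > rate_boot[:, 1]) == a_ranked_first)
print(f"Bootstrap support for the observed ranking: {support:.3f}")
```

Support near 1 indicates a ranking robust to assay sampling noise; support near 0.5 means the counts cannot distinguish the two consumption rates.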
We use Bayesian methods to explore fitting the von Bertalanffy length model to tag-recapture data. We consider two popular parameterizations of the von Bertalanffy model. The first models the data relative to age at first capture; the second models the data in terms of length at first capture.
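For reference, in generic notation the von Bertalanffy curve, and the Fabens-style increment form often used with tag-recapture data (length $L_1$ at first capture, length $L_2$ after time at liberty $\Delta t$), are

$$ L(t) = L_\infty\big(1 - e^{-k(t - t_0)}\big), \qquad L_2 = L_1 + (L_\infty - L_1)\big(1 - e^{-k\,\Delta t}\big), $$

which correspond loosely to the age-based and length-based parameterizations mentioned above.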
In a recent commentary on statistical inference, Batterham and Hopkins advocated an approach to statistical inference centered on expressions of uncertainty in parameters. After criticizing an approach to statistical inference driven by null hypothesis testing, they proposed a method of "magnitude-based" inference and claimed that this approach is essentially Bayesian but with no prior assumption about the true value of the parameter. In this commentary, after addressing the issues raised by Batterham and Hopkins, we show that their method is "approximately" Bayesian and that, rather than assuming no prior information, their approach implies a very specific, but hidden, joint prior on the parameters.
Sampling DNA noninvasively has advantages for identifying animals for uses such as mark-recapture modeling, which require unique identification of the animals in samples. Although it is possible to generate large amounts of data from noninvasive sources of DNA, a challenge is overcoming genotyping errors that can lead to incorrect identification of individuals. A major source of error is allelic dropout, the failure of DNA amplification at one or more loci.
Many organisms are patchily distributed, with some patches occupied at high density, others at lower densities, and others not occupied. Estimating overall abundance can be difficult and is inefficient via intensive approaches such as capture-mark-recapture (CMR) or distance sampling. We propose a two-phase sampling scheme and model, in a Bayesian framework, to estimate abundance for patchily distributed populations.