Sample records for joint likelihood analysis

  1. Mixture Rasch Models with Joint Maximum Likelihood Estimation

    ERIC Educational Resources Information Center

    Willse, John T.

    2011-01-01

    This research provides a demonstration of the utility of mixture Rasch models. Specifically, a model capable of estimating a mixture partial credit model using joint maximum likelihood is presented. Like the partial credit model, the mixture partial credit model has the beneficial feature of being appropriate for analysis of assessment data…

  2. Maximum-likelihood techniques for joint segmentation-classification of multispectral chromosome images.

    PubMed

    Schwartzkopf, Wade C; Bovik, Alan C; Evans, Brian L

    2005-12-01

    Traditional chromosome imaging has been limited to grayscale images, but recently a 5-fluorophore combinatorial labeling technique (M-FISH) was developed wherein each class of chromosomes binds with a different combination of fluorophores. This results in a multispectral image, where each class of chromosomes has distinct spectral components. In this paper, we develop new methods for automatic chromosome identification by exploiting the multispectral information in M-FISH chromosome images and by jointly performing chromosome segmentation and classification. We (1) develop a maximum-likelihood hypothesis test that uses multispectral information, together with conventional criteria, to select the best segmentation possibility; (2) use this likelihood function to combine chromosome segmentation and classification into a robust chromosome identification system; and (3) show that the proposed likelihood function can also be used as a reliable indicator of errors in segmentation, errors in classification, and chromosome anomalies, which can be indicators of radiation damage, cancer, and a wide variety of inherited diseases. We show that the proposed multispectral joint segmentation-classification method outperforms past grayscale segmentation methods when decomposing touching chromosomes. We also show that it outperforms past M-FISH classification techniques that do not use segmentation information.

  3. Joint maximum-likelihood magnitudes of presumed underground nuclear test explosions

    NASA Astrophysics Data System (ADS)

    Peacock, Sheila; Douglas, Alan; Bowers, David

    2017-08-01

    Body-wave magnitudes (mb) of 606 seismic disturbances caused by presumed underground nuclear test explosions at specific test sites between 1964 and 1996 have been derived from station amplitudes collected by the International Seismological Centre (ISC), by a joint inversion for mb and station-specific magnitude corrections. A maximum-likelihood method was used to reduce the upward bias of network mean magnitudes caused by data censoring, in which signals at non-reporting stations are assumed to be hidden by the ambient noise at the time. Threshold noise levels at each station were derived from the ISC amplitudes using the method of Kelly and Lacoss, which fits to the observed magnitude-frequency distribution a Gutenberg-Richter exponential decay truncated at low magnitudes by an error function representing the low-magnitude threshold of the station. The joint maximum-likelihood inversion is applied to arrivals from the sites: Semipalatinsk (Kazakhstan) and Novaya Zemlya, former Soviet Union; Singer (Lop Nor), China; Mururoa and Fangataufa, French Polynesia; and Nevada, USA. At sites where eight or more arrivals could be used to derive magnitudes and station terms for 25 or more explosions (Nevada, Semipalatinsk and Mururoa), the resulting magnitudes and station terms were fixed and a second inversion was carried out to derive magnitudes for additional explosions with three or more arrivals. 93 more magnitudes were thus derived. During processing for station thresholds, many stations were rejected for sparsity of data, obvious errors in reported amplitude, or great departure of the reported amplitude-frequency distribution from the expected left-truncated exponential decay. Abrupt changes in monthly mean amplitude at a station apparently coincide with changes in recording equipment and/or analysis method at the station.
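
    As a toy illustration of this kind of censored joint inversion, the Python sketch below (all names, sizes, and numbers are invented, not from the paper) maximizes a likelihood in which reported magnitudes contribute Gaussian densities and non-reporting stations contribute the probability of falling below their noise threshold:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    # Toy data: 2 events, 3 stations. A station either reports a magnitude
    # or is censored (signal presumed hidden below its noise threshold t).
    t = np.array([4.0, 4.2, 3.8])                    # station noise thresholds
    obs = [(0, 0, 5.1), (0, 1, 4.9), (1, 0, 4.3)]    # (event, station, magnitude)
    cen = [(0, 2), (1, 1), (1, 2)]                   # censored (event, station) pairs
    sigma = 0.3                                      # assumed reading scatter

    def neg_log_lik(p):
        mb, s = p[:2], p[2:]                         # event magnitudes, station terms
        ll = sum(norm.logpdf(m, mb[e] + s[j], sigma) for e, j, m in obs)
        ll += sum(norm.logcdf(t[j], mb[e] + s[j], sigma) for e, j in cen)
        return -ll + 1e3 * np.sum(s) ** 2            # gauge fix: station terms sum to 0

    mb_hat = minimize(neg_log_lik, x0=np.zeros(5)).x[:2]   # joint ML magnitudes
    ```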

  4. Hybrid pairwise likelihood analysis of animal behavior experiments.

    PubMed

    Cattelan, Manuela; Varin, Cristiano

    2013-12-01

    The study of the determinants of fights between animals is an important issue in understanding animal behavior. For this purpose, tournament experiments among a set of animals are often used by zoologists. The results of these tournament experiments are naturally analyzed by paired comparison models. Proper statistical analysis of these models is complicated by the presence of dependence between the outcomes of fights because the same animal is involved in different contests. This paper discusses two different model specifications to account for between-fights dependence. Models are fitted through the hybrid pairwise likelihood method that iterates between optimal estimating equations for the regression parameters and pairwise likelihood inference for the association parameters. This approach requires the specification of means and covariances only. For this reason, the method can be applied also when the computation of the joint distribution is difficult or inconvenient. The proposed methodology is investigated by simulation studies and applied to real data about adult male Cape Dwarf Chameleons. © 2013, The International Biometric Society.
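
    For reference, the pairwise likelihood that underlies methods of this kind replaces the intractable joint density with a product of bivariate densities; this is the textbook definition, not a formula quoted from the paper:

    ```latex
    % Pairwise log-likelihood: only bivariate densities f(y_i, y_j; \theta)
    % are needed, never the full joint distribution of all fight outcomes.
    \ell_{\mathrm{pair}}(\theta) = \sum_{i<j} \log f(y_i, y_j;\, \theta)
    ```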

  5. dPIRPLE: a joint estimation framework for deformable registration and penalized-likelihood CT image reconstruction using prior images

    NASA Astrophysics Data System (ADS)

    Dang, H.; Wang, A. S.; Sussman, Marc S.; Siewerdsen, J. H.; Stayman, J. W.

    2014-09-01

    Sequential imaging studies are conducted in many clinical scenarios. Prior images from previous studies contain a great deal of patient-specific anatomical information and can be used in conjunction with subsequent imaging acquisitions to maintain image quality while enabling radiation dose reduction (e.g., through sparse angular sampling, reduction in fluence, etc.). However, patient motion between images in such sequences results in misregistration between the prior image and current anatomy. Existing prior-image-based approaches often include only a simple rigid registration step that can be insufficient for capturing complex anatomical motion, introducing detrimental effects in subsequent image reconstruction. In this work, we propose a joint framework that estimates the 3D deformation between an unregistered prior image and the current anatomy (based on a subsequent data acquisition) and reconstructs the current anatomical image using a model-based reconstruction approach that includes regularization based on the deformed prior image. This framework is referred to as deformable prior image registration, penalized-likelihood estimation (dPIRPLE). Central to this framework is the inclusion of a 3D B-spline-based free-form-deformation model into the joint registration-reconstruction objective function. The proposed framework is solved using a maximization strategy whereby alternating updates to the registration parameters and image estimates are applied, allowing for improvements in both the registration and reconstruction throughout the optimization process. Cadaver experiments were conducted on a cone-beam CT testbench emulating a lung nodule surveillance scenario. Superior reconstruction accuracy and image quality were demonstrated using the dPIRPLE algorithm as compared to more traditional reconstruction methods including filtered backprojection, penalized-likelihood estimation (PLE), prior image penalized-likelihood estimation (PIPLE) without registration, and…
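
    The alternating structure described above is easy to state generically; the Python sketch below shows the control flow only, with placeholder update callables rather than the actual dPIRPLE operators:

    ```python
    def alternate_maximize(x, r, update_image, update_registration, n_outer=50):
        """Alternate updates of image estimate x and registration parameters r.

        Both update callables are assumed to (weakly) increase one shared
        objective Phi(x, r), so the joint objective is monotonically
        non-decreasing. They are placeholders, not dPIRPLE's actual steps."""
        for _ in range(n_outer):
            r = update_registration(x, r)   # refine deformation, image held fixed
            x = update_image(x, r)          # refine reconstruction, deformation fixed
        return x, r
    ```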

  6. A Likelihood-Based Framework for Association Analysis of Allele-Specific Copy Numbers.

    PubMed

    Hu, Y J; Lin, D Y; Sun, W; Zeng, D

    2014-10-01

    Copy number variants (CNVs) and single nucleotide polymorphisms (SNPs) co-exist throughout the human genome and jointly contribute to phenotypic variations. Thus, it is desirable to consider both types of variants, as characterized by allele-specific copy numbers (ASCNs), in association studies of complex human diseases. Current SNP genotyping technologies capture the CNV and SNP information simultaneously via fluorescent intensity measurements. The common practice of calling ASCNs from the intensity measurements and then using the ASCN calls in downstream association analysis has important limitations. First, the association tests are prone to false-positive findings when differential measurement errors between cases and controls arise from differences in DNA quality or handling. Second, the uncertainties in the ASCN calls are ignored. We present a general framework for the integrated analysis of CNVs and SNPs, including the analysis of total copy numbers as a special case. Our approach combines the ASCN calling and the association analysis into a single step while allowing for differential measurement errors. We construct likelihood functions that properly account for case-control sampling and measurement errors. We establish the asymptotic properties of the maximum likelihood estimators and develop EM algorithms to implement the corresponding inference procedures. The advantages of the proposed methods over the existing ones are demonstrated through realistic simulation studies and an application to a genome-wide association study of schizophrenia. Extensions to next-generation sequencing data are discussed.

  7. Massive optimal data compression and density estimation for scalable, likelihood-free inference in cosmology

    NASA Astrophysics Data System (ADS)

    Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen

    2018-07-01

    Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper, we use massive asymptotically optimal data compression to reduce the dimensionality of the data space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Secondly, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parametrized model for the joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate DELFI with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ~10⁴ simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological data sets.
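
    The "one number per parameter" compression has a simple closed form in the Gaussian case with parameter-independent covariance; the sketch below implements that standard score compression (variable names are illustrative, not from the paper's code):

    ```python
    import numpy as np

    def score_compress(d, mu, dmu, Cinv):
        """Compress a data vector to one summary per parameter.

        d:    observed data, shape (N,)
        mu:   model mean at fiducial parameters, shape (N,)
        dmu:  derivatives of mu w.r.t. each parameter, shape (p, N)
        Cinv: inverse data covariance, shape (N, N)

        Returns t with shape (p,): the score of a Gaussian likelihood with
        parameter-independent covariance, which preserves the Fisher
        information of the full data set to leading order.
        """
        return dmu @ Cinv @ (d - mu)
    ```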

  8. Bias correction in the hierarchical likelihood approach to the analysis of multivariate survival data.

    PubMed

    Jeon, Jihyoun; Hsu, Li; Gorfine, Malka

    2012-07-01

    Frailty models are useful for measuring unobserved heterogeneity in risk of failures across clusters, providing cluster-specific risk prediction. In a frailty model, the latent frailties shared by members within a cluster are assumed to act multiplicatively on the hazard function. In order to obtain parameter and frailty variate estimates, we consider the hierarchical likelihood (H-likelihood) approach (Ha, Lee and Song, 2001. Hierarchical-likelihood approach for frailty models. Biometrika 88, 233-243) in which the latent frailties are treated as "parameters" and estimated jointly with other parameters of interest. We find that the H-likelihood estimators perform well when the censoring rate is low; however, they are substantially biased when the censoring rate is moderate to high. In this paper, we propose a simple and easy-to-implement bias correction method for the H-likelihood estimators under a shared frailty model. We also extend the method to a multivariate frailty model, which incorporates complex dependence structure within clusters. We conduct an extensive simulation study and show that the proposed approach performs very well for censoring rates as high as 80%. We also illustrate the method with a breast cancer data set. Since the H-likelihood is the same as the penalized likelihood function, the proposed bias correction method is also applicable to the penalized likelihood estimators.

  9. Joint reconstruction of activity and attenuation in Time-of-Flight PET: A Quantitative Analysis.

    PubMed

    Rezaei, Ahmadreza; Deroose, Christophe M; Vahle, Thomas; Boada, Fernando; Nuyts, Johan

    2018-03-01

    Joint activity and attenuation reconstruction methods from time of flight (TOF) positron emission tomography (PET) data provide an effective solution to attenuation correction when no (or incomplete/inaccurate) information on the attenuation is available. One of the main barriers limiting their use in clinical practice is the lack of validation of these methods on a relatively large patient database. In this contribution, we aim at validating the activity reconstructions of the maximum likelihood activity reconstruction and attenuation registration (MLRR) algorithm on a whole-body patient data set. Furthermore, a partial validation (since the scale problem of the algorithm is avoided for now) of the maximum likelihood activity and attenuation reconstruction (MLAA) algorithm is also provided. We present a quantitative comparison of the joint reconstructions to the current clinical gold-standard maximum likelihood expectation maximization (MLEM) reconstruction with CT-based attenuation correction. Methods: The whole-body TOF-PET emission data of each patient data set is processed as a whole to reconstruct an activity volume covering all the acquired bed positions, which helps to reduce the problem of a scale per bed position in MLAA to a global scale for the entire activity volume. Three reconstruction algorithms are used: MLEM, MLRR and MLAA. A maximum likelihood (ML) scaling of the single scatter simulation (SSS) estimate to the emission data is used for scatter correction. The reconstruction results are then analyzed in different regions of interest. Results: The joint reconstructions of the whole-body patient data set provide better quantification in case of PET and CT misalignments caused by patient and organ motion. Our quantitative analysis shows a difference of -4.2% (±2.3%) and -7.5% (±4.6%) between the joint reconstructions of MLRR and MLAA compared to MLEM, averaged over all regions of interest, respectively. Conclusion: Joint activity and attenuation…

  10. COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION FOR BANDED PROBABILITY DISTRIBUTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gjerløw, E.; Mikkelsen, K.; Eriksen, H. K.

    We investigate sets of random variables that can be arranged sequentially such that a given variable only depends conditionally on its immediate predecessor. For such sets, we show that the full joint probability distribution may be expressed exclusively in terms of uni- and bivariate marginals. Under the assumption that the cosmic microwave background (CMB) power spectrum likelihood only exhibits correlations within a banded multipole range, Δl_C, we apply this expression to two outstanding problems in CMB likelihood analysis. First, we derive a statistically well-defined hybrid likelihood estimator, merging two independent (e.g., low- and high-l) likelihoods into a single expression that properly accounts for correlations between the two. Applying this expression to the Wilkinson Microwave Anisotropy Probe (WMAP) likelihood, we verify that the effect of correlations in the transition region on cosmological parameters is negligible for WMAP; the largest relative shift seen for any parameter is 0.06σ. However, because this may not hold for other experimental setups (e.g., for different instrumental noise properties or analysis masks), but must rather be verified on a case-by-case basis, we recommend our new hybridization scheme for future experiments for statistical self-consistency reasons. Second, we use the same expression to improve the convergence rate of the Blackwell-Rao likelihood estimator, reducing the required number of Monte Carlo samples by several orders of magnitude, and thereby extend it to high-l applications.
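
    The factorization underpinning this construction is the standard identity for a first-order Markov sequence, in which the joint density is built from uni- and bivariate marginals alone:

    ```latex
    % Joint density of x_1, ..., x_n when each variable depends
    % conditionally only on its immediate predecessor:
    p(x_1, \ldots, x_n)
      = \frac{\prod_{i=1}^{n-1} p(x_i, x_{i+1})}{\prod_{i=2}^{n-1} p(x_i)}
    ```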

  11. Maximum Likelihood Analysis in the PEN Experiment

    NASA Astrophysics Data System (ADS)

    Lehman, Martin

    2013-10-01

    The experimental determination of the π⁺ → e⁺ν(γ) decay branching ratio currently provides the most accurate test of lepton universality. The PEN experiment at PSI, Switzerland, aims to improve the present world average experimental precision of 3.3×10⁻³ to 5×10⁻⁴ using a stopped beam approach. During runs in 2008-10, PEN has acquired over 2×10⁷ πe2 events. The experiment includes active beam detectors (degrader, mini TPC, target), central MWPC tracking with plastic scintillator hodoscopes, and a spherical pure CsI electromagnetic shower calorimeter. The final branching ratio will be calculated using a maximum likelihood analysis. This analysis assigns each event a probability for 5 processes (π⁺ → e⁺ν, π⁺ → μ⁺ν, decay-in-flight, pile-up, and hadronic events) using Monte Carlo-verified probability distribution functions of our observables (energies, times, etc.). A progress report on the PEN maximum likelihood analysis will be presented. Work supported by NSF grant PHY-0970013.
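
    The per-event probabilities described above enter a standard mixture likelihood; the Python sketch below uses random placeholder templates in place of the PEN Monte Carlo PDFs and fits the five process fractions by maximum likelihood:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    # pdfs[k, i]: probability of event i under process k; placeholders here,
    # Monte Carlo-derived templates of the observables in the real analysis.
    pdfs = rng.uniform(0.1, 1.0, size=(5, 1000))

    def nll(f):
        f = np.abs(f) / np.abs(f).sum()          # process fractions, summing to 1
        return -np.log(f @ pdfs).sum()           # mixture log-likelihood over events

    res = minimize(nll, x0=np.full(5, 0.2), method="Nelder-Mead")
    fractions = np.abs(res.x) / np.abs(res.x).sum()   # ML process fractions
    ```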

  12. Robust Multipoint Water-Fat Separation Using Fat Likelihood Analysis

    PubMed Central

    Yu, Huanzhou; Reeder, Scott B.; Shimakawa, Ann; McKenzie, Charles A.; Brittain, Jean H.

    2016-01-01

    Fat suppression is an essential part of routine MRI scanning. Multiecho chemical-shift based water-fat separation methods estimate and correct for B0 field inhomogeneity. However, they must contend with the intrinsic challenge of water-fat ambiguity that can result in water-fat swapping. This problem arises because the signals from two chemical species, when both are modeled as a single discrete spectral peak, may appear indistinguishable in the presence of B0 off-resonance. In conventional methods, the water-fat ambiguity is typically removed by enforcing field map smoothness using region growing based algorithms. In reality, the fat spectrum has multiple spectral peaks. Using this spectral complexity, we introduce a novel concept that identifies water and fat for multiecho acquisitions by exploiting the spectral differences between water and fat. A fat likelihood map is produced to indicate if a pixel is likely to be water-dominant or fat-dominant by comparing the fitting residuals of two different signal models. The fat likelihood analysis and field map smoothness provide complementary information, and we designed an algorithm (Fat Likelihood Analysis for Multiecho Signals) to exploit both mechanisms. It is demonstrated in a wide variety of data that the Fat Likelihood Analysis for Multiecho Signals algorithm offers highly robust water-fat separation for 6-echo acquisitions, particularly in some previously challenging applications. PMID:21842498

  13. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures

    PubMed Central

    Theobald, Douglas L.; Wuttke, Deborah S.

    2008-01-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. PMID:16777907

  14. Joint penalized-likelihood reconstruction of time-activity curves and regions-of-interest from projection data in brain PET

    NASA Astrophysics Data System (ADS)

    Krestyannikov, E.; Tohka, J.; Ruotsalainen, U.

    2008-06-01

    This paper presents a novel statistical approach for joint estimation of regions-of-interest (ROIs) and the corresponding time-activity curves (TACs) from dynamic positron emission tomography (PET) brain projection data. It is based on optimizing the joint objective function that consists of a data log-likelihood term and two penalty terms reflecting the available a priori information about the human brain anatomy. The developed local optimization strategy iteratively updates both the ROI and TAC parameters and is guaranteed to monotonically increase the objective function. The quantitative evaluation of the algorithm is performed with numerically and Monte Carlo-simulated dynamic PET brain data of the ¹¹C-raclopride and ¹⁸F-FDG tracers. The results demonstrate that the method outperforms the existing sequential ROI quantification approaches in terms of accuracy, and can noticeably reduce the errors in TACs arising due to the finite spatial resolution and ROI delineation.

  15. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.

    PubMed

    Theobald, Douglas L; Wuttke, Deborah S

    2006-09-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.

  16. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer E.; Aitchison, Lindsay

    2009-01-01

    A space suit's mobility is critical to an astronaut's ability to perform work efficiently. As mobility increases, the astronaut can perform tasks for longer durations with less fatigue. The term mobility, with respect to space suits, is defined in terms of two key components: joint range of motion and joint torque. Individually, these measures describe the path through which a joint travels and the force required to move it through that path. Previous space suit mobility requirements were defined as the collective result of these two measures and verified by the completion of discrete functional tasks. While a valid way to impose mobility requirements, such a method necessitates a solid understanding of the operational scenarios in which the final suit will be performing. Because the Constellation space suit system requirements are being finalized with a relatively immature concept of operations, the Space Suit Element team elected to define mobility in terms of its constituent parts to increase the likelihood that the future pressure garment will be mobile enough to enable a broad scope of undefined exploration activities. The range of motion requirements were defined by measuring the ranges of motion test subjects achieved while performing a series of joint-maximizing tasks in a variety of flight and prototype space suits. The definition of joint torque requirements has proved more elusive. NASA evaluated several different approaches to the problem before deciding to generate requirements based on unmanned joint torque evaluations of six different space suit configurations articulated through 16 separate joint movements. This paper discusses the experiment design, data analysis and results, and the process used to determine the final values for the Constellation pressure garment joint torque requirements.

  17. Joint Maximum Likelihood Time Delay Estimation of Unknown Event-Related Potential Signals for EEG Sensor Signal Quality Enhancement

    PubMed Central

    Kim, Kyungsoo; Lim, Sung-Ho; Lee, Jaeseok; Kang, Won-Seok; Moon, Cheil; Choi, Ji-Woong

    2016-01-01

    Electroencephalograms (EEGs) measure a brain signal that contains abundant information about the human brain function and health. For this reason, recent clinical brain research and brain computer interface (BCI) studies use EEG signals in many applications. Due to the significant noise in EEG traces, signal processing to enhance the signal to noise power ratio (SNR) is necessary for EEG analysis, especially for non-invasive EEG. A typical method to improve the SNR is averaging many trials of the event-related potential (ERP) signal that represents a brain's response to a particular stimulus or task. The averaging, however, is very sensitive to variable delays. In this study, we propose two time delay estimation (TDE) schemes based on a joint maximum likelihood (ML) criterion to compensate for the uncertain delays, which may differ from trial to trial. We evaluate the performance for different types of signals such as random, deterministic, and real EEG signals. The results show that the proposed schemes provide better performance than other conventional schemes employing the averaged signal as a reference, e.g., up to 4 dB gain at the expected delay error of 10°. PMID:27322267
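
    A common way to realize such joint delay estimation is to iterate between a template (the aligned average) and per-trial lags. The correlation-based Python sketch below is a simplified stand-in for the paper's joint ML criterion, not its exact scheme; all names are illustrative:

    ```python
    import numpy as np

    def align_trials(trials, max_lag=20, n_iter=5):
        """Iterate between a template (the aligned average) and per-trial
        lag estimates, assuming roughly circular shifts within +/- max_lag
        samples. Returns the estimated lags and the final template."""
        trials = [np.asarray(t, dtype=float) for t in trials]
        lags = np.zeros(len(trials), dtype=int)
        for _ in range(n_iter):
            template = np.mean([np.roll(t, -k) for t, k in zip(trials, lags)], axis=0)
            for i, t in enumerate(trials):
                lags[i] = max(range(-max_lag, max_lag + 1),
                              key=lambda k: float(np.dot(np.roll(t, -k), template)))
            lags -= int(round(lags.mean()))     # remove the common shift component
        return lags, template
    ```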

  18. Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…

  19. Climate reconstruction analysis using coexistence likelihood estimation (CRACLE): a method for the estimation of climate using vegetation.

    PubMed

    Harbert, Robert S; Nixon, Kevin C

    2015-08-01

    Plant distributions have long been understood to be correlated with the environmental conditions to which species are adapted. Climate is one of the major components driving species distributions. Therefore, it is expected that the plants coexisting in a community are reflective of the local environment, particularly climate. Presented here is a method for the estimation of climate from local plant species coexistence data. The method, Climate Reconstruction Analysis using Coexistence Likelihood Estimation (CRACLE), is a likelihood-based method that employs specimen collection data at a global scale for the inference of species climate tolerance. CRACLE calculates the maximum joint likelihood of coexistence given individual species climate tolerance characterizations to estimate the expected climate. Plant distribution data for more than 4000 species were used to show that this method accurately infers expected climate profiles for 165 sites with diverse climatic conditions. Estimates differ from the WorldClim global climate model by less than 1.5°C on average for mean annual temperature and less than ∼250 mm for mean annual precipitation. This is a significant improvement upon other plant-based climate-proxy methods. CRACLE validates long-hypothesized interactions between climate and local associations of plant species. Furthermore, CRACLE successfully estimates climate that is consistent with the widely used WorldClim model and therefore may be applied to the quantitative estimation of paleoclimate in future studies. © 2015 Botanical Society of America, Inc.
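
    A minimal numeric illustration of the coexistence-likelihood idea, with Gaussian tolerances assumed for simplicity and all species names and values invented:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Each coexisting species contributes an (assumed Gaussian) climate-
    # tolerance likelihood estimated from its occurrence records; the joint
    # ML climate maximizes the sum of the log-likelihoods.
    occurrences = {                      # mean annual temperature records (deg C)
        "sp_a": [12.1, 13.0, 14.2, 12.8],
        "sp_b": [13.5, 15.1, 14.4, 13.9],
        "sp_c": [11.8, 12.5, 13.3, 12.0],
    }
    grid = np.linspace(0, 30, 3001)      # candidate climate values
    loglik = np.zeros_like(grid)
    for recs in occurrences.values():
        loglik += norm.logpdf(grid, np.mean(recs), np.std(recs, ddof=1))
    print(grid[np.argmax(loglik)])       # joint ML climate estimate
    ```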

  20. Maximum likelihood estimation of signal-to-noise ratio and combiner weight

    NASA Technical Reports Server (NTRS)

    Kalson, S.; Dolinar, S. J.

    1986-01-01

    An algorithm for estimating signal to noise ratio and combiner weight parameters for a discrete time series is presented. The algorithm is based upon the joint maximum likelihood estimate of the signal and noise power. The discrete-time series are the sufficient statistics obtained after matched filtering of a biphase modulated signal in additive white Gaussian noise, before maximum likelihood decoding is performed.
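
    A compact version of the idea for matched-filter outputs y_i = ±m + n_i is sketched below; estimating the amplitude via mean(|y|) is a standard simplification valid at moderate-to-high SNR, not necessarily the exact estimator in the report:

    ```python
    import numpy as np

    def ml_snr(y):
        """Estimate signal amplitude and noise power from matched-filter
        outputs y_i = ±m + n_i (biphase modulation, AWGN). Taking |y|
        removes the unknown symbol signs; mean(|y|) approximates the ML
        amplitude at moderate-to-high SNR (it is biased upward at low SNR)."""
        m_hat = np.mean(np.abs(y))              # signal amplitude estimate
        n_hat = np.mean(y**2) - m_hat**2        # noise power estimate
        return m_hat**2 / n_hat                 # SNR = signal power / noise power

    rng = np.random.default_rng(1)
    bits = rng.choice([-1.0, 1.0], size=10000)
    y = 2.0 * bits + rng.normal(0.0, 1.0, size=10000)   # true SNR = 4
    print(ml_snr(y))                                    # close to 4
    ```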

  21. Analysis of the Effects of Normal Walking on Ankle Joint Contact Characteristics After Acute Inversion Ankle Sprain.

    PubMed

    Bae, Ji Yong; Park, Kyung Soon; Seon, Jong Keun; Jeon, Insu

    2015-12-01

    To show the causal relationship between normal walking after various lateral ankle ligament (LAL) injuries caused by acute inversion ankle sprains and alterations in ankle joint contact characteristics, finite element simulations of normal walking were carried out using an intact ankle joint model and LAL injury models. A walking experiment using a volunteer with a normal ankle joint was performed to obtain the boundary conditions for the simulations and to support the appropriateness of the simulation results. Contact pressure and strain on the talus articular cartilage and anteroposterior and mediolateral translations of the talus were calculated. Ankles with ruptured anterior talofibular ligaments (ATFLs) had a higher likelihood of experiencing increased ankle joint contact pressures, strains and translations than intact ankles. In particular, ankles with ruptured ATFL + calcaneofibular ligaments, and ankles with all ligaments ruptured, had a likelihood similar to that of the ATFL-ruptured ankles. The push-off stance phase was the most likely situation for increased ankle joint contact pressures, strains and translations in LAL-injured ankles.

  22. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified, and results are provided numerically as well as in color-coded text to draw the user's attention. The three key features of the software are robust technical content, innovative and user friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
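
    Two of the textbook relations a tool like this automates are sketched below in Python; the nut-factor relation and the margin-of-safety form are standard, but the numbers are invented and comBAT's actual equations and factors are not reproduced here:

    ```python
    def preload_from_torque(T, K, d):
        """Bolt preload F (N) from assembly torque T (N*m), nut factor K,
        and nominal diameter d (m), via the short-form relation T = K*d*F."""
        return T / (K * d)

    def margin_of_safety(allowable, applied, factor_of_safety=1.4):
        """MS = allowable / (FS * applied) - 1; MS >= 0 passes the check."""
        return allowable / (factor_of_safety * applied) - 1.0

    F = preload_from_torque(T=30.0, K=0.2, d=0.006)        # 25 kN preload
    print(F, margin_of_safety(allowable=40e3, applied=F))  # ~0.14 margin
    ```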

  23. A joint frailty-copula model between tumour progression and death for meta-analysis.

    PubMed

    Emura, Takeshi; Nakatochi, Masahiro; Murotani, Kenta; Rondeau, Virginie

    2017-12-01

    Dependent censoring often arises in biomedical studies when time to tumour progression (e.g., relapse of cancer) is censored by an informative terminal event (e.g., death). For meta-analysis combining existing studies, a joint survival model between tumour progression and death has been considered under semicompeting risks, which induces dependence through the study-specific frailty. Our paper here utilizes copulas to generalize the joint frailty model by introducing an additional source of dependence arising from intra-subject association between tumour progression and death. The practical value of the new model is particularly evident for meta-analyses in which only a few covariates are consistently measured across studies and hence residual dependence exists. The covariate effects are formulated through the Cox proportional hazards model, and the baseline hazards are nonparametrically modeled on a basis of splines. The estimator is then obtained by maximizing a penalized log-likelihood function. We also show that the present methodologies are easily modified for competing risks or recurrent event data, and are generalized to accommodate left-truncation. Simulations are performed to examine the performance of the proposed estimator. The method is applied to a meta-analysis assessing the recently suggested biomarker CXCL12 for survival in ovarian cancer patients. We implement our proposed methods in the R joint.Cox package.

  24. Exact likelihood evaluations and foreground marginalization in low resolution WMAP data

    NASA Astrophysics Data System (ADS)

    Slosar, Anže; Seljak, Uroš; Makarov, Alexey

    2004-06-01

    The large scale anisotropies of Wilkinson Microwave Anisotropy Probe (WMAP) data have attracted a lot of attention and have been a source of controversy, with many favorite cosmological models being apparently disfavored by the power spectrum estimates at low l. All the existing analyses of theoretical models are based on approximations for the likelihood function, which are likely to be inaccurate on large scales. Here we present exact evaluations of the likelihood of the low multipoles by direct inversion of the theoretical covariance matrix for low resolution WMAP maps. We project out the unwanted galactic contaminants using the WMAP derived maps of these foregrounds. This improves over the template based foreground subtraction used in the original analysis, which can remove some of the cosmological signal and may lead to a suppression of power. As a result we find an increase in power at low multipoles. For the quadrupole the maximum likelihood values are rather uncertain and vary between 140 and 220 μK². On the other hand, the probability distribution away from the peak is robust and, assuming a uniform prior between 0 and 2000 μK², the probability of having the true value above 1200 μK² (as predicted by the simplest cold dark matter model with a cosmological constant) is 10%, a factor of 2.5 higher than predicted by the WMAP likelihood code. We do not find the correlation function to be unusual beyond the low quadrupole value. We develop a fast likelihood evaluation routine that can be used instead of WMAP routines for low l values. We apply it to the Markov chain Monte Carlo analysis to compare the cosmological parameters between the two cases. The new analysis of WMAP either alone or jointly with the Sloan Digital Sky Survey (SDSS) and the Very Small Array (VSA) data reduces the evidence for running to less than 1σ, giving α_s = -0.022±0.033 for the combined case. The new analysis prefers about a 1σ lower value of Ω_m, a consequence of an increased…

  25. Transfer Entropy as a Log-Likelihood Ratio

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ² distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
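
    In the Gaussian (Granger) case the statement has a direct realization: the transfer entropy is half the log ratio of residual variances of the restricted and full regressions, i.e. the per-sample log-likelihood ratio. A minimal sketch, with illustrative names:

    ```python
    import numpy as np

    def gaussian_te(x, y, p=1):
        """Transfer entropy y -> x for jointly Gaussian AR processes of
        order p: TE = 0.5 * log(var_restricted / var_full), which equals
        half the per-sample log-likelihood ratio of the nested regressions."""
        n = len(x)
        X_past = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
        Y_past = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
        target = x[p:]
        def resid_var(design):
            design = np.column_stack([np.ones(len(target)), design])
            beta, *_ = np.linalg.lstsq(design, target, rcond=None)
            return np.var(target - design @ beta)
        return 0.5 * np.log(resid_var(X_past) /
                            resid_var(np.hstack([X_past, Y_past])))
    ```

    Under the null of zero transfer entropy, 2n times this estimate is the asymptotically χ²-distributed likelihood-ratio statistic referred to in the abstract.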

  26. Transfer entropy as a log-likelihood ratio.

    PubMed

    Barnett, Lionel; Bossomaier, Terry

    2012-09-28

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ² distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.

  27. Likelihood analysis of spatial capture-recapture models for stratified or class structured populations

    USGS Publications Warehouse

    Royle, J. Andrew; Sutherland, Christopher S.; Fuller, Angela K.; Sun, Catherine C.

    2015-01-01

    We develop a likelihood analysis framework for fitting spatial capture-recapture (SCR) models to data collected on class structured or stratified populations. Our interest is motivated by the necessity of accommodating the problem of missing observations of individual class membership. This is particularly problematic in SCR data arising from DNA analysis of scat, hair or other material, which frequently yields individual identity but fails to identify the sex. Moreover, this can represent a large fraction of the data and, given the typically small sample sizes of many capture-recapture studies based on DNA information, utilization of the data with missing sex information is necessary. We develop the class structured likelihood for the case of missing covariate values, and then we address the scaling of the likelihood so that models with and without class structured parameters can be formally compared regardless of missing values. We apply our class structured model to black bear data collected in New York in which sex could be determined for only 62 of 169 uniquely identified individuals. The models containing sex-specificity of both the intercept of the SCR encounter probability model and the distance coefficient, and including a behavioral response are strongly favored by log-likelihood. Estimated population sex ratio is strongly influenced by sex structure in model parameters illustrating the importance of rigorous modeling of sex differences in capture-recapture models.
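
    Generically, the device for missing class membership is a finite mixture over the possible classes, written below in schematic form rather than the paper's exact notation:

    ```latex
    % Likelihood contribution of individual i when class (e.g., sex) is
    % unobserved: a mixture over classes s with membership probabilities
    % \pi_s; individuals with observed class s_i contribute
    % L_i(\theta \mid s_i) directly.
    L_i(\theta, \pi) = \sum_{s} \pi_s \, L_i(\theta \mid s)
    ```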

  28. A likelihood framework for joint estimation of salmon abundance and migratory timing using telemetric mark-recapture

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Gates, Kenneth S.; Palmer, Douglas E.

    2010-01-01

    Many fisheries for Pacific salmon Oncorhynchus spp. are actively managed to meet escapement goal objectives. In fisheries where the demand for surplus production is high, an extensive assessment program is needed to achieve the opposing objectives of allowing adequate escapement and fully exploiting the available surplus. Knowledge of abundance is a critical element of such assessment programs. Abundance estimation using mark-recapture experiments in combination with telemetry has become common in recent years, particularly within Alaskan river systems. Fish are typically captured and marked in the lower river while migrating in aggregations of individuals from multiple populations. Recapture data are obtained using telemetry receivers that are co-located with abundance assessment projects near spawning areas, which provide large sample sizes and information on population-specific mark rates. When recapture data are obtained from multiple populations, unequal mark rates may reflect a violation of the assumption of homogeneous capture probabilities. A common analytical strategy is to test the hypothesis that mark rates are homogeneous and combine all recapture data if the test is not significant. However, mark rates are often low, and a test of homogeneity may lack sufficient power to detect meaningful differences among populations. In addition, differences among mark rates may provide information that could be exploited during parameter estimation. We present a temporally stratified mark-recapture model that permits capture probabilities and migratory timing through the capture area to vary among strata. Abundance information obtained from a subset of populations after the populations have segregated for spawning is jointly modeled with telemetry distribution data by use of a likelihood function. Maximization of the likelihood produces estimates of the abundance and timing of individual populations migrating through the capture area, thus yielding…

  29. Joint effect of MCP-1 genotype GG and MMP-1 genotype 2G/2G increases the likelihood of developing pulmonary tuberculosis in BCG-vaccinated individuals.

    PubMed

    Ganachari, Malathesha; Ruiz-Morales, Jorge A; Gomez de la Torre Pretell, Juan C; Dinh, Jeffrey; Granados, Julio; Flores-Villanueva, Pedro O

    2010-01-25

    We previously reported that the -2518 MCP-1 genotype GG increases the likelihood of developing tuberculosis (TB) in non-BCG-vaccinated Mexicans and Koreans. Here, we tested the hypothesis that this genotype, alone or together with the -1607 MMP-1 functional polymorphism, increases the likelihood of developing TB in BCG-vaccinated individuals. We conducted population-based case-control studies of BCG-vaccinated individuals in Mexico and Peru that included 193 TB cases and 243 healthy tuberculin-positive controls from Mexico and 701 TB cases and 796 controls from Peru. We also performed immunohistochemistry (IHC) analysis of lymph nodes from carriers of relevant two-locus genotypes and in vitro studies to determine how these variants may operate to increase the risk of developing active disease. We report that a joint effect between the -2518 MCP-1 genotype GG and the -1607 MMP-1 genotype 2G/2G consistently increases the odds of developing TB 3.59-fold in Mexicans and 3.9-fold in Peruvians. IHC analysis of lymph nodes indicated that carriers of the two-locus genotype MCP-1 GG MMP-1 2G/2G express the highest levels of both MCP-1 and MMP-1. Carriers of these susceptibility genotypes might be at increased risk of developing TB because they produce high levels of MCP-1, which enhances the induction of MMP-1 production by M. tuberculosis-sonicate antigens to higher levels than in carriers of the other two-locus MCP-1 MMP-1 genotypes studied. This notion was supported by in vitro experiments and luciferase based promoter activity assay. MMP-1 may destabilize granuloma formation and promote tissue damage and disease progression early in the infection. Our findings may foster the development of new and personalized therapeutic approaches targeting MCP-1 and/or MMP-1.

  30. Joint Effect of MCP-1 Genotype GG and MMP-1 Genotype 2G/2G Increases the Likelihood of Developing Pulmonary Tuberculosis in BCG-Vaccinated Individuals

    PubMed Central

    Ganachari, Malathesha; Ruiz-Morales, Jorge A.; Gomez de la Torre Pretell, Juan C.; Dinh, Jeffrey; Granados, Julio; Flores-Villanueva, Pedro O.

    2010-01-01

    We previously reported that the -2518 MCP-1 genotype GG increases the likelihood of developing tuberculosis (TB) in non-BCG-vaccinated Mexicans and Koreans. Here, we tested the hypothesis that this genotype, alone or together with the -1607 MMP-1 functional polymorphism, increases the likelihood of developing TB in BCG-vaccinated individuals. We conducted population-based case-control studies of BCG-vaccinated individuals in Mexico and Peru that included 193 TB cases and 243 healthy tuberculin-positive controls from Mexico and 701 TB cases and 796 controls from Peru. We also performed immunohistochemistry (IHC) analysis of lymph nodes from carriers of relevant two-locus genotypes and in vitro studies to determine how these variants may operate to increase the risk of developing active disease. We report that a joint effect between the -2518 MCP-1 genotype GG and the -1607 MMP-1 genotype 2G/2G consistently increases the odds of developing TB 3.59-fold in Mexicans and 3.9-fold in Peruvians. IHC analysis of lymph nodes indicated that carriers of the two-locus genotype MCP-1 GG MMP-1 2G/2G express the highest levels of both MCP-1 and MMP-1. Carriers of these susceptibility genotypes might be at increased risk of developing TB because they produce high levels of MCP-1, which enhances the induction of MMP-1 production by M. tuberculosis-sonicate antigens to higher levels than in carriers of the other two-locus MCP-1 MMP-1 genotypes studied. This notion was supported by in vitro experiments and luciferase based promoter activity assay. MMP-1 may destabilize granuloma formation and promote tissue damage and disease progression early in the infection. Our findings may foster the development of new and personalized therapeutic approaches targeting MCP-1 and/or MMP-1. PMID:20111728

  31. Combining classifiers using their receiver operating characteristics and maximum likelihood estimation.

    PubMed

    Haker, Steven; Wells, William M; Warfield, Simon K; Talos, Ion-Florin; Bhagwat, Jui G; Goldberg-Zimring, Daniel; Mian, Asim; Ohno-Machado, Lucila; Zou, Kelly H

    2005-01-01

    In any medical domain, it is common to have more than one test (classifier) to diagnose a disease. In image analysis, for example, there is often more than one reader or more than one algorithm applied to a certain data set. Combining classifiers is often helpful, but determining the way in which classifiers should be combined is not trivial. Standard strategies are based on learning classifier combination functions from data. We describe a simple strategy to combine results from classifiers that have not been applied to a common data set, and therefore cannot undergo this type of joint training. The strategy, which assumes conditional independence of classifiers, is based on the calculation of a combined Receiver Operating Characteristic (ROC) curve, using maximum likelihood analysis to determine a combination rule for each ROC operating point. We offer some insights into the use of ROC analysis in the field of medical imaging.
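
    Under the stated conditional-independence assumption, combining decisions made at known ROC operating points reduces to multiplying per-classifier likelihood ratios. The sketch below shows that arithmetic as a schematic reading of the strategy, not the authors' code:

    ```python
    def combined_posterior_odds(prior_odds, calls, points):
        """Combine binary classifier outputs assuming conditional independence.

        calls:  list of 0/1 decisions, one per classifier
        points: list of (tpr, fpr) ROC operating points, one per classifier
        A positive call contributes LR+ = tpr/fpr, a negative call
        LR- = (1-tpr)/(1-fpr); independence lets the ratios multiply.
        """
        odds = prior_odds
        for call, (tpr, fpr) in zip(calls, points):
            odds *= (tpr / fpr) if call else ((1 - tpr) / (1 - fpr))
        return odds

    # Two readers at different operating points, both calling "disease":
    print(combined_posterior_odds(0.1, [1, 1], [(0.9, 0.2), (0.8, 0.1)]))
    ```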

  32. Combining Classifiers Using Their Receiver Operating Characteristics and Maximum Likelihood Estimation

    PubMed Central

    Haker, Steven; Wells, William M.; Warfield, Simon K.; Talos, Ion-Florin; Bhagwat, Jui G.; Goldberg-Zimring, Daniel; Mian, Asim; Ohno-Machado, Lucila; Zou, Kelly H.

    2010-01-01

    In any medical domain, it is common to have more than one test (classifier) to diagnose a disease. In image analysis, for example, there is often more than one reader or more than one algorithm applied to a certain data set. Combining classifiers is often helpful, but determining the way in which classifiers should be combined is not trivial. Standard strategies are based on learning classifier combination functions from data. We describe a simple strategy to combine results from classifiers that have not been applied to a common data set, and therefore cannot undergo this type of joint training. The strategy, which assumes conditional independence of classifiers, is based on the calculation of a combined Receiver Operating Characteristic (ROC) curve, using maximum likelihood analysis to determine a combination rule for each ROC operating point. We offer some insights into the use of ROC analysis in the field of medical imaging. PMID:16685884

  33. Maximum Likelihood and Restricted Likelihood Solutions in Multiple-Method Studies.

    PubMed

    Rukhin, Andrew L

    2011-01-01

    A formulation of the problem of combining data from several sources is discussed in terms of random effects models. The unknown measurement precision is assumed not to be the same for all methods. We investigate maximum likelihood solutions in this model. By representing the likelihood equations as simultaneous polynomial equations, the exact form of the Groebner basis for their stationary points is derived when there are two methods. A parametrization of these solutions which allows their comparison is suggested. A numerical method for solving likelihood equations is outlined, and an alternative to the maximum likelihood method, the restricted maximum likelihood, is studied. In the situation when the method variances are considered known, an upper bound on the between-method variance is obtained. The relationship between likelihood equations and moment-type equations is also discussed.

  34. Geometrically nonlinear analysis of adhesively bonded joints

    NASA Technical Reports Server (NTRS)

    Dattaguru, B.; Everett, R. A., Jr.; Whitcomb, J. D.; Johnson, W. S.

    1982-01-01

    A geometrically nonlinear finite element analysis of cohesive failure in typical joints is presented. Cracked-lap-shear joints were chosen for analysis. Results obtained from linear and nonlinear analysis show that nonlinear effects, due to large rotations, significantly affect the calculated mode 1 (crack opening) and mode 2 (in-plane shear) strain-energy-release rates. The ratio of the mode 1 to mode 2 strain-energy-release rates (G1/G2) was found to be strongly affected by the adhesive modulus and the adherend thickness. Ratios between 0.2 and 0.8 can be obtained by varying adherend thickness and using either a single or double cracked-lap-shear specimen configuration. Debond growth rate data, together with the analysis, indicate that the mode 1 strain-energy-release rate governs debond growth. Results from the present analysis agree well with experimentally measured joint opening displacements.

  35. Maximum Likelihood and Restricted Likelihood Solutions in Multiple-Method Studies

    PubMed Central

    Rukhin, Andrew L.

    2011-01-01

    A formulation of the problem of combining data from several sources is discussed in terms of random effects models. The unknown measurement precision is assumed not to be the same for all methods. We investigate maximum likelihood solutions in this model. By representing the likelihood equations as simultaneous polynomial equations, the exact form of the Groebner basis for their stationary points is derived when there are two methods. A parametrization of these solutions which allows their comparison is suggested. A numerical method for solving likelihood equations is outlined, and an alternative to the maximum likelihood method, the restricted maximum likelihood, is studied. In the situation when the method variances are considered known, an upper bound on the between-method variance is obtained. The relationship between likelihood equations and moment-type equations is also discussed. PMID:26989583

  36. Elasto-Plastic Analysis of Tee Joints Using HOT-SMAC

    NASA Technical Reports Server (NTRS)

    Arnold, Steve M. (Technical Monitor); Bednarcyk, Brett A.; Yarrington, Phillip W.

    2004-01-01

    The Higher Order Theory - Structural/Micro Analysis Code (HOT-SMAC) software package is applied to analyze the linearly elastic and elasto-plastic response of adhesively bonded tee joints. Joints of this type are finding an increasing number of applications with the increased use of composite materials within advanced aerospace vehicles, and improved tools for the design and analysis of these joints are needed. The linearly elastic results of the code are validated vs. finite element analysis results from the literature under different loading and boundary conditions, and new results are generated to investigate the inelastic behavior of the tee joint. The comparison with the finite element results indicates that HOT-SMAC is an efficient and accurate alternative to the finite element method and has a great deal of potential as an analysis tool for a wide range of bonded joints.

  37. Double symbolic joint entropy in nonlinear dynamic complexity analysis

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-07-01

    Symbolization, the basis of symbolic dynamic analysis, can be classified into global static and local dynamic approaches, which we combine via joint entropy for nonlinear dynamic complexity analysis. Two global static methods, the symbolic transformations of Wessel N. symbolic entropy and of base-scale entropy, and two local dynamic ones, the symbolizations of permutation and of differential entropy, constitute four double symbolic joint entropies that detect complexity accurately in chaotic models (logistic and Henon map series). In nonlinear dynamical analysis of different kinds of heart rate variability, heartbeats of the healthy young have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values. Each individual symbolic entropy is improved by double symbolic joint entropy, with the combination of base-scale and differential symbolization giving the best complexity analysis. Test results show that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.

  38. A Study of Item Bias for Attitudinal Measurement Using Maximum Likelihood Factor Analysis.

    ERIC Educational Resources Information Center

    Mayberry, Paul W.

    One technique for detecting item bias that is responsive to attitudinal measurement considerations is a maximum likelihood factor analysis procedure, often referred to as SIFASP, that compares multivariate factor structures across various subpopulations. The SIFASP technique allows for factorial model comparisons in the testing of various hypotheses…

  39. Design, Static Analysis And Fabrication Of Composite Joints

    NASA Astrophysics Data System (ADS)

    Mathiselvan, G.; Gobinath, R.; Yuvaraja, S.; Raja, T.

    2017-05-01

    Bonded joints are an important issue in composite technology, notably for the repair of aging aircraft. In such applications, and for joining composite parts in general, the parts are fastened together using either adhesives or mechanical fasteners. In this paper, we carried out the design, static analysis of 3-D models, and fabrication of composite joints (bonded, riveted and hybrid). The 3-D models of the composite structures were fabricated using epoxy resin, glass fibre, and aluminium rivets. The static analysis of the different joints was carried out using ANSYS software. After fabrication, a parametric study was conducted to compare the performance of the hybrid joint with varying adherend width, adhesive thickness and overlap length. Tensile test results for the different joints and materials were compared.

  20. Neandertal admixture in Eurasia confirmed by maximum-likelihood analysis of three genomes.

    PubMed

    Lohse, Konrad; Frantz, Laurent A F

    2014-04-01

    Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4-7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination.

  1. A composite likelihood approach for spatially correlated survival data

    PubMed Central

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory. PMID:24223450
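
    A minimal sketch of the pairwise composite likelihood under the FGM dependence model follows, assuming uncensored exponential margins and a dependence parameter that decays with pairwise geographic distance. The parameter names (lam, a0, rho) and the exponential decay form are illustrative assumptions; the paper additionally handles censoring and demographic distances.

        import numpy as np
        from scipy.optimize import minimize

        def fgm_density(u, v, a):
            # FGM copula density: c(u, v) = 1 + a*(1 - 2u)*(1 - 2v), with |a| <= 1.
            return 1.0 + a * (1.0 - 2.0 * u) * (1.0 - 2.0 * v)

        def neg_pairwise_cl(params, times, dist):
            """Negative pairwise composite log-likelihood with exponential margins
            and dependence decaying with distance: a_ij = a0 * exp(-d_ij / rho)."""
            lam = np.exp(params[0])          # marginal rate, kept positive
            a0 = np.tanh(params[1])          # dependence level, kept in (-1, 1)
            rho = np.exp(params[2])          # decay range, kept positive
            F = 1.0 - np.exp(-lam * times)   # marginal CDF values
            logf = np.log(lam) - lam * times # marginal log-densities
            n, cl = len(times), 0.0
            for i in range(n):
                for j in range(i + 1, n):
                    a_ij = a0 * np.exp(-dist[i, j] / rho)
                    cl += logf[i] + logf[j] + np.log(fgm_density(F[i], F[j], a_ij))
            return -cl

        # Toy data: 30 uncensored event times at random plane locations.
        rng = np.random.default_rng(0)
        xy = rng.uniform(0, 10, size=(30, 2))
        dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        times = rng.exponential(1.0, size=30)
        fit = minimize(neg_pairwise_cl, x0=[0.0, 0.0, 0.0], args=(times, dist))
        print(fit.x)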

  2. A composite likelihood approach for spatially correlated survival data.

    PubMed

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory.

  3. Analysis of a Preloaded Bolted Joint in a Ceramic Composite Combustor

    NASA Technical Reports Server (NTRS)

    Hissam, D. Andy; Bower, Mark V.

    2003-01-01

    This paper presents the detailed analysis of a preloaded bolted joint incorporating ceramic materials. The objective of this analysis is to determine the suitability of a joint design for a ceramic combustor. The analysis addresses critical factors in bolted joint design including preload, preload uncertainty, and load factor. The relationship between key joint variables is also investigated. The analysis is based on four key design criteria, each addressing an anticipated failure mode. The criteria are defined in terms of margin of safety, which must be greater than zero for the design criteria to be satisfied. Since the proposed joint has positive margins of safety, the design criteria are satisfied. Therefore, the joint design is acceptable.
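
    The margin-of-safety bookkeeping described above can be sketched as follows; all numerical values, including the ultimate factor of safety, are illustrative assumptions rather than values from the combustor joint analysis.

        def margin_of_safety(allowable, applied, factor_of_safety):
            """MS = allowable / (applied * FS) - 1; the design criterion is MS > 0."""
            return allowable / (applied * factor_of_safety) - 1.0

        preload_nom = 8000.0   # N, nominal bolt preload (hypothetical)
        uncertainty = 0.25     # +/-25% preload uncertainty, e.g. torque-wrench scatter
        external = 3000.0      # N, applied joint load (hypothetical)
        load_factor = 0.3      # fraction of the external load carried by the bolt

        # Worst case for bolt strength: maximum preload plus the bolt's share of load.
        bolt_load_max = preload_nom * (1 + uncertainty) + load_factor * external
        ms_bolt = margin_of_safety(allowable=15000.0, applied=bolt_load_max,
                                   factor_of_safety=1.4)
        print(f"bolt ultimate MS = {ms_bolt:+.2f} ({'PASS' if ms_bolt > 0 else 'FAIL'})")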

  4. Evaluating marginal likelihood with thermodynamic integration method and comparison with several other numerical methods

    DOE PAGES

    Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; ...

    2016-02-05

    Evaluating the marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood, or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, thermodynamic integration, which has not previously been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through path sampling, which conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing it with two variants of the Laplace approximation method and three MC methods, including the nested sampling method recently introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in accuracy, convergence, and consistency. It is also applied to a synthetic groundwater modeling case with four alternative models; the application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. The thermodynamic integration method is thus mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
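
    A minimal sketch of thermodynamic integration follows, for a conjugate normal model whose tempered (power) posteriors can be sampled exactly, so the exact log marginal likelihood is available for comparison. In a real environmental model the per-temperature expectations would come from MCMC, as the study describes; the data and prior here are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)
        y = rng.normal(0.5, 1.0, size=20)   # data: y_i ~ N(theta, 1)
        n, tau2 = y.size, 1.0               # prior: theta ~ N(0, tau2)

        def loglik(theta):
            # Joint log-likelihood of all n observations for each sampled theta.
            return (-0.5 * n * np.log(2 * np.pi)
                    - 0.5 * np.sum((y[:, None] - theta) ** 2, axis=0))

        # Power-posterior path: p_beta(theta) propto L(theta)^beta * prior(theta).
        # For this conjugate toy model each tempered posterior is normal, so we
        # sample it exactly instead of running MCMC.
        betas = np.linspace(0.0, 1.0, 21)
        expected_loglik = []
        for b in betas:
            prec = 1.0 / tau2 + b * n
            mean = b * y.sum() / prec
            theta = rng.normal(mean, 1.0 / np.sqrt(prec), size=5000)
            expected_loglik.append(loglik(theta).mean())

        # Thermodynamic integration: log m(y) = integral over beta of E_beta[log L],
        # evaluated here with the trapezoid rule.
        e = np.array(expected_loglik)
        log_marglik_ti = np.sum(np.diff(betas) * (e[1:] + e[:-1]) / 2.0)

        # Exact answer for comparison: y ~ N(0, I + tau2 * 11^T).
        S = np.eye(n) + tau2 * np.ones((n, n))
        _, logdet = np.linalg.slogdet(S)
        log_marglik_exact = -0.5 * (n * np.log(2 * np.pi) + logdet
                                    + y @ np.linalg.solve(S, y))
        print(log_marglik_ti, log_marglik_exact)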

  5. Likelihood ratio meta-analysis: New motivation and approach for an old method.

    PubMed

    Dormuth, Colin R; Filion, Kristian B; Platt, Robert W

    2016-03-01

    A 95% confidence interval (CI) in an updated meta-analysis may not have the expected 95% coverage. If a meta-analysis is simply updated with additional data, then the resulting 95% CI will be wrong because it will not have accounted for the fact that the earlier meta-analysis failed or succeeded in excluding the null. This situation can be avoided by using the likelihood ratio (LR) as a measure of evidence that does not depend on type-1 error. We show how an LR-based approach, first advanced by Goodman, can be used in a meta-analysis to pool data from separate studies to quantitatively assess where the total evidence points. The method works by estimating the log-likelihood ratio (LogLR) function from each study; those functions are then summed to obtain a combined function, which is used to retrieve the total effect estimate and a corresponding 'intrinsic' confidence interval. Using as illustrations the CAPRIE trial of clopidogrel versus aspirin in the prevention of ischemic events, and our own meta-analysis of higher-potency statins and the risk of acute kidney injury, we show that the LR-based method yields the same point estimate as the traditional analysis, but with an intrinsic confidence interval that is appropriately wider than the traditional 95% CI. The LR-based method can be used to conduct both fixed-effect and random-effects meta-analyses, it can be applied to old and new meta-analyses alike, and results can be presented in a format that is familiar to a meta-analytic audience. Copyright © 2016 Elsevier Inc. All rights reserved.
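
    The mechanics of the LR-based pooling can be sketched with normal approximations to each study's log-likelihood: the per-study curves are summed, the maximizer gives the pooled estimate, and a pure-likelihood interval is read off at a fixed drop from the maximum. The study estimates and the 1/32 cutoff below are illustrative assumptions, not values from the cited analyses.

        import numpy as np

        # Per-study normal approximation to the log-likelihood of the effect theta:
        # logL_k(theta) = -(theta - b_k)^2 / (2 * s_k^2), up to a constant.
        b = np.array([-0.12, 0.05, -0.30])   # hypothetical study effect estimates
        s = np.array([0.10, 0.15, 0.20])     # their standard errors

        theta = np.linspace(-0.6, 0.4, 2001)
        log_lik = -((theta[None, :] - b[:, None]) ** 2) / (2.0 * s[:, None] ** 2)
        combined = log_lik.sum(axis=0)       # summed log-likelihood functions

        best = theta[np.argmax(combined)]    # pooled point estimate
        # "Intrinsic" interval: theta values whose combined likelihood is within a
        # factor of 1/32 of the maximum (one conventional pure-likelihood cutoff).
        inside = combined >= combined.max() - np.log(32.0)
        print(best, theta[inside][0], theta[inside][-1])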

  6. Structural analysis of Aircraft fuselage splice joint

    NASA Astrophysics Data System (ADS)

    Udaya Prakash, R.; Kumar, G. Raj; Vijayanandh, R.; Senthil Kumar, M.; Ramganesh, T.

    2016-09-01

    In the aviation sector, composite materials and their application to each component are a prime consideration: their high strength-to-weight ratio, design flexibility, and corrosion resistance mean that composites are widely used in lightweight constructions and can be treated as a suitable alternative to metals. The objective of this paper is to estimate and compare the suitability of a composite skin joint in an aircraft fuselage with different joint types by simulating the displacement, normal stress, von Mises stress, and shear stress with the help of numerical solution methods. The reference Z-stringer component is modeled in CATIA, and numerical simulation of the splice joint in the aircraft fuselage is carried out in ANSYS for three combinations of joints: riveted, bonded, and hybrid. Stringers are used to avoid buckling of the fuselage skin; they are joined to the skin by rivets and connected end to end by splice joints. Design and static analysis of three-dimensional models of the bonded, riveted, and hybrid joints are carried out and the results compared.

  7. Pseudo and conditional score approach to joint analysis of current count and current status data.

    PubMed

    Wen, Chi-Chung; Chen, Yi-Hau

    2018-04-17

    We develop a joint analysis approach for recurrent and nonrecurrent event processes subject to case I interval censorship, which are known in the literature as current count and current status data, respectively. We use a shared frailty to link the recurrent and nonrecurrent event processes, while leaving the distribution of the frailty fully unspecified. Conditional on the frailty, the recurrent event is assumed to follow a nonhomogeneous Poisson process, and the mean function of the recurrent event and the survival function of the nonrecurrent event are assumed to follow some general form of semiparametric transformation models. Estimation of the models is based on the pseudo-likelihood and the conditional score techniques. The resulting estimators for the regression parameters and the unspecified baseline functions are shown to be consistent, with rates of the square and cubic roots of the sample size, respectively. Asymptotic normality with closed-form asymptotic variance is derived for the estimator of the regression parameters. We apply the proposed method to a fracture-osteoporosis survey to identify risk factors jointly for fracture and osteoporosis in the elderly, while accounting for the association between the two events within a subject. © 2018, The International Biometric Society.

  8. Robust joint score tests in the application of DNA methylation data analysis.

    PubMed

    Li, Xuan; Fu, Yuejiao; Wang, Xiaogang; Qiu, Weiliang

    2018-05-18

    Differential variability has recently been shown to be valuable in evaluating the association of DNA methylation with the risks of complex human diseases. Statistical tests based on both differential methylation level and differential variability can be more powerful than those based on differential methylation level alone. Ahn and Wang (2013) proposed a joint score test (AW) to detect differential methylation and differential variability simultaneously. However, the AW method appears quite conservative and has not been fully compared with existing joint tests. We propose three improved joint score tests, namely iAW.Lev, iAW.BF, and iAW.TM, and make extensive comparisons with the joint likelihood ratio test (jointLRT), the Kolmogorov-Smirnov (KS) test, and the AW test. Systematic simulation studies showed that: (1) the three improved tests performed better (i.e., larger power while keeping nominal Type I error rates) than the other three tests for data with outliers and with different variances between cases and controls; and (2) for data from normal distributions, the three improved tests had slightly lower power than jointLRT and AW. Analyses of two Illumina HumanMethylation27 data sets (GSE37020 and GSE20080) and one Illumina Infinium MethylationEPIC data set (GSE107080) demonstrated that the three improved tests had higher true validation rates than jointLRT, KS, and AW. The three proposed joint score tests are robust against violation of the normality assumption and the presence of outlying observations in comparison with the other three existing tests. Among the three proposed tests, iAW.BF appears to be the most robust and effective across all simulated scenarios and also in the real data analyses.

  9. Likelihood-Based Random-Effect Meta-Analysis of Binary Events.

    PubMed

    Amatya, Anup; Bhaumik, Dulal K; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D

    2015-01-01

    Meta-analysis has been used extensively for evaluation of efficacy and safety of medical interventions. Its advantages and utilities are well known. However, recent studies have raised questions about the accuracy of the commonly used moment-based meta-analytic methods in general and for rare binary outcomes in particular. The issue is further complicated for studies with heterogeneous effect sizes. Likelihood-based mixed-effects modeling provides an alternative to moment-based methods such as inverse-variance weighted fixed- and random-effects estimators. In this article, we compare and contrast different mixed-effect modeling strategies in the context of meta-analysis. Their performance in estimation and testing of overall effect and heterogeneity are evaluated when combining results from studies with a binary outcome. Models that allow heterogeneity in both baseline rate and treatment effect across studies have low type I and type II error rates, and their estimates are the least biased among the models considered.

  10. Assessing compatibility of direct detection data: halo-independent global likelihood analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gelmini, Graciela B.; Huh, Ji-Haeng; Witte, Samuel J.

    2016-10-18

    We present two different halo-independent methods to assess the compatibility of several direct dark matter detection data sets for a given dark matter model using a global likelihood consisting of at least one extended likelihood and an arbitrary number of Gaussian or Poisson likelihoods. In the first method we find the global best-fit halo function (we prove that it is a unique piecewise constant function with a number of down steps smaller than or equal to a maximum number that we compute) and construct a two-sided pointwise confidence band at any desired confidence level, which can then be compared with those derived from the extended likelihood alone to assess the joint compatibility of the data. In the second method we define a "constrained parameter goodness-of-fit" test statistic, whose p-value we then use to define a "plausibility region" (e.g. where p ≥ 10%). For any halo function not entirely contained within the plausibility region, the level of compatibility of the data is very low (e.g. p < 10%). We illustrate these methods by applying them to CDMS-II-Si and SuperCDMS data, assuming dark matter particles with elastic spin-independent isospin-conserving interactions or exothermic spin-independent isospin-violating interactions.

  11. Quantum-state reconstruction by maximizing likelihood and entropy.

    PubMed

    Teo, Yong Siah; Zhu, Huangjun; Englert, Berthold-Georg; Řeháček, Jaroslav; Hradil, Zdeněk

    2011-07-08

    Quantum-state reconstruction on a finite number of copies of a quantum system with informationally incomplete measurements, as a rule, does not yield a unique result. We derive a reconstruction scheme where both the likelihood and the von Neumann entropy functionals are maximized in order to systematically select the most-likely estimator with the largest entropy, that is, the least-bias estimator, consistent with a given set of measurement data. This is equivalent to the joint consideration of our partial knowledge and ignorance about the ensemble to reconstruct its identity. An interesting structure of such estimators will also be explored.

  12. Robust Multi-Frame Adaptive Optics Image Restoration Algorithm Using Maximum Likelihood Estimation with Poisson Statistics.

    PubMed

    Li, Dongming; Sun, Changming; Yang, Jinhua; Liu, Huan; Peng, Jiaqi; Zhang, Lijuan

    2017-04-06

    An adaptive optics (AO) system provides real-time compensation for atmospheric turbulence. However, an AO image is usually of poor contrast because of the nature of the imaging process, meaning that the image contains information coming from both out-of-focus and in-focus planes of the object, which also brings about a loss in quality. In this paper, we present a robust multi-frame adaptive optics image restoration algorithm via maximum likelihood estimation. Our proposed algorithm uses a maximum likelihood method with image regularization as the basic principle, and constructs the joint log likelihood function for multi-frame AO images based on a Poisson distribution model. To begin with, a frame selection method based on image variance is applied to the observed multi-frame AO images to select images with better quality to improve the convergence of a blind deconvolution algorithm. Then, by combining the imaging conditions and the AO system properties, a point spread function estimation model is built. Finally, we develop our iterative solutions for AO image restoration addressing the joint deconvolution issue. We conduct a number of experiments to evaluate the performances of our proposed algorithm. Experimental results show that our algorithm produces accurate AO image restoration results and outperforms the current state-of-the-art blind deconvolution methods.
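
    A simplified sketch of the joint Poisson maximum-likelihood idea is the multi-frame Richardson-Lucy update below, which assumes the per-frame PSFs are known and omits the paper's regularization, frame selection, and PSF estimation steps; the toy scene and PSF are illustrative assumptions.

        import numpy as np
        from scipy.signal import fftconvolve

        def multiframe_rl(frames, psfs, n_iter=50, eps=1e-12):
            """Multi-frame Richardson-Lucy: ascends the joint Poisson
            log-likelihood over all frames, given known, normalized PSFs."""
            x = np.full_like(frames[0], frames[0].mean())
            for _ in range(n_iter):
                update = np.zeros_like(x)
                for y, h in zip(frames, psfs):
                    model = fftconvolve(x, h, mode="same") + eps
                    # Correlate (flipped kernel) the data/model ratio back.
                    update += fftconvolve(y / model, h[::-1, ::-1], mode="same")
                x *= update / len(frames)
            return x

        # Toy demo: two blurred, Poisson-noisy observations of the same scene.
        rng = np.random.default_rng(2)
        truth = np.zeros((64, 64))
        truth[20:30, 35:45] = 50.0
        r = np.arange(-3, 4)
        g = np.exp(-0.5 * (r[:, None] ** 2 + r[None, :] ** 2) / 1.5)
        psf = g / g.sum()
        frames = [rng.poisson(fftconvolve(truth, psf, mode="same").clip(0)).astype(float)
                  for _ in range(2)]
        restored = multiframe_rl(frames, [psf, psf])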

  13. Robust Multi-Frame Adaptive Optics Image Restoration Algorithm Using Maximum Likelihood Estimation with Poisson Statistics

    PubMed Central

    Li, Dongming; Sun, Changming; Yang, Jinhua; Liu, Huan; Peng, Jiaqi; Zhang, Lijuan

    2017-01-01

    An adaptive optics (AO) system provides real-time compensation for atmospheric turbulence. However, an AO image is usually of poor contrast because of the nature of the imaging process, meaning that the image contains information coming from both out-of-focus and in-focus planes of the object, which also brings about a loss in quality. In this paper, we present a robust multi-frame adaptive optics image restoration algorithm via maximum likelihood estimation. Our proposed algorithm uses a maximum likelihood method with image regularization as the basic principle, and constructs the joint log likelihood function for multi-frame AO images based on a Poisson distribution model. To begin with, a frame selection method based on image variance is applied to the observed multi-frame AO images to select images with better quality to improve the convergence of a blind deconvolution algorithm. Then, by combining the imaging conditions and the AO system properties, a point spread function estimation model is built. Finally, we develop our iterative solutions for AO image restoration addressing the joint deconvolution issue. We conduct a number of experiments to evaluate the performances of our proposed algorithm. Experimental results show that our algorithm produces accurate AO image restoration results and outperforms the current state-of-the-art blind deconvolution methods. PMID:28383503

  14. Maximum Likelihood Estimations and EM Algorithms with Length-biased Data

    PubMed Central

    Qin, Jing; Ning, Jing; Liu, Hao; Shen, Yu

    2012-01-01

    Length-biased sampling has been well recognized in economics, industrial reliability, etiology, epidemiology, genetics, and cancer screening studies. Length-biased right-censored data have a unique structure different from traditional survival data, and the nonparametric and semiparametric estimation and inference methods for traditional survival data are not directly applicable to them. We propose new expectation-maximization algorithms for estimation based on full likelihoods involving infinite-dimensional parameters under three settings for length-biased data: estimating the nonparametric distribution function, estimating the nonparametric hazard function under an increasing failure rate constraint, and jointly estimating the baseline hazard function and the covariate coefficients under the Cox proportional hazards model. Extensive empirical simulation studies show that the maximum likelihood estimators perform well with moderate sample sizes and lead to more efficient estimators compared to the estimating equation approaches. The proposed estimates are also more robust to various right-censoring mechanisms. We prove the strong consistency properties of the estimators, and establish the asymptotic normality of the semiparametric maximum likelihood estimators under the Cox model using modern empirical process theory. We apply the proposed methods to a prevalent cohort medical study. Supplemental materials are available online. PMID:22323840

  15. Analysis of continuous beams with joint slip

    Treesearch

    L. A. Soltis

    1981-01-01

    A computer analysis with user guidelines to analyze partially continuous multi-span beams is presented. Partial continuity is due to rotational slip which occurs at spliced joints at the supports of continuous beams such as floor joists. Beam properties, loads, and joint slip are input; internal forces, reactions, and deflections are output.

  16. Case-Deletion Diagnostics for Maximum Likelihood Multipoint Quantitative Trait Locus Linkage Analysis

    PubMed Central

    Mendoza, Maria C.B.; Burns, Trudy L.; Jones, Michael P.

    2009-01-01

    Objectives: Case-deletion diagnostic methods are tools that allow identification of influential observations that may affect parameter estimates and model fitting conclusions. The goal of this paper was to develop two case-deletion diagnostics, the exact case deletion (ECD) and the empirical influence function (EIF), for detecting outliers that can affect results of sib-pair maximum likelihood quantitative trait locus (QTL) linkage analysis. Methods: Subroutines to compute the ECD and EIF were incorporated into the maximum likelihood QTL variance estimation components of the linkage analysis program MAPMAKER/SIBS. Performance of the diagnostics was compared in simulation studies that evaluated the proportion of outliers correctly identified (sensitivity), and the proportion of non-outliers correctly identified (specificity). Results: Simulations involving nuclear family data sets with one outlier showed EIF sensitivities approximated ECD sensitivities well for outlier-affected parameters. Sensitivities were high, indicating the outlier was identified a high proportion of the time. Simulations also showed the enormous computational time advantage of the EIF. Diagnostics applied to body mass index in nuclear families detected observations influential on the lod score and model parameter estimates. Conclusions: The EIF is a practical diagnostic tool that has the advantages of high sensitivity and quick computation. PMID:19172086

  17. Neandertal Admixture in Eurasia Confirmed by Maximum-Likelihood Analysis of Three Genomes

    PubMed Central

    Lohse, Konrad; Frantz, Laurent A. F.

    2014-01-01

    Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4−7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination. PMID:24532731

  18. Excellent AUC for joint fluid cytology in the detection/exclusion of hip and knee prosthetic joint infection.

    PubMed

    Gallo, Jiri; Juranova, Jarmila; Svoboda, Michal; Zapletalova, Jana

    2017-09-01

    The aim of this study was to evaluate the characteristics of the synovial fluid (SF) white cell count (SWCC) and neutrophil/lymphocyte percentages in the diagnosis of prosthetic joint infection (PJI) at particular threshold values. This was a prospective study of 391 patients in whom SF specimens were collected before total joint replacement revisions. SF was aspirated before joint capsule incision. The PJI diagnosis was based only on non-SF data. Receiver operating characteristic plots were constructed for the SWCC and differential counts of leukocytes in aspirated fluid. Logistic binomial regression was used to distinguish infected and non-infected cases in the combined data. PJI was diagnosed in 78 patients, and aseptic revision in 313 patients. The areas under the curve (AUC) for the SWCC and the neutrophil and lymphocyte percentages were 0.974, 0.962, and 0.951, respectively. The optimal cut-offs for PJI were 3,450 cells/μL, 74.6% neutrophils, and 14.6% lymphocytes. Positive likelihood ratios for the SWCC and the neutrophil and lymphocyte percentages were 19.0, 10.4, and 9.5, respectively; negative likelihood ratios were 0.06, 0.076, and 0.092, respectively. Based on the AUC, the present study identified cut-off values for the SWCC and differential leukocyte count for the diagnosis of PJI. The likelihood ratio for a positive or negative SWCC can significantly change the pre-test probability of PJI.
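
    A small sketch of how the reported likelihood ratios move the pre-test probability of PJI on the odds scale follows; the 20% pre-test probability is an illustrative assumption.

        def post_test_probability(pre_test_p, likelihood_ratio):
            """Bayes on the odds scale: post-odds = pre-odds * LR."""
            pre_odds = pre_test_p / (1.0 - pre_test_p)
            post_odds = pre_odds * likelihood_ratio
            return post_odds / (1.0 + post_odds)

        # With the study's SWCC ratios (LR+ = 19.0, LR- = 0.06) and an assumed
        # 20% pre-test probability of prosthetic joint infection:
        for lr in (19.0, 0.06):
            print(lr, round(post_test_probability(0.20, lr), 3))
        # A positive SWCC raises 0.20 to about 0.83; a negative one drops it
        # to about 0.015.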

  19. A concise evidence-based physical examination for diagnosis of acromioclavicular joint pathology: a systematic review.

    PubMed

    Krill, Michael K; Rosas, Samuel; Kwon, KiHyun; Dakkak, Andrew; Nwachukwu, Benedict U; McCormick, Frank

    2018-02-01

    The clinical examination of the shoulder joint is an undervalued diagnostic tool for evaluating acromioclavicular (AC) joint pathology. Applying evidence-based clinical tests enables providers to make an accurate diagnosis and minimize costly imaging procedures and potential delays in care. The purpose of this study was to create a decision tree analysis enabling simple and accurate diagnosis of AC joint pathology. A systematic review of the Medline, Ovid, and Cochrane Review databases was performed to identify level one and two diagnostic studies evaluating clinical tests for AC joint pathology. Individual test characteristics were combined in series and in parallel to improve sensitivities and specificities; see the sketch after this record. A secondary analysis utilized subjective pre-test probabilities to create a clinical decision tree algorithm with post-test probabilities. The optimal special test combination to screen and confirm AC joint pathology combined Paxinos sign and O'Brien's Test, with a specificity of 95.8% when performed in series, whereas Paxinos sign and Hawkins-Kennedy Test demonstrated a sensitivity of 93.7% when performed in parallel. Paxinos sign and O'Brien's Test demonstrated the greatest positive likelihood ratio (2.71), whereas Paxinos sign and Hawkins-Kennedy Test had the lowest negative likelihood ratio (0.35). No combination of special tests performed in series or in parallel has more than a small impact on post-test probabilities to screen or confirm AC joint pathology; Paxinos sign and O'Brien's Test is the only combination with a small and sometimes important impact when used both in series and in parallel. Physical examination testing is not beneficial for diagnosis of AC joint pathology when the pretest probability is unequivocal. In these instances, it is of benefit to proceed with procedural tests to evaluate AC joint pathology. Ultrasound-guided corticosteroid injections are diagnostic and therapeutic. An ultrasound-guided AC joint
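
    Under the textbook assumption that the two tests are independent, combining them in series or in parallel changes the composite sensitivity and specificity as sketched below. The numeric test characteristics are illustrative assumptions, not the review's published values.

        def combine_in_series(test1, test2):
            # Positive only if both tests are positive (raises specificity).
            (se1, sp1), (se2, sp2) = test1, test2
            return se1 * se2, 1.0 - (1.0 - sp1) * (1.0 - sp2)

        def combine_in_parallel(test1, test2):
            # Positive if either test is positive (raises sensitivity).
            (se1, sp1), (se2, sp2) = test1, test2
            return 1.0 - (1.0 - se1) * (1.0 - se2), sp1 * sp2

        # Hypothetical (sensitivity, specificity) pairs for two shoulder tests.
        paxinos, obrien = (0.79, 0.50), (0.80, 0.92)
        print(combine_in_series(paxinos, obrien))    # specificity climbs
        print(combine_in_parallel(paxinos, obrien))  # sensitivity climbs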

  20. Likelihood Analysis of the Minimal AMSB Model

    DOE PAGES

    Bagnaschi, E.; Borsato, M.; Sakurai, K.; ...

    2017-04-27

    We perform a likelihood analysis of the minimal Anomaly-Mediated Supersymmetry Breaking (mAMSB) model using constraints from cosmology and accelerator experiments. We find that a wino-like or a Higgsino-like neutralino LSP, $m_{\tilde\chi^0_1}$, may provide the cold dark matter (DM) with similar likelihood. The upper limit on the DM density from Planck and other experiments enforces $m_{\tilde\chi^0_1} \lesssim 3$ TeV after the inclusion of Sommerfeld enhancement in its annihilations. If most of the cold DM density is provided by the $\tilde\chi^0_1$, the measured value of the Higgs mass favours a limited range of $\tan\beta \sim 5$ (or, for $\mu > 0$, $\tan\beta \sim 45$) but the scalar mass $m_0$ is poorly constrained. In the wino-LSP case, $m_{3/2}$ is constrained to about 900 TeV and $m_{\tilde\chi^0_1}$ to $2.9 \pm 0.1$ TeV, whereas in the Higgsino-LSP case $m_{3/2}$ has just a lower limit $\gtrsim 650$ TeV ($\gtrsim 480$ TeV) and $m_{\tilde\chi^0_1}$ is constrained to $1.12~(1.13) \pm 0.02$ TeV in the $\mu > 0$ ($\mu < 0$) scenario. In neither case can the anomalous magnetic moment of the muon, $(g-2)_\mu$, be improved significantly relative to its Standard Model (SM) value, nor do flavour measurements constrain the model significantly, and there are poor prospects for discovering supersymmetric particles at the LHC, though there are some prospects for direct DM detection. On the other hand, if the $m_{\tilde\chi^0_1}$ contributes only a fraction of the cold DM density, future LHC $E_T$-based searches for gluinos, squarks and heavier chargino and neutralino states, as well as disappearing track searches in the wino-like LSP region, will be relevant, and interference effects enable $\mathrm{BR}(B_{s,d} \to \mu^+\mu^-)$ to agree with the data better than in the SM in the case of wino-like DM with $\mu > 0$.

  1. On the Likelihood Ratio Test for the Number of Factors in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Bentler, Peter M.; Yuan, Ke-Hai

    2007-01-01

    In the exploratory factor analysis, when the number of factors exceeds the true number of factors, the likelihood ratio test statistic no longer follows the chi-square distribution due to a problem of rank deficiency and nonidentifiability of model parameters. As a result, decisions regarding the number of factors may be incorrect. Several…

  2. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    NASA Astrophysics Data System (ADS)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling, and it remains challenging to address in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings are the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
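
    A simplified sketch of the formal likelihood structure is given below, with the lag-1 autocorrelation and heteroscedastic error standard deviation made explicit. It substitutes a Gaussian innovation density for the paper's Skew Exponential Power density, and all variable names and data are illustrative assumptions.

        import numpy as np

        def log_likelihood(residuals, phi, sigma0, sigma1, sim):
            """Lag-1 autocorrelated, heteroscedastic Gaussian log-likelihood
            (a simplification of the SEP error model named above)."""
            eta = residuals[1:] - phi * residuals[:-1]    # decorrelated innovations
            sigma = sigma0 + sigma1 * np.abs(sim[1:])     # error sd grows with the simulation
            return np.sum(-0.5 * np.log(2 * np.pi * sigma ** 2)
                          - 0.5 * (eta / sigma) ** 2)

        # Usage with hypothetical observed/simulated nitrate series:
        obs = np.array([2.1, 2.4, 2.2, 2.9, 3.1, 2.8])
        sim = np.array([2.0, 2.5, 2.3, 2.7, 3.3, 2.9])
        print(log_likelihood(obs - sim, phi=0.3, sigma0=0.05, sigma1=0.1, sim=sim))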

  3. Joint attention in Down syndrome: A meta-analysis.

    PubMed

    Hahn, Laura J; Loveall, Susan J; Savoy, Madison T; Neumann, Allie M; Ikuta, Toshikazu

    2018-07-01

    Some studies have indicated that joint attention may be a relative strength in Down syndrome (DS), but other studies have not. The aim of this study was to conduct a meta-analysis of joint attention in DS to determine more conclusively whether it is a relative strength or weakness compared to children with typical development (TD), developmental disabilities (DD), and autism spectrum disorder (ASD). Journal articles published before September 13, 2016, were identified using the search terms "Down syndrome" and "joint attention" or "coordinating attention". Identified studies were reviewed and coded for inclusion criteria, descriptive information, and outcome variables. Eleven studies (553 participants) met the inclusion criteria. Children with DS showed joint attention similar to that of TD children and higher than that of children with DD and ASD. Meta-regression revealed a significant association between age and joint attention effect sizes in the DS vs. TD contrast. Joint attention appears not to be a weakness for children with DS, but may be commensurate with developmental level, and may be a relative strength in comparison to other skills associated with the DS behavioral phenotype. Early interventions for children with DS may benefit from leveraging joint attention skills. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Likelihoods for fixed rank nomination networks

    PubMed Central

    HOFF, PETER; FOSDICK, BAILEY; VOLFOVSKY, ALEX; STOVEL, KATHERINE

    2014-01-01

    Many studies that gather social network data use survey methods that lead to censored, missing, or otherwise incomplete information. For example, the popular fixed rank nomination (FRN) scheme, often used in studies of schools and businesses, asks study participants to nominate and rank at most a small number of contacts or friends, leaving the existence of other relations uncertain. However, most statistical models are formulated in terms of completely observed binary networks. Statistical analyses of FRN data with such models ignore the censored and ranked nature of the data and could potentially result in misleading statistical inference. To investigate this possibility, we compare Bayesian parameter estimates obtained from a likelihood for complete binary networks with those obtained from likelihoods that are derived from the FRN scheme, and therefore accommodate the ranked and censored nature of the data. We show analytically and via simulation that the binary likelihood can provide misleading inference, particularly for certain model parameters that relate network ties to characteristics of individuals and pairs of individuals. We also compare these different likelihoods in a data analysis of several adolescent social networks. For some of these networks, the parameter estimates from the binary and FRN likelihoods lead to different conclusions, indicating the importance of analyzing FRN data with a method that accounts for the FRN survey design. PMID:25110586

  5. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer

    2010-01-01

    This joint mobility KC lecture included information from two papers, "A Method for and Issues Associated with the Determination of Space Suit Joint Requirements" and "Results and Analysis from Space Suit Joint Torque Testing," as presented for the International Conference on Environmental Systems in 2009 and 2010, respectively. The first paper discusses historical joint torque testing methodologies and approaches that were tested in 2008 and 2009. The second paper discusses the testing that was completed in 2009 and 2010.

  6. Hurdle models for multilevel zero-inflated data via h-likelihood.

    PubMed

    Molas, Marek; Lesaffre, Emmanuel

    2010-12-30

    Count data often exhibit overdispersion. One type of overdispersion arises when there is an excess of zeros in comparison with the standard Poisson distribution. Zero-inflated Poisson and hurdle models have been proposed to perform a valid likelihood-based analysis to account for the surplus of zeros. Further, data often arise in clustered, longitudinal or multiple-membership settings. The proper analysis needs to reflect the design of a study. Typically random effects are used to account for dependencies in the data. We examine the h-likelihood estimation and inference framework for hurdle models with random effects for complex designs. We extend the h-likelihood procedures to fit hurdle models, thereby extending h-likelihood to truncated distributions. Two applications of the methodology are presented. Copyright © 2010 John Wiley & Sons, Ltd.
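
    The fixed-effects core of a hurdle model is sketched below: a Bernoulli hurdle for whether a count is positive, plus a zero-truncated Poisson for the positive counts. The h-likelihood machinery of the paper adds random effects on top of this, which the sketch omits; the data and parameter values are illustrative.

        import numpy as np
        from scipy.special import gammaln

        def hurdle_poisson_loglik(y, p, lam):
            """Log-likelihood of a hurdle-Poisson model without random effects:
            P(Y = 0) = 1 - p; for y > 0, a zero-truncated Poisson(lam)."""
            y = np.asarray(y)
            zero, pos = (y == 0), (y > 0)
            ll = zero.sum() * np.log(1.0 - p)       # hurdle: zeros
            ll += pos.sum() * np.log(p)             # hurdle: positives
            yp = y[pos]
            # Zero-truncated Poisson: divide the pmf by P(Y > 0) = 1 - e^{-lam}.
            ll += np.sum(yp * np.log(lam) - lam - gammaln(yp + 1.0)
                         - np.log(1.0 - np.exp(-lam)))
            return ll

        counts = [0, 0, 0, 1, 2, 0, 4, 0, 1, 3]
        print(hurdle_poisson_loglik(counts, p=0.5, lam=2.0))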

  7. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    2005-01-01

    Contents include the following: JWST/ISIM introduction. Design and analysis challenges for ISIM bonded joints. JWST/ISIM joint designs. Bonded joint analysis. Finite element modeling. Failure criteria and margin calculation. Analysis/test correlation procedure. Example of test data and analysis.

  8. Augmented reality environment for temporomandibular joint motion analysis.

    PubMed

    Wagner, A; Ploder, O; Zuniga, J; Undt, G; Ewers, R

    1996-01-01

    The principles of interventional video tomography were applied for the real-time visualization of temporomandibular joint movements in an augmented reality environment. Anatomic structures were extracted in three dimensions from planar cephalometric radiographic images. The live-image fusion of these graphic anatomic structures with real-time position data of the mandible and the articular fossa was performed with a see-through, head-mounted display and an electromagnetic tracking system. The dynamic fusion of radiographic images of the temporomandibular joint to anatomic temporomandibular joint structures in motion created a new modality for temporomandibular joint motion analysis. The advantages of the method are its ability to accurately examine the motion of the temporomandibular joint in three dimensions without restraining the subject and its ability to simultaneously determine the relationship of the bony temporomandibular joint and supporting structures (ie, occlusion, muscle function, etc) during movement before and after treatment.

  9. Reusable Solid Rocket Motor Nozzle Joint-4 Thermal Analysis

    NASA Technical Reports Server (NTRS)

    Clayton, J. Louie

    2001-01-01

    This study provides for development and test verification of a thermal model used for prediction of joint heating environments, structural temperatures and seal erosions in the Space Shuttle Reusable Solid Rocket Motor (RSRM) Nozzle Joint-4. The heating environments are a result of rapid pressurization of the joint free volume assuming a leak path has occurred in the filler material used for assembly gap close out. Combustion gases flow along the leak path from nozzle environment to joint O-ring gland resulting in local heating to the metal housing and erosion of seal materials. Analysis of this condition was based on usage of the NASA Joint Pressurization Routine (JPR) for environment determination and the Systems Improved Numerical Differencing Analyzer (SINDA) for structural temperature prediction. Model generated temperatures, pressures and seal erosions are compared to hot fire test data for several different leak path situations. Investigated in the hot fire test program were nozzle joint-4 O-ring erosion sensitivities to leak path width in both open and confined joint geometries. Model predictions were in generally good agreement with the test data for the confined leak path cases. Worst case flight predictions are provided using the test-calibrated model. Analysis issues are discussed based on model calibration procedures.

  10. Determining the accuracy of maximum likelihood parameter estimates with colored residuals

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; Klein, Vladislav

    1994-01-01

    An important part of building high fidelity mathematical models based on measured data is calculating the accuracy associated with statistical estimates of the model parameters. Indeed, without some idea of the accuracy of parameter estimates, the estimates themselves have limited value. In this work, an expression based on theoretical analysis was developed to properly compute parameter accuracy measures for maximum likelihood estimates with colored residuals. This result is important because experience from the analysis of measured data reveals that the residuals from maximum likelihood estimation are almost always colored. The calculations involved can be appended to conventional maximum likelihood estimation algorithms. Simulated data runs were used to show that the parameter accuracy measures computed with this technique accurately reflect the quality of the parameter estimates from maximum likelihood estimation without the need for analysis of the output residuals in the frequency domain or heuristically determined multiplication factors. The result is general, although the application studied here is maximum likelihood estimation of aerodynamic model parameters from flight test data.

  11. Maximum Likelihood Analysis of Low Energy CDMS II Germanium Data

    DOE PAGES

    Agnese, R.

    2015-03-30

    We report on the results of a search for a Weakly Interacting Massive Particle (WIMP) signal in low-energy data of the Cryogenic Dark Matter Search experiment using a maximum likelihood analysis. A background model is constructed using GEANT4 to simulate the surface-event background from 210Pb decay-chain events, while independent calibration data are used to model the gamma background. Fitting this background model to the data results in no statistically significant WIMP component. We also perform fits using an analytic ad hoc background model proposed by Collar and Fields, who claimed to find a large excess of signal-like events in our data. We confirm the strong preference for a signal hypothesis in their analysis under these assumptions, but excesses are observed in both single- and multiple-scatter events, which implies that the signal is not caused by WIMPs but rather reflects the inadequacy of their background model.

  12. Stochastic capture zone analysis of an arsenic-contaminated well using the generalized likelihood uncertainty estimator (GLUE) methodology

    NASA Astrophysics Data System (ADS)

    Morse, Brad S.; Pohll, Greg; Huntington, Justin; Rodriguez Castillo, Ramiro

    2003-06-01

    In 1992, Mexican researchers discovered concentrations of arsenic in excess of World Health Organization (WHO) standards in several municipal wells in the Zimapan Valley of Mexico. This study describes a method to delineate a capture zone for one of the most highly contaminated wells to aid in future well siting. A stochastic approach was used to model the capture zone because of the high level of uncertainty in several input parameters. Two stochastic techniques were performed and compared: "standard" Monte Carlo analysis and the generalized likelihood uncertainty estimator (GLUE) methodology. The GLUE procedure differs from standard Monte Carlo analysis in that it incorporates a goodness of fit (termed a likelihood measure) in evaluating the model. This allows more information (in this case, head data) to be used in the uncertainty analysis, resulting in smaller prediction uncertainty. Two likelihood measures are tested in this study to determine which is in better agreement with the observed heads. While the standard Monte Carlo approach does not aid in parameter estimation, the GLUE methodology indicates best-fit models when hydraulic conductivity is approximately 10^-6.5 m/s, with vertically isotropic conditions and large quantities of interbasin flow entering the basin. Probabilistic isochrones (capture zone boundaries) are then presented, and as predicted, the GLUE-derived capture zones are significantly smaller in area than those from the standard Monte Carlo approach.
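
    A minimal GLUE sketch follows: sample parameters from a prior, score each sample with an informal likelihood measure, retain "behavioral" samples above a subjective cutoff, and weight predictions by the measure. The stand-in model, the likelihood measure, and the cutoff below are all illustrative assumptions, not the study's choices.

        import numpy as np

        rng = np.random.default_rng(3)

        def model(k):
            # Stand-in groundwater model: maps a conductivity-like parameter to
            # predicted heads at three wells (purely illustrative).
            return np.array([10.0, 12.0, 9.0]) - 2.0 * np.log10(k / 1e-7)

        observed = np.array([9.1, 10.9, 8.1])

        k_samples = 10 ** rng.uniform(-8, -5, size=20000)  # log-uniform prior on K (m/s)
        sse = np.array([np.sum((model(k) - observed) ** 2) for k in k_samples])
        likelihood = np.exp(-sse / sse.min())              # one common informal measure
        behavioral = likelihood > 0.1 * likelihood.max()   # subjective cutoff

        w = likelihood[behavioral] / likelihood[behavioral].sum()
        k_best = k_samples[behavioral]
        print("weighted log10(K):", np.sum(w * np.log10(k_best)))
        print("5-95% band:", np.percentile(np.log10(k_best), [5, 95]))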

  13. Likelihood analysis of supersymmetric SU(5) GUTs

    DOE PAGES

    Bagnaschi, Emanuele; Costa, J. C.; Sakurai, K.; ...

    2017-02-16

    Here, we perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan\beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel $\tilde u_R/\tilde c_R - \tilde\chi^0_1$ coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of $\tilde

  14. Novel joint cupping clinical maneuver for ultrasonographic detection of knee joint effusions.

    PubMed

    Uryasev, Oleg; Joseph, Oliver C; McNamara, John P; Dallas, Apostolos P

    2013-11-01

    Knee effusions occur due to traumatic and atraumatic causes. Clinical diagnosis currently relies on several provocative techniques to demonstrate knee joint effusions. Portable bedside ultrasonography (US) is becoming an adjunct to the diagnosis of effusions. We hypothesized that a US approach with a clinical joint cupping maneuver increases sensitivity in identifying effusions as compared to US alone. Using unembalmed cadaver knees, we injected fluid to create effusions of up to 10 mL. Each effusion volume was measured at a lateral transverse location with respect to the patella. For each effusion we applied a joint cupping maneuver from an inferior approach and re-measured the effusion. With increased volume of saline infusion, the mean depth of effusion on ultrasound imaging increased as well. Using a 2-mm cutoff, we visualized an effusion without the joint cupping maneuver at 2.5 mL and with the joint cupping technique at 1 mL. Mean effusion diameter increased on average 0.26 cm with the joint cupping maneuver as compared to without it. The effusion depth was statistically different at 2.5 and 7.5 mL (P < .05). A joint cupping technique combined with US is a valuable tool in assessing knee effusions, especially those at subclinical levels. Effusion measurements are complicated by uneven distribution of effusion fluid; a clinical joint cupping maneuver concentrates the fluid in one recess of the joint, increasing the likelihood of fluid detection using US. © 2013 Elsevier Inc. All rights reserved.

  15. Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.

    PubMed

    Xie, Yanmei; Zhang, Biao

    2017-04-20

    Missing covariate data occurs often in regression analysis, which frequently arises in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and

  16. Accurate Structural Correlations from Maximum Likelihood Superpositions

    PubMed Central

    Theobald, Douglas L; Wuttke, Deborah S

    2008-01-01

    The cores of globular proteins are densely packed, resulting in complicated networks of structural interactions. These interactions in turn give rise to dynamic structural correlations over a wide range of time scales. Accurate analysis of these complex correlations is crucial for understanding biomolecular mechanisms and for relating structure to function. Here we report a highly accurate technique for inferring the major modes of structural correlation in macromolecules using likelihood-based statistical analysis of sets of structures. This method is generally applicable to any ensemble of related molecules, including families of nuclear magnetic resonance (NMR) models, different crystal forms of a protein, and structural alignments of homologous proteins, as well as molecular dynamics trajectories. Dominant modes of structural correlation are determined using principal components analysis (PCA) of the maximum likelihood estimate of the correlation matrix. The correlations we identify are inherently independent of the statistical uncertainty and dynamic heterogeneity associated with the structural coordinates. We additionally present an easily interpretable method (“PCA plots”) for displaying these positional correlations by color-coding them onto a macromolecular structure. Maximum likelihood PCA of structural superpositions, and the structural PCA plots that illustrate the results, will facilitate the accurate determination of dynamic structural correlations analyzed in diverse fields of structural biology. PMID:18282091
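
    A stripped-down sketch of the correlation-mode extraction follows, assuming the ensemble is already superposed; the paper's maximum-likelihood superposition and uncertainty weighting are omitted, and the random ensemble is a placeholder for real coordinates.

        import numpy as np

        rng = np.random.default_rng(4)
        ensemble = rng.normal(size=(25, 40, 3))   # 25 models x 40 atoms x xyz

        mean_structure = ensemble.mean(axis=0)
        # Per-model displacement vectors, flattened over atoms and coordinates.
        disp = (ensemble - mean_structure).reshape(len(ensemble), -1)

        cov = disp.T @ disp / (len(ensemble) - 1)     # coordinate covariance matrix
        d = np.sqrt(np.diag(cov))
        corr = cov / np.outer(d, d)                   # correlation matrix

        evals, evecs = np.linalg.eigh(corr)           # PCA via eigendecomposition
        principal_mode = evecs[:, -1]                 # dominant correlation mode
        print(evals[-3:])                             # top three mode variances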

  17. Hip and knee joints are more stabilized than driven during the stance phase of gait: an analysis of the 3D angle between joint moment and joint angular velocity.

    PubMed

    Dumas, R; Cheze, L

    2008-08-01

    Joint power is commonly used in orthopaedics, ergonomics, and sports analysis, but its clinical interpretation remains controversial. Some basic principles on muscle actions and energy transfer have been proposed in 2D. The decomposition of power onto 3 axes, although questionable, allows the same analysis in 3D. However, these basic principles have been widely criticized, mainly because bi-articular muscles must be considered; this requires a more complex computation to determine how individual muscle forces contribute to driving the joint. Conversely, with simple 3D inverse dynamics, the analysis of both joint moment and angular velocity directions is essential to clarify when the joint moment can contribute to driving the joint. The present study evaluates the 3D angle between the joint moment and the joint angular velocity and investigates when the hip, knee, and ankle joints are predominantly driven (angle close to 0° or 180°) or stabilized (angle close to 90°) during gait. The 3D angle curves show that the three joints are never fully but only partially driven, and that the hip and knee joints are mainly stabilized during the stance phase. The notion of stabilization should be further investigated, especially for subjects with motion disorders or prostheses.
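
    Computing the driven-versus-stabilized angle from an instantaneous joint moment and angular velocity is a one-liner; the vectors below are hypothetical values for a single gait sample.

        import numpy as np

        def drive_stabilize_angle(moment, omega):
            """3D angle (degrees) between joint moment and joint angular velocity:
            near 0 or 180 the joint is driven (power flows); near 90 it is
            stabilized (little power despite a possibly large moment)."""
            cosang = np.dot(moment, omega) / (np.linalg.norm(moment) * np.linalg.norm(omega))
            return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

        # Hypothetical hip moment (N.m) and angular velocity (rad/s) at one instant:
        print(drive_stabilize_angle(np.array([40.0, 5.0, -3.0]),
                                    np.array([0.1, 1.2, -0.4])))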

  18. Do-it-yourself statistics: A computer-assisted likelihood approach to analysis of data from genetic crosses.

    PubMed Central

    Robbins, L G

    2000-01-01

    Graduate school programs in genetics have become so full that courses in statistics have often been eliminated. In addition, typical introductory statistics courses for the "statistics user" rather than the nascent statistician are laden with methods for analysis of measured variables, while genetic data are most often discrete numbers. These courses are often seen by students and genetics professors alike as largely irrelevant cookbook courses. The powerful methods of likelihood analysis, although commonly employed in human genetics, are much less often used in other areas of genetics, even though current computational tools make this approach readily accessible. This article introduces the MLIKELY.PAS computer program and the logic of do-it-yourself maximum-likelihood statistics. The program itself, course materials, and expanded discussions of some examples that are only summarized here are available at http://www.unisi.it/ricerca/dip/bio_evol/sitomlikely/mlikely.html. PMID:10628965
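
    In the spirit of the article's do-it-yourself likelihood statistics, the sketch below estimates a recombination fraction from hypothetical two-point testcross counts and forms a likelihood-ratio statistic against free recombination (r = 0.5). It illustrates the approach only and is not the MLIKELY.PAS program itself.

        import numpy as np
        from scipy.optimize import minimize_scalar

        counts = {"parental": 412, "recombinant": 88}   # hypothetical cross data

        def neg_loglik(r):
            # Binomial log-likelihood of the recombination fraction r (negated).
            return -(counts["recombinant"] * np.log(r)
                     + counts["parental"] * np.log(1.0 - r))

        fit = minimize_scalar(neg_loglik, bounds=(1e-6, 0.5 - 1e-6), method="bounded")
        r_hat = fit.x                                   # MLE; here simply 88/500
        # Likelihood-ratio statistic: G = 2*(logL(r_hat) - logL(0.5)).
        G = 2.0 * (neg_loglik(0.5) - neg_loglik(r_hat))
        print(r_hat, G)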

  19. A retrospective likelihood approach for efficient integration of multiple omics factors in case-control association studies.

    PubMed

    Balliu, Brunilda; Tsonaka, Roula; Boehringer, Stefan; Houwing-Duistermaat, Jeanine

    2015-03-01

    Integrative omics, the joint analysis of outcomes and multiple types of omics data, such as genomics, epigenomics, and transcriptomics data, constitutes a promising approach for powerful and biologically relevant association studies. These studies often employ a case-control design and often include non-omics covariates, such as age and gender, that may modify the underlying omics risk factors. An open question is how best to integrate multiple omics and non-omics information to maximize statistical power in case-control studies that ascertain individuals based on the phenotype. Recent work on integrative omics has used prospective approaches, modeling case-control status conditional on omics and non-omics risk factors. Compared to univariate approaches, jointly analyzing multiple risk factors with a prospective approach increases power in non-ascertained cohorts. However, these prospective approaches often lose power in case-control studies. In this article, we propose a novel statistical method for integrating multiple omics and non-omics factors in case-control association studies. Our method is based on a retrospective likelihood function that models the joint distribution of omics and non-omics factors conditional on case-control status. The new method provides accurate control of the Type I error rate and has increased efficiency over prospective approaches in both simulated and real data. © 2015 Wiley Periodicals, Inc.

  20. FE analysis of SMA-based bio-inspired bone-joint system

    NASA Astrophysics Data System (ADS)

    Yang, S.; Seelecke, S.

    2009-10-01

    This paper presents the finite element (FE) analysis of a bio-inspired bone-joint system. Motivated by the BATMAV project, which aims at the development of a micro-air-vehicle platform that implements bat-like flapping flight capabilities, we study the actuation of a typical elbow joint, using shape memory alloy (SMA) in a dual manner. Micro-scale martensitic SMA wires are used as 'metal muscles' to actuate a system of humerus, elbow joint and radius, in concert with austenitic wires, which operate as flexible joints due to their superelastic character. For the FE analysis, the humerus and radius are modeled as standard elastic beams, while the elbow joint and muscle wires use the Achenbach-Muller-Seelecke SMA model as beams and cable elements, respectively. The particular focus of the paper is on the implementation of the above SMA model in COMSOL.

  1. Robust Likelihoods for Inflationary Gravitational Waves from Maps of Cosmic Microwave Background Polarization

    NASA Technical Reports Server (NTRS)

    Switzer, Eric Ryan; Watts, Duncan J.

    2016-01-01

    The B-mode polarization of the cosmic microwave background provides a unique window into tensor perturbations from inflationary gravitational waves. Survey effects complicate the estimation and description of the power spectrum on the largest angular scales. The pixel-space likelihood yields parameter distributions without the power spectrum as an intermediate step, but it does not have the large suite of tests available to power spectral methods. Searches for primordial B-modes must rigorously reject and rule out contamination. Many forms of contamination vary or are uncorrelated across epochs, frequencies, surveys, or other data treatment subsets. The cross power and the power spectrum of the difference of subset maps provide approaches to reject and isolate excess variance. We develop an analogous joint pixel-space likelihood. Contamination not modeled in the likelihood produces parameter-dependent bias and complicates the interpretation of the difference map. We describe a null test that consistently weights the difference map. Excess variance should either be explicitly modeled in the covariance or be removed through reprocessing the data.
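
    A pixel-space likelihood in this sense evaluates the map vector directly against a model covariance (signal plus noise). A minimal Gaussian sketch follows; the one-dimensional toy data, covariance model, and amplitude parameter are all hypothetical:

        import numpy as np

        def pixel_loglike(m, cov):
            # Gaussian log-likelihood of a map vector m given a pixel-pixel
            # covariance (signal + noise), via a Cholesky factor for stability.
            L = np.linalg.cholesky(cov)
            alpha = np.linalg.solve(L, m)
            logdet = 2.0 * np.sum(np.log(np.diag(L)))
            return -0.5 * (alpha @ alpha + logdet + m.size * np.log(2 * np.pi))

        rng = np.random.default_rng(0)
        n = 50  # toy one-dimensional "sky" of 50 pixels
        dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
        signal = 0.5 * np.exp(-dist / 5.0)   # hypothetical signal covariance
        noise = 0.1 * np.eye(n)
        m = rng.multivariate_normal(np.zeros(n), signal + noise)
        for amp in (0.1, 0.5, 1.0):          # scan a hypothetical signal amplitude
            print(amp, pixel_loglike(m, amp * signal + noise))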

  2. Jointly modeling longitudinal proportional data and survival times with an application to the quality of life data in a breast cancer trial.

    PubMed

    Song, Hui; Peng, Yingwei; Tu, Dongsheng

    2017-04-01

    Motivated by the joint analysis of longitudinal quality of life data and recurrence-free survival times from a cancer clinical trial, we present in this paper two approaches to jointly model longitudinal proportional measurements, which are confined to a finite interval, and survival data. Both approaches assume a proportional hazards model for the survival times. For the longitudinal component, the first approach applies the classical linear mixed model to logit-transformed responses, while the second approach directly models the responses using a simplex distribution. A semiparametric method based on a penalized joint likelihood generated by the Laplace approximation is derived to fit the joint model defined by the second approach. The proposed procedures are evaluated in a simulation study and applied to the analysis of the breast cancer data that motivated this research.
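
    The first of the two approaches (a classical linear mixed model on logit-transformed proportions) can be sketched with standard tooling; the dataset below is synthetic and the column names are hypothetical:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Synthetic longitudinal quality-of-life proportions confined to (0, 1).
        rng = np.random.default_rng(1)
        n_subj, n_visits = 40, 4
        df = pd.DataFrame({
            "subject": np.repeat(np.arange(n_subj), n_visits),
            "time": np.tile(np.arange(n_visits), n_subj),
        })
        u = rng.normal(0.0, 0.5, n_subj)  # random intercept per subject
        eta = 0.3 + 0.1 * df["time"] + u[df["subject"]] + rng.normal(0.0, 0.3, len(df))
        df["qol"] = 1.0 / (1.0 + np.exp(-eta))

        # Approach 1: linear mixed model on logit-transformed responses.
        df["logit_qol"] = np.log(df["qol"] / (1.0 - df["qol"]))
        result = smf.mixedlm("logit_qol ~ time", df, groups=df["subject"]).fit()
        print(result.summary())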

  3. Working on a Standard Joint Unit: A pilot test.

    PubMed

    Casajuana, Cristina; López-Pelayo, Hugo; Mercedes Balcells, María; Miquel, Laia; Teixidó, Lídia; Colom, Joan; Gual, Antoni

    2017-09-29

    Assessing cannabis consumption remains complex due to the lack of reliable registration systems. We tested the feasibility of establishing a Standard Joint Unit (SJU) that accounts for the main cannabinoids with health implications, using a naturalistic approach. Methodology: pilot study with current cannabis users from four areas of Barcelona: universities, nightclubs, an out-patient mental health service, and cannabis associations. We designed and administered a questionnaire on cannabis use patterns and determined the willingness to donate a joint for analysis. Descriptive statistics were used to analyze the data. Forty volunteers answered the questionnaire (response rate 95%); most of them were men (72.5%) and young adults (median age 24.5 years; IQR 8.75 years) who consume daily or nearly daily (70%). Most participants consume marihuana (85%) and roll their joints with a median of 0.25 g of marihuana. Two out of three (67.5%) stated they were willing to donate a joint. Obtaining an SJU with the planned methodology has proved feasible. Pre-testing resulted in improvements to the questionnaire and to the remuneration offered to incentivize donations. Establishing an SJU is essential to improve our knowledge of cannabis-related outcomes.

  4. Neural Networks Involved in Adolescent Reward Processing: An Activation Likelihood Estimation Meta-Analysis of Functional Neuroimaging Studies

    PubMed Central

    Silverman, Merav H.; Jedd, Kelly; Luciana, Monica

    2015-01-01

    Behavioral responses to, and the neural processing of, rewards change dramatically during adolescence and may contribute to observed increases in risk-taking during this developmental period. Functional MRI (fMRI) studies suggest differences between adolescents and adults in neural activation during reward processing, but findings are contradictory, and effects have been found in non-predicted directions. The current study uses an activation likelihood estimation (ALE) approach for quantitative meta-analysis of functional neuroimaging studies to: 1) confirm the network of brain regions involved in adolescents’ reward processing, 2) identify regions involved in specific stages (anticipation, outcome) and valence (positive, negative) of reward processing, and 3) identify differences in activation likelihood between adolescent and adult reward-related brain activation. Results reveal a subcortical network of brain regions involved in adolescent reward processing similar to that found in adults with major hubs including the ventral and dorsal striatum, insula, and posterior cingulate cortex (PCC). Contrast analyses find that adolescents exhibit greater likelihood of activation in the insula while processing anticipation relative to outcome and greater likelihood of activation in the putamen and amygdala during outcome relative to anticipation. While processing positive compared to negative valence, adolescents show increased likelihood for activation in the posterior cingulate cortex (PCC) and ventral striatum. Contrasting adolescent reward processing with the existing ALE of adult reward processing (Liu et al., 2011) reveals increased likelihood for activation in limbic, frontolimbic, and striatal regions in adolescents compared with adults. Unlike adolescents, adults also activate executive control regions of the frontal and parietal lobes. These findings support hypothesized elevations in motivated activity during adolescence. PMID:26254587

  5. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    2005-01-01

    A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite bonded joints below the cryogenic temperature of 30K (-405 F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings and composite gusset/clip joints all bonded to M55J/954-6 and T300/954-6 hybrid composite tubes (75mm square). Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty in testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on developed techniques and properties.

  6. Specifying the core network supporting episodic simulation and episodic memory by activation likelihood estimation

    PubMed Central

    Benoit, Roland G.; Schacter, Daniel L.

    2015-01-01

    It has been suggested that the simulation of hypothetical episodes and the recollection of past episodes are supported by fundamentally the same set of brain regions. The present article specifies this core network via Activation Likelihood Estimation (ALE). Specifically, a first meta-analysis revealed joint engagement of core network regions during episodic memory and episodic simulation. These include parts of the medial surface, the hippocampus and parahippocampal cortex within the medial temporal lobes, and the lateral temporal and inferior posterior parietal cortices on the lateral surface. Both capacities also jointly recruited additional regions such as parts of the bilateral dorsolateral prefrontal cortex. All of these core regions overlapped with the default network. Moreover, it has further been suggested that episodic simulation may require a stronger engagement of some of the core network’s nodes as well as the recruitment of additional brain regions supporting control functions. A second ALE meta-analysis indeed identified such regions that were consistently more strongly engaged during episodic simulation than episodic memory. These comprised the core-network clusters located in the left dorsolateral prefrontal cortex and posterior inferior parietal lobe and other structures distributed broadly across the default and fronto-parietal control networks. Together, the analyses determine the set of brain regions that allow us to experience past and hypothetical episodes, thus providing an important foundation for studying the regions’ specialized contributions and interactions. PMID:26142352

  7. Maximum likelihood decoding analysis of accumulate-repeat-accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, A.; Divsalar, D.; Yao, K.

    2004-01-01

    In this paper, the performance of accumulate-repeat-accumulate (ARA) codes with maximum-likelihood (ML) decoding is analyzed and compared to random codes by means of very tight bounds. Some simple ARA codes are shown to perform very close to the Shannon limit with ML decoding.

  8. Collinear Latent Variables in Multilevel Confirmatory Factor Analysis: A Comparison of Maximum Likelihood and Bayesian Estimations.

    PubMed

    Can, Seda; van de Schoot, Rens; Hox, Joop

    2015-06-01

    Because variables may be correlated in the social and behavioral sciences, multicollinearity might be problematic. This study investigates the effect of collinearity manipulated at the within and between levels of a two-level confirmatory factor analysis by Monte Carlo simulation. Furthermore, the influence of the size of the intraclass correlation coefficient (ICC) and of the estimation method (maximum likelihood estimation with robust chi-squares and standard errors versus Bayesian estimation) on the convergence rate is investigated. The other variables of interest were the rate of inadmissible solutions and the relative parameter and standard error bias at the between level. The results showed that inadmissible solutions were obtained when there was between-level collinearity and the estimation method was maximum likelihood. In the within-level collinearity condition, all of the solutions were admissible, but the bias values were higher compared with the between-level collinearity condition. Bayesian estimation appeared to be robust in obtaining admissible parameters, but the relative bias was higher than for maximum likelihood estimation. Finally, as expected, high ICC produced less biased results compared to medium ICC conditions.

  9. Neutron Tomography of a Fuel Cell: Statistical Learning Implementation of a Penalized Likelihood Method

    NASA Astrophysics Data System (ADS)

    Coakley, Kevin J.; Vecchia, Dominic F.; Hussey, Daniel S.; Jacobson, David L.

    2013-10-01

    At the NIST Neutron Imaging Facility, we collect neutron projection data for both the dry and wet states of a Proton-Exchange-Membrane (PEM) fuel cell. Transmitted thermal neutrons captured in a scintillator doped with lithium-6 produce scintillation light that is detected by an amorphous silicon detector. Based on joint analysis of the dry and wet state projection data, we reconstruct a residual neutron attenuation image with a penalized likelihood method using an edge-preserving Huber penalty function, whose two parameters control how well jumps in the reconstruction are preserved and how well noisy fluctuations are smoothed out. The choice of these parameters greatly influences the resulting reconstruction. We present a data-driven method that objectively selects these parameters, and study its performance for both simulated and experimental data. Before reconstruction, we transform the projection data so that the variance-to-mean ratio is approximately one. For both simulated and measured projection data, the penalized likelihood reconstruction is visually sharper than a reconstruction yielded by a standard Filtered Back Projection method. In an idealized simulation experiment, we demonstrate that the cross-validation procedure selects regularization parameters that yield a reconstruction that is nearly optimal according to a root-mean-square prediction error criterion.
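
    The edge-preserving Huber penalty named here is a standard construction: quadratic near zero (smoothing noise) and linear in the tails (preserving jumps). A sketch of the penalty and of how it enters a penalized-likelihood objective; the function and parameter names are illustrative, not the authors' implementation:

        import numpy as np

        def huber_penalty(diff, delta):
            # Quadratic for |diff| <= delta (smooths noisy fluctuations),
            # linear beyond delta (preserves jumps/edges).
            a = np.abs(diff)
            return np.where(a <= delta, 0.5 * diff ** 2, delta * (a - 0.5 * delta))

        def penalized_objective(image, data_neg_loglike, beta, delta):
            # Negative penalized log-likelihood: data term plus beta-weighted
            # Huber roughness over horizontal and vertical neighbor differences.
            rough = (huber_penalty(np.diff(image, axis=0), delta).sum()
                     + huber_penalty(np.diff(image, axis=1), delta).sum())
            return data_neg_loglike(image) + beta * rough

        print(huber_penalty(np.array([-2.0, 0.1, 3.0]), delta=1.0))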

  10. Design/analysis of the JWST ISIM bonded joints for survivability at cryogenic temperatures

    NASA Astrophysics Data System (ADS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    2005-08-01

    A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite adhesively bonded joints at the cryogenic temperature of 30K (-405°F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings and composite gusset/clip joints all bonded to hybrid composite tubes (75mm square) made with M55J/954-6 and T300/954-6 prepregs. Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty in testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on developed techniques and properties.

  11. Thermographic Analysis of Stress Distribution in Welded Joints

    NASA Astrophysics Data System (ADS)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, irregular weld surface and weld toe radius, is quite complex, so the local (structural) stress concept has been adopted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values of stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined using three different methods: strain gauge measurements, thermal stress analysis and FEM. The obtained results show good agreement: TSA confirmed both the FEM model and the stresses measured by strain gauges. According to these results, TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, and can help to develop more accurate numerical tools for fatigue life prediction.

  12. The likelihood ratio as a random variable for linked markers in kinship analysis.

    PubMed

    Egeland, Thore; Slooten, Klaas

    2016-11-01

    The likelihood ratio is the fundamental quantity that summarizes the evidence in forensic cases. Therefore, it is important to understand the theoretical properties of this statistic. This paper is the last in a series of three, and the first to study linked markers. We show that for all non-inbred pairwise kinship comparisons, the expected likelihood ratio in favor of a type of relatedness depends on the allele frequencies only via the number of alleles, also for linked markers, and also if the true relationship is another one than is tested for by the likelihood ratio. Exact expressions for the expectation and variance are derived for all these cases. Furthermore, we show that the expected likelihood ratio is a non-increasing function of the recombination rate between 0 and 0.5 when the actual relationship is the one investigated by the LR. Besides being of theoretical interest, exact expressions such as those obtained here can be used for software validation, as they allow one to verify correctness to arbitrary precision. The paper also presents results and advice of practical importance. For example, we argue that the logarithm of the likelihood ratio behaves in a fundamentally different way than the likelihood ratio itself in terms of expectation and variance, in agreement with its interpretation as weight of evidence. Equipped with the results presented and freely available software, one may check calculations and software and also perform power calculations.

  13. Evaluation of traction stirrup distraction technique to increase the joint space of the shoulder joint in the dog: A cadaveric study.

    PubMed

    Devesa, V; Rovesti, G L; Urrutia, P G; Sanroman, F; Rodriguez-Quiros, J

    2015-06-01

    The objective of this study was to evaluate the technical feasibility and efficacy of a joint distraction technique by traction stirrup to facilitate shoulder arthroscopy and to assess potential soft tissue damage. Twenty shoulders were evaluated radiographically before distraction. Distraction was applied with loads from 40 N up to 200 N, in 40 N increments, and the joint space was recorded at each step by radiographic images. The effects of joint flexion and intra-articular air injection at maximum load were evaluated. Radiographic evaluation was performed after distraction to evaluate ensuing joint laxity. Joint distraction by the traction stirrup technique produces a significant increase in the joint space; an increase in joint laxity could not be inferred from standard and stress radiographs. However, further clinical studies are required to evaluate potential neurovascular complications. A wider joint space may be useful to facilitate arthroscopy, reducing the likelihood of iatrogenic damage to intra-articular structures.

  14. An analysis of crash likelihood : age versus driving experience

    DOT National Transportation Integrated Search

    1995-05-01

    The study was designed to determine the crash likelihood of drivers in Michigan as a function of two independent variables: driver age and driving experience. The age variable had eight levels (18, 19, 20, 21, 22, 23, 24, and 25 years old) and the ex...

  15. Design and Performance Analysis of a new Rotary Hydraulic Joint

    NASA Astrophysics Data System (ADS)

    Feng, Yong; Yang, Junhong; Shang, Jianzhong; Wang, Zhuo; Fang, Delei

    2017-07-01

    To improve the driving torque of robot joints, a wobble plate hydraulic joint is proposed, and its structure and working principle are described. Mathematical models of the kinematics and dynamics were then established, and on this basis dynamic simulation and characteristic analysis were carried out. Results show that the motion curve of the joint is continuous and the impact is small. Moreover, the joint, which is characterized by a simple structure and easy processing, delivers a large output torque and can rotate continuously.

  16. Experimental and failure analysis of the prosthetic finger joint implants

    NASA Astrophysics Data System (ADS)

    Naidu, Sanjiv H.

    Small joint replacement arthroplasty of the hand is a well-accepted surgical procedure to restore function and cosmesis in an individual with a crippled hand. Silicone elastomers have been used as prosthetic material in various small hand joints for well over three decades. Although the clinical science aspects of silicone elastomer failure are well known, the physical science aspects of prosthetic failure are scant and vague. In this thesis, experimental and failure analyses of silicone finger joints are presented, using both an animal model and retrieved specimens that failed in human service. Fractured surfaces of retrieved silicone trapezial implants and silicone finger joint implants were studied with both FESEM and SEM; the mode of failure for the silicone trapezium is wear polishing, whereas the finger joint implants failed either by fatigue fracture or tearing of the elastomer, or a combination of both. Thermal analysis revealed that the retrieved elastomer implants maintained their viscoelastic properties throughout the service period. In order to provide a more functional and physiologic arthroplasty, a novel finger joint (Rolamite prosthesis) is proposed using more recently developed thermoplastic polymers. The thesis also addresses the outcome of experimental studies of the Rolamite prosthesis in a rabbit animal model, in addition to the failure analysis of the thermoplastic polymers while in service in an in vivo synovial environment. Results from retrieved Rolamite specimens suggest that the use of thermoplastic elastomers, such as block copolymer based elastomers, in a synovial environment such as a mammalian joint may well be limited.

  17. Controlling the joint local false discovery rate is more powerful than meta-analysis methods in joint analysis of summary statistics from multiple genome-wide association studies.

    PubMed

    Jiang, Wei; Yu, Weichuan

    2017-02-15

    In genome-wide association studies (GWASs) of common diseases/traits, we often analyze multiple GWASs with the same phenotype together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular to jointly analyze datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods. Also, our method discovers more associations than meta-analysis methods from empirical datasets of four phenotypes. The R package is available at http://bioinformatics.ust.hk/Jlfdr.html.

  18. Univariate and bivariate likelihood-based meta-analysis methods performed comparably when marginal sensitivity and specificity were the targets of inference.

    PubMed

    Dahabreh, Issa J; Trikalinos, Thomas A; Lau, Joseph; Schmid, Christopher H

    2017-03-01

    To compare statistical methods for meta-analysis of sensitivity and specificity of medical tests (e.g., diagnostic or screening tests), we constructed a database of PubMed-indexed meta-analyses of test performance from which 2 × 2 tables for each included study could be extracted. We reanalyzed the data using univariate and bivariate random effects models fit with inverse variance and maximum likelihood methods. Analyses were performed using both normal and binomial likelihoods to describe within-study variability. The bivariate model using the binomial likelihood was also fit using a fully Bayesian approach. We use two worked examples (thoracic computerized tomography to detect aortic injury, and rapid prescreening of Papanicolaou smears to detect cytological abnormalities) to highlight that different meta-analysis approaches can produce different results. We also present results from reanalysis of 308 meta-analyses of sensitivity and specificity. Models using the normal approximation produced sensitivity and specificity estimates closer to 50% and smaller standard errors compared to models using the binomial likelihood; absolute differences of 5% or greater were observed in 12% and 5% of meta-analyses for sensitivity and specificity, respectively. Results from univariate and bivariate random effects models were similar, regardless of estimation method. Maximum likelihood and Bayesian methods produced almost identical summary estimates under the bivariate model; however, Bayesian analyses indicated greater uncertainty around those estimates. Bivariate models produced imprecise estimates of the between-study correlation of sensitivity and specificity. Differences between methods were larger with increasing proportion of studies that were small or required a continuity correction. The binomial likelihood should be used to model within-study variability. Univariate and bivariate models give similar estimates of the marginal distributions for sensitivity and specificity.

  19. Maximum likelihood solution for inclination-only data in paleomagnetism

    NASA Astrophysics Data System (ADS)

    Arason, P.; Levi, S.

    2010-08-01

    We have developed a new robust maximum likelihood method for estimating the unbiased mean inclination from inclination-only data. In paleomagnetic analysis, the arithmetic mean of inclination-only data is known to introduce a shallowing bias. Several methods have been introduced to estimate the unbiased mean inclination of inclination-only data together with measures of the dispersion. Some inclination-only methods were designed to maximize the likelihood function of the marginal Fisher distribution. However, the exact analytical form of the maximum likelihood function is fairly complicated, and all the methods require various assumptions and approximations that are often inappropriate. For some steep and dispersed data sets, these methods provide estimates that are significantly displaced from the peak of the likelihood function towards systematically shallower inclinations. The problem of locating the maximum of the likelihood function is partly due to difficulties in accurately evaluating the function for all values of interest, because some elements of the likelihood function increase exponentially as precision parameters increase, leading to numerical instabilities. In this study, we succeeded in analytically cancelling exponential elements from the log-likelihood function, and we are now able to calculate its value anywhere in the parameter space and for any inclination-only data set. Furthermore, we can now calculate the partial derivatives of the log-likelihood function with the desired accuracy, and locate the maximum likelihood without the assumptions required by previous methods. To assess the reliability and accuracy of our method, we generated large numbers of random Fisher-distributed data sets, for which we calculated mean inclinations and precision parameters. The comparisons show that our new robust Arason-Levi maximum likelihood method is the most reliable, and the mean inclination estimates are the least biased towards shallow values.
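
    For illustration, the marginal Fisher density of an inclination I given mean inclination I0 and precision kappa has the known form f(I) = kappa/(2 sinh kappa) * cos I * exp(kappa sin I sin I0) * I0_Bessel(kappa cos I cos I0), and the exponential growth the authors describe can be cancelled by using an exponentially scaled Bessel function. A sketch under those assumptions (not the authors' code; the data are made up):

        import numpy as np
        from scipy.special import i0e
        from scipy.optimize import minimize

        def neg_loglike(params, inc):
            # Marginal Fisher log-likelihood of inclinations (radians), with the
            # exponentially growing Bessel and sinh terms cancelled analytically:
            # log I0(z) = log i0e(z) + |z|; log(2 sinh k) = k + log(1 - exp(-2k)).
            inc0, kappa = params
            z = kappa * np.cos(inc) * np.cos(inc0)
            ll = (np.log(kappa) - kappa - np.log1p(-np.exp(-2.0 * kappa))
                  + np.log(np.cos(inc)) + kappa * np.sin(inc) * np.sin(inc0)
                  + np.log(i0e(np.abs(z))) + np.abs(z))
            return -ll.sum()

        # Made-up steep, dispersed inclinations (degrees -> radians).
        inc = np.radians([72.0, 68.0, 75.0, 81.0, 64.0, 77.0, 70.0, 85.0])
        fit = minimize(neg_loglike, x0=[np.radians(70.0), 20.0], args=(inc,),
                       bounds=[(0.0, np.pi / 2), (1e-3, 1e4)])
        print(np.degrees(fit.x[0]), fit.x[1])  # mean inclination, precision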

  20. Model criticism based on likelihood-free inference, with an application to protein network evolution.

    PubMed

    Ratmann, Oliver; Andrieu, Christophe; Wiuf, Carsten; Richardson, Sylvia

    2009-06-30

    Mathematical models are an important tool to explain and comprehend complex phenomena, and unparalleled computational advances enable us to explore them easily with little or no understanding of their global properties. In fact, the likelihood of the data under complex stochastic models is often analytically or numerically intractable in many areas of science. This makes it even more important to simultaneously investigate the adequacy of these models (in absolute terms, against the data, rather than relative to the performance of other models), but no such procedure has been formally discussed when the likelihood is intractable. We provide a statistical interpretation to current developments in likelihood-free Bayesian inference that explicitly accounts for discrepancies between the model and the data, termed Approximate Bayesian Computation under model uncertainty (ABCμ). We augment the likelihood of the data with unknown error terms that correspond to freely chosen checking functions, and provide Monte Carlo strategies for sampling from the associated joint posterior distribution without the need to evaluate the likelihood. We discuss the benefit of incorporating model diagnostics within an ABC framework, and demonstrate how this method diagnoses model mismatch and guides model refinement by contrasting three qualitative models of protein network evolution to the protein interaction datasets of Helicobacter pylori and Treponema pallidum. Our results make a number of model deficiencies explicit, and suggest that the T. pallidum network topology is inconsistent with evolution dominated by link turnover or lateral gene transfer alone.
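
    As background, the simplest likelihood-free machinery is ABC rejection sampling; the sketch below illustrates only that generic idea (it is not the paper's ABCμ algorithm, which additionally models error terms for chosen checking functions). All distributions and tolerances are hypothetical:

        import numpy as np

        rng = np.random.default_rng(42)

        def simulate(theta, n=100):
            # Stand-in for a stochastic model whose likelihood is intractable.
            return rng.normal(theta, 1.0, n)

        observed = rng.normal(2.0, 1.0, 100)
        obs_stat = observed.mean()  # summary statistic

        # ABC rejection: keep prior draws whose simulated summary lands close
        # to the observed summary.
        accepted = [theta for theta in rng.uniform(-5.0, 5.0, 20000)
                    if abs(simulate(theta).mean() - obs_stat) < 0.1]
        print(len(accepted), np.mean(accepted), np.std(accepted))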

  1. Joint multifractal analysis based on wavelet leaders

    NASA Astrophysics Data System (ADS)

    Jiang, Zhi-Qiang; Yang, Yan-Hong; Wang, Gang-Jin; Zhou, Wei-Xing

    2017-12-01

    Mutually interacting components form complex systems and these components usually have long-range cross-correlated outputs. Using wavelet leaders, we propose a method for characterizing the joint multifractal nature of these long-range cross correlations; we call this method joint multifractal analysis based on wavelet leaders (MF-X-WL). We test the validity of the MF-X-WL method by performing extensive numerical experiments on dual binomial measures with multifractal cross correlations and bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. Both experiments indicate that MF-X-WL is capable of detecting cross correlations in synthetic data with acceptable estimating errors. We also apply the MF-X-WL method to pairs of series from financial markets (returns and volatilities) and online worlds (online numbers of different genders and different societies) and determine intriguing joint multifractal behavior.

  2. Joint Blind Source Separation by Multi-set Canonical Correlation Analysis

    PubMed Central

    Li, Yi-Ou; Adalı, Tülay; Wang, Wei; Calhoun, Vince D

    2009-01-01

    In this work, we introduce a simple and effective scheme to achieve joint blind source separation (BSS) of multiple datasets using multi-set canonical correlation analysis (M-CCA) [1]. We first propose a generative model of joint BSS based on the correlation of latent sources within and between datasets. We specify source separability conditions, and show that, when the conditions are satisfied, the group of corresponding sources from each dataset can be jointly extracted by M-CCA through maximization of correlation among the extracted sources. We compare source separation performance of the M-CCA scheme with other joint BSS methods and demonstrate the superior performance of the M-CCA scheme in achieving joint BSS for a large number of datasets, group of corresponding sources with heterogeneous correlation values, and complex-valued sources with circular and non-circular distributions. We apply M-CCA to analysis of functional magnetic resonance imaging (fMRI) data from multiple subjects and show its utility in estimating meaningful brain activations from a visuomotor task. PMID:20221319
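
    For the special case of two datasets, M-CCA reduces to classical CCA, so the flavor of the approach can be sketched with standard tooling; the synthetic data below are illustrative only and this is not the authors' M-CCA implementation:

        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(0)
        n = 500
        latent = rng.normal(size=(n, 2))  # sources shared across datasets
        X = latent @ rng.normal(size=(2, 10)) + 0.5 * rng.normal(size=(n, 10))
        Y = latent @ rng.normal(size=(2, 8)) + 0.5 * rng.normal(size=(n, 8))

        cca = CCA(n_components=2).fit(X, Y)
        Xc, Yc = cca.transform(X, Y)
        for k in range(2):  # corresponding extracted sources correlate strongly
            print(np.corrcoef(Xc[:, k], Yc[:, k])[0, 1])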

  3. A maximum likelihood analysis of the CoGeNT public dataset

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelso, Chris, E-mail: ckelso@unf.edu

    The CoGeNT detector, located in the Soudan Underground Laboratory in northern Minnesota, consists of a 475 gram (330 gram fiducial mass) p-type point-contact germanium detector that measures the ionization charge created by nuclear recoils. This detector has searched for recoils created by dark matter since December 2009. We analyze the public dataset from the CoGeNT experiment to search for evidence of dark matter interactions with the detector. We perform an unbinned maximum likelihood fit to the data and compare the significance of different WIMP hypotheses relative to each other and to the null hypothesis of no WIMP interactions. This work presents the current status of the analysis.
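
    An unbinned maximum-likelihood fit of this general kind sums log-densities over individual events and compares hypotheses via a likelihood ratio. A generic toy sketch (exponential background plus a Gaussian signal; all distributions and parameters are illustrative, not the CoGeNT analysis):

        import numpy as np
        from scipy.stats import expon, norm
        from scipy.optimize import minimize

        rng = np.random.default_rng(7)
        # Toy "event energies": exponential background plus a small Gaussian bump.
        events = np.concatenate([expon(scale=2.0).rvs(900, random_state=rng),
                                 norm(3.0, 0.3).rvs(60, random_state=rng)])

        def neg_loglike(params):
            # Unbinned negative log-likelihood of a signal-fraction mixture.
            frac, loc, scale = params
            pdf = (1 - frac) * expon(scale=2.0).pdf(events) + frac * norm(loc, scale).pdf(events)
            return -np.sum(np.log(pdf))

        fit = minimize(neg_loglike, x0=[0.05, 3.0, 0.3],
                       bounds=[(1e-4, 0.5), (0.5, 8.0), (0.05, 2.0)])
        null = neg_loglike([1e-9, 3.0, 0.3])     # essentially background-only
        print(fit.x, 2.0 * (null - fit.fun))     # fit and likelihood-ratio statistic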

  4. Joint Sequence Analysis: Association and Clustering

    ERIC Educational Resources Information Center

    Piccarreta, Raffaella

    2017-01-01

    In its standard formulation, sequence analysis aims at finding typical patterns in a set of life courses represented as sequences. Recently, some proposals have been introduced to jointly analyze sequences defined on different domains (e.g., work career, partnership, and parental histories). We introduce measures to evaluate whether a set of…

  5. Specifying the core network supporting episodic simulation and episodic memory by activation likelihood estimation.

    PubMed

    Benoit, Roland G; Schacter, Daniel L

    2015-08-01

    It has been suggested that the simulation of hypothetical episodes and the recollection of past episodes are supported by fundamentally the same set of brain regions. The present article specifies this core network via Activation Likelihood Estimation (ALE). Specifically, a first meta-analysis revealed joint engagement of expected core-network regions during episodic memory and episodic simulation. These include parts of the medial surface, the hippocampus and parahippocampal cortex within the medial temporal lobes, and the temporal and inferior posterior parietal cortices on the lateral surface. Both capacities also jointly recruited additional regions such as parts of the bilateral dorsolateral prefrontal cortex. All of these core regions overlapped with the default network. Moreover, it has further been suggested that episodic simulation may require a stronger engagement of some of the core network's nodes as well as the recruitment of additional brain regions supporting control functions. A second ALE meta-analysis indeed identified such regions that were consistently more strongly engaged during episodic simulation than episodic memory. These comprised the core-network clusters located in the left dorsolateral prefrontal cortex and posterior inferior parietal lobe and other structures distributed broadly across the default and fronto-parietal control networks. Together, the analyses determine the set of brain regions that allow us to experience past and hypothetical episodes, thus providing an important foundation for studying the regions' specialized contributions and interactions.

  6. Multiset singular value decomposition for joint analysis of multi-modal data: application to fingerprint analysis

    NASA Astrophysics Data System (ADS)

    Emge, Darren K.; Adalı, Tülay

    2014-06-01

    As the availability and use of imaging methodologies continue to increase, there is a fundamental need to jointly analyze data collected from multiple modalities. This analysis is further complicated when the size or resolution of the images differs, implying that the observation lengths of the modalities can vary widely. To address this expanding landscape, we introduce the multiset singular value decomposition (MSVD), which can perform a joint analysis on any number of modalities regardless of their individual observation lengths. Simulations show the inter-modal relationships across the different modalities that are revealed by the MSVD. We apply the MSVD to forensic fingerprint analysis, showing that MSVD joint analysis successfully identifies relevant similarities for further analysis, significantly reducing the processing time required. This reduction takes the technique from a laboratory method to a useful forensic tool with applications across the law enforcement and security regimes.

  7. Joint analysis of BICEP2/keck array and Planck Data.

    PubMed

    Ade, P A R; Aghanim, N; Ahmed, Z; Aikin, R W; Alexander, K D; Arnaud, M; Aumont, J; Baccigalupi, C; Banday, A J; Barkats, D; Barreiro, R B; Bartlett, J G; Bartolo, N; Battaner, E; Benabed, K; Benoît, A; Benoit-Lévy, A; Benton, S J; Bernard, J-P; Bersanelli, M; Bielewicz, P; Bischoff, C A; Bock, J J; Bonaldi, A; Bonavera, L; Bond, J R; Borrill, J; Bouchet, F R; Boulanger, F; Brevik, J A; Bucher, M; Buder, I; Bullock, E; Burigana, C; Butler, R C; Buza, V; Calabrese, E; Cardoso, J-F; Catalano, A; Challinor, A; Chary, R-R; Chiang, H C; Christensen, P R; Colombo, L P L; Combet, C; Connors, J; Couchot, F; Coulais, A; Crill, B P; Curto, A; Cuttaia, F; Danese, L; Davies, R D; Davis, R J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Delouis, J-M; Désert, F-X; Dickinson, C; Diego, J M; Dole, H; Donzelli, S; Doré, O; Douspis, M; Dowell, C D; Duband, L; Ducout, A; Dunkley, J; Dupac, X; Dvorkin, C; Efstathiou, G; Elsner, F; Enßlin, T A; Eriksen, H K; Falgarone, E; Filippini, J P; Finelli, F; Fliescher, S; Forni, O; Frailis, M; Fraisse, A A; Franceschi, E; Frejsel, A; Galeotta, S; Galli, S; Ganga, K; Ghosh, T; Giard, M; Gjerløw, E; Golwala, S R; González-Nuevo, J; Górski, K M; Gratton, S; Gregorio, A; Gruppuso, A; Gudmundsson, J E; Halpern, M; Hansen, F K; Hanson, D; Harrison, D L; Hasselfield, M; Helou, G; Henrot-Versillé, S; Herranz, D; Hildebrandt, S R; Hilton, G C; Hivon, E; Hobson, M; Holmes, W A; Hovest, W; Hristov, V V; Huffenberger, K M; Hui, H; Hurier, G; Irwin, K D; Jaffe, A H; Jaffe, T R; Jewell, J; Jones, W C; Juvela, M; Karakci, A; Karkare, K S; Kaufman, J P; Keating, B G; Kefeli, S; Keihänen, E; Kernasovskiy, S A; Keskitalo, R; Kisner, T S; Kneissl, R; Knoche, J; Knox, L; Kovac, J M; Krachmalnicoff, N; Kunz, M; Kuo, C L; Kurki-Suonio, H; Lagache, G; Lähteenmäki, A; Lamarre, J-M; Lasenby, A; Lattanzi, M; Lawrence, C R; Leitch, E M; Leonardi, R; Levrier, F; Lewis, A; Liguori, M; Lilje, P B; Linden-Vørnle, M; López-Caniego, M; Lubin, P M; Lueker, M; Macías-Pérez, J F; Maffei, B; Maino, D; Mandolesi, N; Mangilli, A; Maris, M; Martin, P G; Martínez-González, E; Masi, S; Mason, P; Matarrese, S; Megerian, K G; Meinhold, P R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Miville-Deschênes, M-A; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Netterfield, C B; Nguyen, H T; Nørgaard-Nielsen, H U; Noviello, F; Novikov, D; Novikov, I; O'Brient, R; Ogburn, R W; Orlando, A; Pagano, L; Pajot, F; Paladini, R; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Pearson, T J; Perdereau, O; Perotto, L; Pettorino, V; Piacentini, F; Piat, M; Pietrobon, D; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Pratt, G W; Prunet, S; Pryke, C; Puget, J-L; Rachen, J P; Reach, W T; Rebolo, R; Reinecke, M; Remazeilles, M; Renault, C; Renzi, A; Richter, S; Ristorcelli, I; Rocha, G; Rossetti, M; Roudier, G; Rowan-Robinson, M; Rubiño-Martín, J A; Rusholme, B; Sandri, M; Santos, D; Savelainen, M; Savini, G; Schwarz, R; Scott, D; Seiffert, M D; Sheehy, C D; Spencer, L D; Staniszewski, Z K; Stolyarov, V; Sudiwala, R; Sunyaev, R; Sutton, D; Suur-Uski, A-S; Sygnet, J-F; Tauber, J A; Teply, G P; Terenzi, L; Thompson, K L; Toffolatti, L; Tolan, J E; Tomasi, M; Tristram, M; Tucci, M; Turner, A D; Valenziano, L; Valiviita, J; Van Tent, B; Vibert, L; Vielva, P; Vieregg, A G; Villa, F; Wade, L A; Wandelt, B D; Watson, R; Weber, A C; Wehus, I K; White, M; White, S D M; Willmert, J; Wong, C L; Yoon, K W; Yvon, D; Zacchei, A; Zonca, A

    2015-03-13

    We report the results of a joint analysis of data from BICEP2/Keck Array and Planck. BICEP2 and Keck Array have observed the same approximately 400  deg^{2} patch of sky centered on RA 0 h, Dec. -57.5°. The combined maps reach a depth of 57 nK deg in Stokes Q and U in a band centered at 150 GHz. Planck has observed the full sky in polarization at seven frequencies from 30 to 353 GHz, but much less deeply in any given region (1.2  μK deg in Q and U at 143 GHz). We detect 150×353 cross-correlation in B modes at high significance. We fit the single- and cross-frequency power spectra at frequencies ≥150  GHz to a lensed-ΛCDM model that includes dust and a possible contribution from inflationary gravitational waves (as parametrized by the tensor-to-scalar ratio r), using a prior on the frequency spectral behavior of polarized dust emission from previous Planck analysis of other regions of the sky. We find strong evidence for dust and no statistically significant evidence for tensor modes. We probe various model variations and extensions, including adding a synchrotron component in combination with lower frequency data, and find that these make little difference to the r constraint. Finally, we present an alternative analysis which is similar to a map-based cleaning of the dust contribution, and show that this gives similar constraints. The final result is expressed as a likelihood curve for r, and yields an upper limit r_{0.05}<0.12 at 95% confidence. Marginalizing over dust and r, lensing B modes are detected at 7.0σ significance.

  8. Experimental measurement and modeling analysis on mechanical properties of incudostapedial joint

    PubMed Central

    Zhang, Xiangming

    2011-01-01

    The incudostapedial (IS) joint between the incus and stapes is a synovial joint consisting of joint capsule, cartilage, and synovial fluid. The mechanical properties of the IS joint directly affect the middle ear transfer function for sound transmission. However, due to the complexity and small size of the joint, the mechanical properties of the IS joint have not been reported in the literature. In this paper, we report our current study on mechanical properties of human IS joint using both experimental measurement and finite element (FE) modeling analysis. Eight IS joint samples with the incus and stapes attached were harvested from human cadaver temporal bones. Tension, compression, stress relaxation and failure tests were performed on those samples in a micro-material testing system. An analytical approach with the hyperelastic Ogden model and a 3D FE model of the IS joint including the cartilage, joint capsule, and synovial fluid were employed to derive mechanical parameters of the IS joint. The comparison of measurements and modeling results reveals the relationship between the mechanical properties and structure of the IS joint. PMID:21061141

  9. Experimental measurement and modeling analysis on mechanical properties of incudostapedial joint.

    PubMed

    Zhang, Xiangming; Gan, Rong Z

    2011-10-01

    The incudostapedial (IS) joint between the incus and stapes is a synovial joint consisting of joint capsule, cartilage, and synovial fluid. The mechanical properties of the IS joint directly affect the middle ear transfer function for sound transmission. However, due to the complexity and small size of the joint, the mechanical properties of the IS joint have not been reported in the literature. In this paper, we report our current study on mechanical properties of human IS joint using both experimental measurement and finite element (FE) modeling analysis. Eight IS joint samples with the incus and stapes attached were harvested from human cadaver temporal bones. Tension, compression, stress relaxation and failure tests were performed on those samples in a micro-material testing system. An analytical approach with the hyperelastic Ogden model and a 3D FE model of the IS joint including the cartilage, joint capsule, and synovial fluid were employed to derive mechanical parameters of the IS joint. The comparison of measurements and modeling results reveals the relationship between the mechanical properties and structure of the IS joint.
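
    The hyperelastic Ogden model used in the analytical approach has a standard closed form for uniaxial loading of an incompressible material; a sketch with hypothetical constants (illustrative only, not the paper's fitted IS-joint values):

        import numpy as np

        def ogden_uniaxial_cauchy(stretch, mu, alpha):
            # One-term incompressible Ogden model under uniaxial tension:
            # sigma = mu * (lambda**alpha - lambda**(-alpha/2)).
            lam = np.asarray(stretch, float)
            return mu * (lam ** alpha - lam ** (-alpha / 2.0))

        mu, alpha = 0.05e6, 9.0  # Pa and dimensionless; illustrative only
        for lam in (1.0, 1.05, 1.1, 1.2):
            print(lam, ogden_uniaxial_cauchy(lam, mu, alpha))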

  10. Analysis of Bonded Joints Between the Facesheet and Flange of Corrugated Composite Panels

    NASA Technical Reports Server (NTRS)

    Yarrington, Phillip W.; Collier, Craig S.; Bednarcyk, Brett A.

    2008-01-01

    This paper outlines a method for the stress analysis of bonded composite corrugated panel facesheet to flange joints. The method relies on the existing HyperSizer Joints software, which analyzes the bonded joint, along with a beam analogy model that provides the necessary boundary loading conditions to the joint analysis. The method is capable of predicting the full multiaxial stress and strain fields within the flange to facesheet joint and thus can determine ply-level margins and evaluate delamination. Results comparing the method to NASTRAN finite element model stress fields are provided illustrating the accuracy of the method.

  11. Finite Element Analysis of the Maximum Stress at the Joints of the Transmission Tower

    NASA Astrophysics Data System (ADS)

    Itam, Zarina; Beddu, Salmia; Liyana Mohd Kamal, Nur; Bamashmos, Khaled H.

    2016-03-01

    Transmission towers are tall structures, usually steel lattice towers, used to support an overhead power line. Usually, transmission towers are analyzed as frame-truss systems, and the members are assumed to be pin-connected without explicitly considering the effects of joints on the tower behavior. In this research, an engineering example of a joint is analyzed with its detailing taken into account, to investigate how this affects the tower analysis. A static analysis using STAAD Pro was conducted to identify the joint with the maximum stress. This joint was then explicitly analyzed in ANSYS using the finite element method. Three approaches were used in the software: a simple plate model, bonded contact with no bolts, and beam-element bolts. Results from the joint analysis show that stress values increase when joint detailing is considered. This demonstrates that joints and connections play an important role in the distribution of stress within the transmission tower.

  12. Evaluation of joint findings with gait analysis in children with hemophilia.

    PubMed

    Cayir, Atilla; Yavuzer, Gunes; Sayli, Revide Tülin; Gurcay, Eda; Culha, Vildan; Bozkurt, Murat

    2014-01-01

    Hemophilic arthropathy due to recurrent joint bleeding leads to physical, psychological and socioeconomic problems in children with hemophilia and reduces their quality of life. The purpose of this study was to evaluate joint damage through various parameters and to determine functional deterioration in the musculoskeletal system during walking using kinetic and kinematic gait analysis. Physical examination and kinetic and kinematic gait analysis findings of 19 hemophilic patients aged 7-20 years were compared with those of age, sex and leg length matched controls. Stride time was longer in the hemophilia group (p=0.001) compared to the age matched healthy control group, while hip, knee and ankle joint rotation angles were more limited (p=0.001, p=0.035 and p=0.001, respectively). In the hemophilia group, the extensor moment of the knee joint in the stance phase was less than that in the control group (p=0.001). Stride time was longer in the severe hemophilia group compared to the mild-moderate hemophilia and control groups (p=0.011 and p=0.001, respectively). Rotation angle of the ankle was wider in the control group compared to the other two groups (p=0.001 for both). Rotation angle of the ankle joint was narrower in the severe hemophilia group compared to the others (p=0.001 for each). Extensor moment of the knee joint was greater in the control group compared to the other two groups (p=0.003 and p=0.001, respectively). Walking velocity was higher in the control group compared to the severe hemophilia group. Kinetic and kinematic gait analysis has the sensitivity to detect minimal changes in biomechanical parameters. Gait analysis can be used as a reliable method to detect early joint damage.

  13. A probabilistic assessment of the likelihood of vegetation drought under varying climate conditions across China.

    PubMed

    Liu, Zhiyong; Li, Chao; Zhou, Ping; Chen, Xiuzhi

    2016-10-07

    Climate change significantly impacts vegetation growth and terrestrial ecosystems. Using satellite remote sensing observations, here we focus on investigating vegetation dynamics and the likelihood of vegetation-related drought under varying climate conditions across China. We first compare temporal trends of the Normalized Difference Vegetation Index (NDVI) and climatic variables over China. We find that there is in fact no significant change in vegetation over the cold regions where warming is significant. Then, we propose a joint probability model to estimate the likelihood of vegetation-related drought conditioned on different precipitation/temperature scenarios in the growing season across China. To the best of our knowledge, this study is the first to examine vegetation-related drought risk over China from a joint probability perspective. Our results demonstrate risk patterns of vegetation-related drought under both low and high precipitation/temperature conditions. We further identify the variations in vegetation-related drought risk under different climate conditions and the sensitivity of drought risk to climate variability. These findings provide insights for decision makers to evaluate vegetation-related drought risk and develop drought mitigation strategies over China in a warming world. The proposed methodology also has great potential to be applied to vegetation-related drought risk assessment in other regions worldwide.
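
    One way to realize a joint probability model of this general kind is a Gaussian copula linking a climate variable and a vegetation index, from which conditional drought probabilities follow in closed form. A sketch with synthetic data (the paper's actual model may differ; all series and thresholds are hypothetical):

        import numpy as np
        from scipy.stats import norm, rankdata

        rng = np.random.default_rng(3)
        # Synthetic growing-season series: precipitation and NDVI, positively related.
        precip = rng.gamma(4.0, 25.0, 300)
        ndvi = 0.3 + 0.002 * precip + rng.normal(0.0, 0.05, 300)

        def normal_scores(x):
            # Empirical Gaussian copula: map each margin to normal scores by rank.
            return norm.ppf(rankdata(x) / (len(x) + 1.0))

        rho = np.corrcoef(normal_scores(precip), normal_scores(ndvi))[0, 1]

        # P(NDVI below its 20th percentile | precipitation at its 10th percentile):
        z_p = norm.ppf(0.10)
        z_n = norm.ppf(0.20)
        cond = norm.cdf((z_n - rho * z_p) / np.sqrt(1.0 - rho ** 2))
        print(rho, cond)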

  14. A probabilistic assessment of the likelihood of vegetation drought under varying climate conditions across China

    PubMed Central

    Liu, Zhiyong; Li, Chao; Zhou, Ping; Chen, Xiuzhi

    2016-01-01

    Climate change significantly impacts vegetation growth and terrestrial ecosystems. Using satellite remote sensing observations, here we focus on investigating vegetation dynamics and the likelihood of vegetation-related drought under varying climate conditions across China. We first compare temporal trends of the Normalized Difference Vegetation Index (NDVI) and climatic variables over China. We find that there is in fact no significant change in vegetation over the cold regions where warming is significant. Then, we propose a joint probability model to estimate the likelihood of vegetation-related drought conditioned on different precipitation/temperature scenarios in the growing season across China. To the best of our knowledge, this study is the first to examine vegetation-related drought risk over China from a joint probability perspective. Our results demonstrate risk patterns of vegetation-related drought under both low and high precipitation/temperature conditions. We further identify the variations in vegetation-related drought risk under different climate conditions and the sensitivity of drought risk to climate variability. These findings provide insights for decision makers to evaluate vegetation-related drought risk and develop drought mitigation strategies over China in a warming world. The proposed methodology also has great potential to be applied to vegetation-related drought risk assessment in other regions worldwide. PMID:27713530

  15. Stress analysis of bolted joints under centrifugal force

    NASA Astrophysics Data System (ADS)

    Imura, Makoto; Iizuka, Motonobu; Nakae, Shigeki; Mori, Takeshi; Koyama, Takayuki

    2014-06-01

    Our objective is to develop a long-life rotary machine for synchronous generators and motors. To do this, it is necessary to design a high-strength bolted joint, which is responsible for fixing a salient pole on a rotor shaft. While the rotary machine is in operation, the bolted joint is loaded not only by centrifugal force but also by a moment, because the point of load application is eccentric to the centre of the bolt. We applied the theory proposed in VDI2230-Blatt1 to evaluate the bolted joint under eccentric force, estimated the limiting centrifugal force, which causes partial separation between the pole and the rotor shaft, and then evaluated the additional tension of the bolt after partial separation has occurred. We analyzed the bolted joint by FEM and defined the load introduction factor for that case. Additionally, we investigated the effect of the variation of bolt preload on the partial separation. We carried out a full-scale experiment with a prototype rotor to reveal the variation of bolt preload with tightening torque. After that, we verified the limiting centrifugal force and the strength of the bolted joint by the VDI2230-Blatt1 theory and FEM analysis, considering the variation of bolt preload. Finally, we could design a high-strength bolted joint verified by the theoretical study and FEM analysis.

  16. Planning and conducting medical support to joint operations.

    PubMed

    Hughes, A S

    2000-01-01

    Operations are core business for all of us and the PJHQ medical cell is at the heart of this process. With the likelihood of a continuing UK presence in the Balkans for some time to come, the challenge of meeting this and any other new operational commitments will continue to demand a flexible and innovative approach from all concerned. These challenges together with the Joint and multinational aspects of the job make the PJHQ medical cell a demanding but rewarding place to work and provide a valuable Joint staff training opportunity for the RNMS.

  17. Dynamic analysis of clamp band joint system subjected to axial vibration

    NASA Astrophysics Data System (ADS)

    Qin, Z. Y.; Yan, S. Z.; Chu, F. L.

    2010-10-01

    Clamp band joints are commonly used for connecting circular components together in industry. Some of the systems joined by clamp bands are subjected to dynamic loads. However, very little research on the dynamic characteristics of this kind of joint can be found in the literature. In this paper, a dynamic model for a clamp band joint system is developed. Contact and frictional slip between the components are accommodated in this model. Nonlinear finite element analysis is conducted to identify the model parameters. Static experiments are then carried out on a scaled model of the clamp band joint to validate the joint model. Finally, the model is adopted to study the dynamic characteristics of the clamp band joint system subjected to axial harmonic excitation, and the effects of the wedge angle of the clamp band joint and the preload on the response. The model proposed in this paper can represent the nonlinearity of the clamp band joint and can be used conveniently to investigate the effects of structural and loading parameters on the dynamic characteristics of this type of joint system.

  18. Coupled Viscous Fluid Flow and Joint Deformation Analysis for Grout Injection in a Rock Joint

    NASA Astrophysics Data System (ADS)

    Kim, Hyung-Mok; Lee, Jong-Won; Yazdani, Mahmoud; Tohidi, Elham; Nejati, Hamid Reza; Park, Eui-Seob

    2018-02-01

    Fluid flow modeling is a major area of interest within the field of rock mechanics. The main objective of this study is to gain insight into the performance of grout injection inside jointed rock masses by numerical modeling of grout flow through a single rock joint. Grout flow has been widely simulated as a non-Newtonian Bingham fluid characterized by two main parameters, dynamic viscosity and shear yield strength, both of which are time dependent. The increase of these properties with injection time will evidently affect the parameters representing grouting performance, including grout penetration length and volumetric injection rate. In addition, through hydromechanical coupling, a mutual influence is anticipated between the injection pressure on the one side and the joint opening/closing behavior and aperture profile variation on the other. This can have a considerable impact on grout spread within the rock joints. In this study, based on the Bingham fluid model, a series of numerical analyses was conducted using UDEC to simulate the flow of viscous grout in a single rock joint with smooth parallel surfaces. In these analyses, the time-dependent evolution of the grout fluid properties and the hydromechanical coupling were considered to investigate their impact on grouting performance. In order to verify the validity of these simulations, the results, including the grout penetration length and the injection flow rate, were compared with a well-known analytical solution that is available for the simple case of constant grout properties and non-coupled hydraulic analysis. The comparison demonstrated that the grout penetration length can be overestimated when the time-dependent hardening of the grout material is not considered. Moreover, due to the HM coupling, it was shown that the joint opening induced by injection pressure may have a considerable increasing effect on the values of penetration length and
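
    As a point of reference for the analytical comparison mentioned above, the classical plug-flow bound for a Bingham grout in a smooth parallel fracture stops penetration where the driving pressure is balanced by the yield stress. A hedged sketch with assumed numbers:

        # Ultimate penetration of a Bingham grout in a parallel-plate fracture:
        # I_max = dp * b / (2 * tau0); constant grout properties, no HM coupling.
        dp = 1.0e6    # injection overpressure (Pa), assumed
        b = 0.5e-3    # joint aperture (m), assumed
        tau0 = 5.0    # grout yield stress (Pa), assumed

        I_max = dp * b / (2.0 * tau0)
        print(f"ultimate penetration length ~ {I_max:.0f} m")

    Time-hardening grout (yield stress and viscosity increasing with injection time) shortens the reachable length, which is why the study finds that neglecting it overestimates penetration.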

  19. Assessment of articular disc displacement of temporomandibular joint with ultrasound.

    PubMed

    Razek, Ahmed Abdel Khalek Abdel; Al Mahdy Al Belasy, Fouad; Ahmed, Wael Mohamed Said; Haggag, Mai Ahmed

    2015-06-01

    To assess the pattern of articular disc displacement in patients with internal derangement (ID) of the temporomandibular joint (TMJ) with ultrasound. A prospective study was conducted on 40 TMJs of 20 patients (3 male, 17 female; mean age 26.1 years) with ID of the TMJ. They underwent high-resolution ultrasound and MR imaging of the TMJ. The MR images were used as the gold standard for calculating sensitivity, specificity, accuracy, positive predictive value (PPV), negative predictive value (NPV), positive likelihood ratio (PLR), and negative likelihood ratio (NLR) of ultrasound for diagnosis of anterior or sideway displacement of the disc. An anteriorly displaced disc was seen in 26 joints at MR and 22 joints at ultrasound. The diagnostic efficacy of ultrasound for anterior displacement showed a sensitivity of 79.3 %, specificity of 72.7 %, accuracy of 77.5 %, PPV of 88.5 %, NPV of 57.1 %, PLR of 2.9 and NLR of 0.34. Sideway displacement of the disc was seen in four joints at MR and three joints at ultrasound. The diagnostic efficacy of ultrasound for sideway displacement showed a sensitivity of 75 %, specificity of 63.6 %, accuracy of 66.7 %, PPV of 42.8 %, NPV of 87.5 %, PLR of 2.06, and NLR of 0.39. We concluded that ultrasound is a non-invasive imaging modality suitable for assessment of anterior and sideway displacement of the articular disc in patients with ID of the TMJ.

  20. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    PubMed

    Xu, Maoqi; Chen, Liang

    2018-01-01

    The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook this substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice analysis power, although they are distribution-free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about the read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
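
    To give a flavour of the likelihood machinery ELTSeq builds on, here is a minimal one-sample empirical likelihood ratio test for a mean (Owen-style). It is only a sketch: the actual ELTSeq model adds a mean-variance constraint tailored to RNA-seq read counts, which is not reproduced here.

        import numpy as np
        from scipy.optimize import brentq
        from scipy.stats import chi2

        def el_test_mean(x, mu0):
            """-2 log empirical likelihood ratio for H0: E[X] = mu0."""
            d = np.asarray(x, float) - mu0
            if d.min() >= 0 or d.max() <= 0:
                return np.inf, 0.0                  # mu0 outside the convex hull of the data
            eps = 1e-9
            lo, hi = -1.0 / d.max() + eps, -1.0 / d.min() - eps
            lam = brentq(lambda l: np.mean(d / (1.0 + l * d)), lo, hi)  # Lagrange multiplier
            stat = 2.0 * np.sum(np.log1p(lam * d))  # asymptotically chi-square(1) under H0
            return stat, chi2.sf(stat, df=1)

        rng = np.random.default_rng(1)
        stat, p = el_test_mean(rng.exponential(2.0, 80), mu0=2.0)
        print(f"-2 log R = {stat:.2f}, p = {p:.3f}")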

  1. Acoustic emission analysis: A test method for metal joints bonded by adhesives

    NASA Technical Reports Server (NTRS)

    Brockmann, W.; Fischer, T.

    1978-01-01

    Acoustic emission analysis is applied to study adhesive joints which had been subjected to mechanical and climatic stresses, taking into account conditions which make results applicable to adhesive joints used in aerospace technology. Specimens consisting of the alloy AlMgSi0.5 were used together with a phenolic resin adhesive, an epoxy resin modified with a polyamide, and an epoxy resin modified with a nitrile. Results show that the acoustic emission analysis provides valuable information concerning the behavior of adhesive joints under load and climatic stresses.

  2. The Analysis of Adhesively Bonded Advanced Composite Joints Using Joint Finite Elements

    NASA Technical Reports Server (NTRS)

    Stapleton, Scott E.; Waas, Anthony M.

    2012-01-01

    The design and sizing of adhesively bonded joints has always been a major bottleneck in the design of composite vehicles. Dense finite element (FE) meshes are required to capture the full behavior of a joint numerically, but these dense meshes are impractical in vehicle-scale models, where a coarse mesh is more desirable for making quick assessments and comparisons of different joint geometries. Analytical models are often helpful in sizing, but difficulties arise in coupling these models with full-vehicle FE models. Therefore, a joint FE was created which can be used within structural FE models to make quick assessments of bonded composite joints. The shape functions of the joint FE were found by solving the governing equations for a structural model of a joint. By analytically determining the shape functions of the joint FE, the complex joint behavior can be captured with very few elements. This joint FE was modified and used to consider adhesives with functionally graded material properties to reduce the peel stress concentrations located near adherend discontinuities. Several practical concerns impede the actual use of such adhesives. These include increased manufacturing complications, alterations to the grading due to adhesive flow during manufacturing, and whether changing the loading conditions significantly impacts the effectiveness of the grading. An analytical study is conducted to address these three concerns. Furthermore, proof-of-concept testing is conducted to show the potential advantages of functionally graded adhesives. In this study, grading is achieved by strategically placing glass beads within the adhesive layer at different densities along the joint. Furthermore, the capability to model non-linear adhesive constitutive behavior with large rotations was developed, and progressive failure of the adhesive was modeled by re-meshing the joint as the adhesive fails. Results predicted using the joint FE were compared with experimental results for various

  3. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer

    2010-01-01

    A space suit's mobility is critical to an astronaut's ability to perform work efficiently. As mobility increases, the astronaut can perform tasks for longer durations with less fatigue. Mobility can be broken down into two parts: range of motion (ROM) and torque. These two measurements describe how the suit moves and how much force it takes to move. Two methods were chosen to define mobility requirements for the Constellation Space Suit Element (CSSE). One method focuses on range of motion and the second method centers on joint torque. A joint torque test was conducted to determine a baseline for current advanced space suit joint torques. This test utilized the following space suits: Extravehicular Mobility Unit (EMU), Advanced Crew Escape Suit (ACES), I-Suit, D-Suit, Enhanced Mobility (EM)-ACES, and Mark III (MK-III). Data were collected from 16 different joint movements of each suit. The results were then reviewed and CSSE joint torque requirement values were selected. The focus of this paper is to discuss trends observed during data analysis.

  4. Ankle joint function during walking in tophaceous gout: A biomechanical gait analysis study.

    PubMed

    Carroll, Matthew; Boocock, Mark; Dalbeth, Nicola; Stewart, Sarah; Frampton, Christopher; Rome, Keith

    2018-04-17

    The foot and ankle are frequently affected in tophaceous gout, yet kinematic and kinetic changes in this region during gait are unknown. The aim of the study was to evaluate ankle biomechanical characteristics in people with tophaceous gout using three-dimensional gait analysis. Twenty-four participants with tophaceous gout were compared with 24 age- and sex-matched control participants. A 9-camera motion analysis system and two floor-mounted force plates were used to calculate kinematic and kinetic parameters. Peak ankle joint angular velocity was significantly decreased in participants with gout (P < 0.01). No differences were found for ankle ROM in either the sagittal (P = 0.43) or frontal planes (P = 0.08). No differences were observed between groups for peak ankle joint power (P = 0.41), peak ankle joint force (P = 0.25), peak ankle joint moment (P = 0.16), timing of peak ankle joint force (P = 0.81), or timing of peak ankle joint moment (P = 0.16). Three-dimensional gait analysis demonstrated that overall ankle joint function does not change in people with gout. People with gout did, however, demonstrate a reduced peak ankle joint angular velocity, which may reflect gait-limiting factors and adaptations from the high levels of foot pain, impairment and disability experienced by this population. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. The Maximum Likelihood Solution for Inclination-only Data

    NASA Astrophysics Data System (ADS)

    Arason, P.; Levi, S.

    2006-12-01

    frequently for certain data sets, and relatively small perturbations in the data will drive the maxima to the boundary. We interpret this to indicate that, for such data sets, the information needed to separate the mean inclination and the precision parameter is permanently lost. To assess the reliability and accuracy of our method we generated a large number of random Fisher-distributed data sets and used seven methods to estimate the mean inclination and precision parameter. These comparisons are described by Levi and Arason at the 2006 AGU Fall meeting. The results of the various methods are very favourable to our new robust maximum likelihood method, which, on average, is the most reliable, and its mean inclination estimates are the least biased toward shallow values. Further information on our inclination-only analysis can be obtained from: http://www.vedur.is/~arason/paleomag
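
    A hedged sketch of the inclination-only Fisher likelihood used in this line of work: the marginal density of an observed colatitude theta, given mean colatitude alpha and precision kappa, involves a Bessel function I0. Synthetic colatitudes stand in for a real paleomagnetic data set, and the optimizer settings are illustrative.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import i0e

        def neg_log_lik(params, theta):
            alpha, kappa = params
            if kappa <= 0 or not 0 < alpha < np.pi:
                return np.inf
            z = kappa * np.sin(theta) * np.sin(alpha)
            ll = (np.log(kappa) - kappa - np.log1p(-np.exp(-2 * kappa))  # log(kappa / (2 sinh kappa))
                  + kappa * np.cos(theta) * np.cos(alpha)
                  + np.log(i0e(z)) + z                                   # log I0(z), computed stably
                  + np.log(np.sin(theta)))
            return -np.sum(ll)

        rng = np.random.default_rng(2)
        theta = np.radians(rng.normal(40.0, 8.0, 60).clip(1.0, 179.0))   # toy colatitudes (deg -> rad)
        fit = minimize(neg_log_lik, x0=[np.radians(45.0), 20.0], args=(theta,), method="Nelder-Mead")
        alpha_hat, kappa_hat = fit.x
        print(f"mean colatitude ~ {np.degrees(alpha_hat):.1f} deg, kappa ~ {kappa_hat:.1f}")

    The boundary behaviour discussed above corresponds to this likelihood being maximized at extreme alpha for some data sets, where the two parameters can no longer be separated.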

  6. Biomechanical analysis of the influence of friction in jaw joint disorders.

    PubMed

    Koolstra, J H

    2012-01-01

    Increased friction due to impaired lubrication in the jaw joint has been considered one of the possible causes of internal joint disorders. A very common internal disorder in the jaw joint is an anteriorly dislocated articular disc, which is generally considered to contribute to the onset of arthritic injuries. The influence of friction was addressed by analyzing its effects on the stresses and deformations of the cartilaginous structures in the jaw joint using computational biomechanical analysis. Jaw open-close movements were simulated while friction was applied in the articular contact of one or both compartments of the right joint. The left joint was treated as the healthy control. The simulations predicted that friction primarily causes increased shear stress in the articular cartilage layers, but hardly in the articular disc. This suggests that impaired lubrication may facilitate deterioration of the cartilage-subchondral bone unit of the articular surfaces. The results further suggest that increased friction is not a plausible cause for turning a normally functioning articular disc into an anteriorly dislocated one. Copyright © 2011 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  7. An Activation Likelihood Estimation Meta-Analysis Study of Simple Motor Movements in Older and Young Adults

    PubMed Central

    Turesky, Ted K.; Turkeltaub, Peter E.; Eden, Guinevere F.

    2016-01-01

    The functional neuroanatomy of finger movements has been characterized with neuroimaging in young adults. However, less is known about the aging motor system. Several studies have contrasted movement-related activity in older versus young adults, but there is inconsistency among their findings. To address this, we conducted an activation likelihood estimation (ALE) meta-analysis on within-group data from older adults and young adults performing regularly paced right-hand finger movement tasks in response to external stimuli. We hypothesized that older adults would show a greater likelihood of activation in right cortical motor areas (i.e., ipsilateral to the side of movement) compared to young adults. ALE maps were examined for conjunction and between-group differences. Older adults showed overlapping likelihoods of activation with young adults in left primary sensorimotor cortex (SM1), bilateral supplementary motor area, bilateral insula, left thalamus, and right anterior cerebellum. Their ALE map differed from that of the young adults in right SM1 (extending into dorsal premotor cortex), right supramarginal gyrus, medial premotor cortex, and right posterior cerebellum. The finding that older adults uniquely use ipsilateral regions for right-hand finger movements and show age-dependent modulations in regions recruited by both age groups provides a foundation by which to understand age-related motor decline and motor disorders. PMID:27799910
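
    A toy illustration of the ALE combination rule underlying such meta-analyses: per-experiment "modeled activation" maps (Gaussian-blurred foci) are merged voxelwise as a probabilistic union, ALE = 1 - prod(1 - MA_i). A 1-D grid and a crude amplitude scaling stand in for the real volumetric, sample-size-weighted kernels.

        import numpy as np

        x = np.linspace(0.0, 100.0, 501)                # voxel coordinates (arbitrary units)

        def ma_map(foci, fwhm=12.0):
            """Modeled-activation map for one experiment (toy scaling; real ALE normalizes kernels)."""
            sigma = fwhm / 2.3548
            gauss = [0.5 * np.exp(-0.5 * ((x - f) / sigma) ** 2) for f in foci]
            return 1.0 - np.prod([1.0 - g for g in gauss], axis=0)

        ma1 = ma_map([40.0, 43.0])                      # assumed foci, experiment 1
        ma2 = ma_map([41.0])                            # assumed foci, experiment 2
        ale = 1.0 - (1.0 - ma1) * (1.0 - ma2)           # voxelwise union across experiments
        print(f"peak ALE value {ale.max():.2f} at x = {x[np.argmax(ale)]:.0f}")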

  8. Influence of solder joint length to the mechanical aspect during the thermal stress analysis

    NASA Astrophysics Data System (ADS)

    Tan, J. S.; Khor, C. Y.; Rahim, Wan Mohd Faizal Wan Abd; Ishak, Muhammad Ikman; Rosli, M. U.; Jamalludin, Mohd Riduan; Zakaria, M. S.; Nawi, M. A. M.; Aziz, M. S. Abdul; Ani, F. Che

    2017-09-01

    Solder joints are important interconnects in the surface mount technology (SMT) assembly process. The real-time stress, strain and displacement of a solder joint are difficult to observe and assess experimentally. To tackle these problems, simulation was employed to study the von Mises stress, strain and displacement in a thermal stress analysis using finite-element-based software. In this study, a model of a leadless electronic package was considered. The thermal stress analysis was performed to investigate the effect of solder length on those mechanical aspects. The simulation results revealed that solder length has a significant effect on the maximum von Mises stress in the solder joint. Besides, changes in solder length also influence the displacement of the solder joint in the thermal environment. Increasing the solder length significantly reduces the von Mises stress and strain on the solder joint. Thus, an understanding of the physical parameters of the solder joint is important for engineers prior to designing the solder joint of an electronic component.

  9. Kinematics Simulation Analysis of Packaging Robot with Joint Clearance

    NASA Astrophysics Data System (ADS)

    Zhang, Y. W.; Meng, W. J.; Wang, L. Q.; Cui, G. H.

    2018-03-01

    Considering the influence of joint clearance on motion error, repeated positioning accuracy and the overall position of the machine, this paper presents a simulation analysis of a packaging robot, a 2-degree-of-freedom (DOF) planar parallel robot, based on the high-precision and high-speed characteristics of packaging equipment. The motion constraint equation of the mechanism is established, and the analysis and simulation of the motion error are carried out for the case of clearance in the revolute joints. The simulation results show that the size of the joint clearance affects the movement accuracy and packaging efficiency of the packaging robot. The analysis provides a reference for packaging equipment design and selection criteria and is of great significance to packaging industry automation.

  10. Modeling gene expression measurement error: a quasi-likelihood approach

    PubMed Central

    Strimmer, Korbinian

    2003-01-01

    Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also improved the power of
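
    A minimal sketch of the postulated quadratic variance structure such quasi-likelihood approaches exploit, fitted here by simple moments from synthetic replicates; the extended quasi-likelihood in the paper handles calibration and inference more carefully.

        import numpy as np

        # Postulate var(y) = a + b * mu^2 for gene intensities and estimate (a, b)
        # from per-gene replicate means and variances. Data are synthetic.
        rng = np.random.default_rng(3)
        mu_true = rng.uniform(50.0, 5000.0, 300)                       # per-gene intensities
        y = (mu_true[:, None] * (1.0 + 0.1 * rng.standard_normal((300, 4)))
             + rng.normal(0.0, 30.0, (300, 4)))                        # multiplicative + additive noise

        m = y.mean(axis=1)
        v = y.var(axis=1, ddof=1)
        A = np.column_stack([np.ones_like(m), m ** 2])
        (a_hat, b_hat), *_ = np.linalg.lstsq(A, v, rcond=None)         # fit v ~ a + b * mu^2
        print(f"additive variance a ~ {a_hat:.0f}, squared CV b ~ {b_hat:.3f}")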

  11. Maximum likelihood-based analysis of single-molecule photon arrival trajectories

    NASA Astrophysics Data System (ADS)

    Hajdziona, Marta; Molski, Andrzej

    2011-02-01

    In this work we explore the statistical properties of the maximum likelihood-based analysis of one-color photon arrival trajectories. This approach does not involve binning and, therefore, all of the information contained in an observed photon trajectory is used. We study the accuracy and precision of parameter estimates and the efficiency of the Akaike information criterion and the Bayesian information criterion (BIC) in selecting the true kinetic model. We focus on the low excitation regime where photon trajectories can be modeled as realizations of Markov modulated Poisson processes. The number of observed photons is the key parameter in determining model selection and parameter estimation. For example, the BIC can select the true three-state model from competing two-, three-, and four-state kinetic models even for relatively short trajectories made up of 2 × 10^3 photons. When the intensity levels are well-separated and 10^4 photons are observed, the two-state model parameters can be estimated with about 10% precision and those for a three-state model with about 20% precision.

  12. Maximum likelihood-based analysis of single-molecule photon arrival trajectories.

    PubMed

    Hajdziona, Marta; Molski, Andrzej

    2011-02-07

    In this work we explore the statistical properties of the maximum likelihood-based analysis of one-color photon arrival trajectories. This approach does not involve binning and, therefore, all of the information contained in an observed photon trajectory is used. We study the accuracy and precision of parameter estimates and the efficiency of the Akaike information criterion and the Bayesian information criterion (BIC) in selecting the true kinetic model. We focus on the low excitation regime where photon trajectories can be modeled as realizations of Markov modulated Poisson processes. The number of observed photons is the key parameter in determining model selection and parameter estimation. For example, the BIC can select the true three-state model from competing two-, three-, and four-state kinetic models even for relatively short trajectories made up of 2 × 10^3 photons. When the intensity levels are well-separated and 10^4 photons are observed, the two-state model parameters can be estimated with about 10% precision and those for a three-state model with about 20% precision.
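
    A toy version of the BIC model-selection step on photon inter-arrival times, comparing a one-rate model against a two-rate exponential mixture. This is a crude stand-in for fitting full Markov-modulated Poisson processes, and all rates and sample sizes are assumptions.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import expon

        rng = np.random.default_rng(4)
        t = np.concatenate([rng.exponential(1 / 50.0, 6000),   # bright state, ~50 photons/s
                            rng.exponential(1 / 5.0, 4000)])   # dim state, ~5 photons/s

        def nll_mix(p):
            w, k1, k2 = p
            if not (0 < w < 1) or k1 <= 0 or k2 <= 0:
                return np.inf
            f = w * k1 * np.exp(-k1 * t) + (1 - w) * k2 * np.exp(-k2 * t)
            return -np.sum(np.log(f))

        nll1 = -np.sum(expon.logpdf(t, scale=t.mean()))        # one-rate maximum likelihood fit
        fit2 = minimize(nll_mix, x0=[0.5, 40.0, 2.0], method="Nelder-Mead")
        n = t.size
        bic1 = 2 * nll1 + 1 * np.log(n)                        # BIC = 2 * nll + k * log(n)
        bic2 = 2 * fit2.fun + 3 * np.log(n)
        print(f"BIC one-rate {bic1:.0f} vs two-rate mixture {bic2:.0f} (lower is preferred)")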

  13. JOINT AND INDIVIDUAL VARIATION EXPLAINED (JIVE) FOR INTEGRATED ANALYSIS OF MULTIPLE DATA TYPES.

    PubMed

    Lock, Eric F; Hoadley, Katherine A; Marron, J S; Nobel, Andrew B

    2013-03-01

    Research in several fields now requires the analysis of datasets in which multiple high-dimensional types of data are available for a common set of objects. In particular, The Cancer Genome Atlas (TCGA) includes data from several diverse genomic technologies on the same cancerous tumor samples. In this paper we introduce Joint and Individual Variation Explained (JIVE), a general decomposition of variation for the integrated analysis of such datasets. The decomposition consists of three terms: a low-rank approximation capturing joint variation across data types, low-rank approximations for structured variation individual to each data type, and residual noise. JIVE quantifies the amount of joint variation between data types, reduces the dimensionality of the data, and provides new directions for the visual exploration of joint and individual structure. The proposed method represents an extension of Principal Component Analysis and has clear advantages over popular two-block methods such as Canonical Correlation Analysis and Partial Least Squares. A JIVE analysis of gene expression and miRNA data on Glioblastoma Multiforme tumor samples reveals gene-miRNA associations and provides better characterization of tumor types.
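
    A simplified sketch of a JIVE-style decomposition for two data blocks measured on the same samples: a low-rank joint part estimated from the concatenated blocks, then a low-rank individual part per block, plus residual noise. The real JIVE algorithm iterates these steps with orthogonality constraints and selects the ranks; the ranks below are assumed.

        import numpy as np

        def low_rank(M, r):
            """Best rank-r approximation of M via the SVD."""
            U, s, Vt = np.linalg.svd(M, full_matrices=False)
            return (U[:, :r] * s[:r]) @ Vt[:r]

        rng = np.random.default_rng(5)
        X1 = rng.standard_normal((100, 40))       # e.g. gene expression (samples x features)
        X2 = rng.standard_normal((100, 25))       # e.g. miRNA on the same samples

        joint = low_rank(np.hstack([X1, X2]), r=2)
        J1, J2 = joint[:, :40], joint[:, 40:]
        A1 = low_rank(X1 - J1, r=3)               # structured variation individual to block 1
        A2 = low_rank(X2 - J2, r=2)               # structured variation individual to block 2
        R1, R2 = X1 - J1 - A1, X2 - J2 - A2       # residual noise
        print(f"residual variance share, block 1: {R1.var() / X1.var():.2f}")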

  14. Preliminary Design and Analysis of an In-plane PRSEUS Joint

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Poplawski, Steven

    2013-01-01

    As part of the National Aeronautics and Space Administration's (NASA's) Environmentally Responsible Aviation (ERA) program, the Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) has been designed, developed and tested. However, PRSEUS development efforts to date have only addressed joints required to transfer bending moments between PRSEUS panels. Development of in-plane joints for the PRSEUS concept is necessary to facilitate in-plane transfer of load from PRSEUS panels to an adjacent structure, such as from a wing panel into a fuselage. This paper presents preliminary design and analysis of an in-plane PRSEUS joint for connecting PRSEUS panels at the termination of the rod-stiffened stringers. Design requirements are provided, the PRSEUS blade joint concept is presented, and preliminary design changes and analyses are carried out to examine the feasibility of the proposed in-plane PRSEUS blade joint. The study conducted herein focuses mainly on the PRSEUS structure on one side of the joint. In particular, the design requirements for the rod shear stress and bolt bearing stress are examined. A PRSEUS blade joint design was developed that demonstrates the feasibility of this in-plane PRSEUS joint concept to terminate the rod-stiffened stringers. The presented design only demonstrates feasibility, therefore, some areas of refinement are presented that would lead to a more optimum and realistic design.

  15. SRB Environment Evaluation and Analysis. Volume 2: RSRB Joint Filling Test/Analysis Improvements

    NASA Technical Reports Server (NTRS)

    Knox, E. C.; Woods, G. Hamilton

    1991-01-01

    Following the Challenger accident a very comprehensive solid rocket booster (SRB) redesign program was initiated. One objective of the program was to develop expertise at NASA/MSFC in the techniques for analyzing the flow of hot gases in the SRB joints. Several test programs were undertaken to provide a data base of joint performance with manufactured defects in the joints to allow hot gases to fill the joints. This data base was used also to develop the analytical techniques. Some of the test programs were Joint Environment Simulator (JES), Nozzle Joint Environment Simulator (NJES), Transient Pressure Test Article (TPTA), and Seventy-Pound Charge (SPC). In 1988 the TPTA test hardware was moved from the Utah site to MSFC and several RSRM tests were scheduled, to be followed by tests for the ASRM program. REMTECH Inc. supported these activities with pretest estimates of the flow conditions in the test joints, and post-test analysis and evaluation of the measurements. During this support REMTECH identified deficiencies in the gas-measurement instrumentation that existed in the TPTA hardware, made recommendations for its replacement, and identified improvements to the analytical tools used in the test support. Only one test was completed under the TPTA RSRM test program, and those scheduled for the ASRM were rescheduled to a time after the expiration of this contract. The attention of this effort was directed toward improvements in the analytical techniques in preparation for when the ASRM program begins.

  16. Glutamate receptor-channel gating. Maximum likelihood analysis of gigaohm seal recordings from locust muscle.

    PubMed Central

    Bates, S E; Sansom, M S; Ball, F G; Ramsey, R L; Usherwood, P N

    1990-01-01

    Gigaohm recordings have been made from glutamate receptor channels in excised, outside-out patches of collagenase-treated locust muscle membrane. The channels in the excised patches exhibit the kinetic state switching first seen in megaohm recordings from intact muscle fibers. Analysis of channel dwell time distributions reveals that the gating mechanism contains at least four open states and at least four closed states. Dwell time autocorrelation function analysis shows that there are at least three gateways linking the open states of the channel with the closed states. A maximum likelihood procedure has been used to fit six different gating models to the single channel data. Of these models, a cooperative model yields the best fit, and accurately predicts most features of the observed channel gating kinetics. PMID:1696510

  17. Computational Modelling and Movement Analysis of Hip Joint with Muscles

    NASA Astrophysics Data System (ADS)

    Siswanto, W. A.; Yoon, C. C.; Salleh, S. Md.; Ngali, M. Z.; Yusup, Eliza M.

    2017-01-01

    In this study, the hip joint and the main muscles are modelled by finite elements. The parts included in the model are the hip joint, hemipelvis, gluteus maximus, quadratus femoris and gemellus inferior. The materials used in this model are isotropic elastic, Mooney-Rivlin and neo-Hookean. The resultant hip forces of normal gait and stair climbing are applied to the hip joint model. The responses of displacement, stress and strain of the muscles are then recorded. The FEBio non-linear solver for biomechanics is employed to conduct the simulation of the hip joint model with muscles. The contact interfaces used in this model are sliding contact and tied contact. From the analysis results, the gluteus maximus has the maximum displacement, stress and strain in stair climbing. The quadratus femoris and gemellus inferior have the maximum displacement and strain in normal gait but the maximum stress in stair climbing. Besides that, the computational model of the hip joint with muscles is produced as a research and investigation platform. The model can be used as a visualization platform of the hip joint.

  18. Exact Calculation of the Joint Allele Frequency Spectrum for Isolation with Migration Models.

    PubMed

    Kern, Andrew D; Hey, Jody

    2017-09-01

    Population genomic datasets collected over the past decade have spurred interest in developing methods that can utilize massive numbers of loci for inference of demographic and selective histories of populations. The allele frequency spectrum (AFS) provides a convenient statistic for such analysis, and, accordingly, much attention has been paid to predicting theoretical expectations of the AFS under a number of different models. However, to date, exact solutions for the joint AFS of two or more populations under models of migration and divergence have not been found. Here, we present a novel Markov chain representation of the coalescent on the state space of the joint AFS that allows for rapid, exact calculation of the joint AFS under isolation with migration (IM) models. In turn, we show how our Markov chain method, in the context of composite likelihood estimation, can be used for accurate inference of parameters of the IM model using SNP data. Lastly, we apply our method to recent whole genome datasets from African Drosophila melanogaster. Copyright © 2017 Kern and Hey.
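
    A hedged sketch of the composite-likelihood step: treating SNPs as independent, the log composite likelihood of the data is the multinomial log-probability of the observed joint AFS under the model's expected AFS. The expected spectrum below is a made-up stand-in for the paper's exact Markov-chain computation under an IM model.

        import numpy as np

        def composite_log_lik(observed_jafs, expected_jafs):
            """Multinomial log-likelihood of an observed joint AFS given model expectations."""
            p = expected_jafs / expected_jafs.sum()   # normalized cell probabilities
            mask = observed_jafs > 0
            return np.sum(observed_jafs[mask] * np.log(p[mask]))

        # Toy 3x3 joint AFS for two populations: counts of SNPs per joint-frequency cell.
        obs = np.array([[0.0, 120.0, 30.0], [90.0, 40.0, 10.0], [25.0, 12.0, 0.0]])
        exp_model = np.array([[0.0, 0.30, 0.08], [0.28, 0.12, 0.03], [0.08, 0.04, 0.02]])
        print(f"log composite likelihood: {composite_log_lik(obs, exp_model):.1f}")

    Maximizing this quantity over the IM model parameters (divergence time, migration rates, population sizes) gives the composite-likelihood estimates; the paper supplies the exact expected AFS that this sketch assumes away.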

  19. Biomechanical study of tarsometatarsal joint fusion using finite element analysis.

    PubMed

    Wang, Yan; Li, Zengyong; Zhang, Ming

    2014-11-01

    Complications of foot and ankle surgeries cause patients severe suffering. A sufficient understanding of internal biomechanical information such as stress distribution, contact pressure, and deformation is critical to estimate the effectiveness of surgical treatments and avoid complications. The foot and ankle form an intricate and synergetic system, and localized intervention may alter the function of adjacent components. The aim of this study was to estimate the biomechanical effects of TMT joint fusion using comprehensive finite element (FE) analysis. A foot and ankle model consisting of 28 bones, 72 ligaments, and the plantar fascia, with soft tissues embracing all the segments, was developed. Kinematic information and ground reaction force during gait were obtained from motion analysis. Three gait instants, namely the first peak, second peak and mid-stance, were simulated in a normal foot and a foot with TMT joint fusion. It was found that contact pressure on the plantar foot increased by 0.42%, 19% and 37%, respectively, after TMT fusion compared with normal foot walking. The naviculocuneiform and fifth metatarsocuboid joints sustained 27% and 40% increases in contact pressure at the second peak, implying potential risk of joint problems such as arthritis. Von Mises stress in the second metatarsal bone increased by 22% at midstance, making it susceptible to stress fracture. This study provides biomechanical information for understanding the possible consequences of TMT joint fusion. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.

  20. Parameter estimation in astronomy through application of the likelihood ratio. [satellite data analysis techniques

    NASA Technical Reports Server (NTRS)

    Cash, W.

    1979-01-01

    Many problems in the experimental estimation of parameters for models can be solved through use of the likelihood ratio test. Applications of the likelihood ratio, with particular attention to photon counting experiments, are discussed. The procedures presented solve a greater range of problems than those currently in use, yet are no more difficult to apply. The procedures are proved analytically, and examples from current problems in astronomy are discussed.
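
    For photon-counting data this leads to the C statistic, C = 2 * sum(m_i - n_i * ln m_i) up to a model-independent constant, where n_i are observed counts and m_i the model predictions; differences in C behave like differences in chi-square. A minimal sketch with synthetic counts and an assumed one-parameter spectral model:

        import numpy as np

        rng = np.random.default_rng(6)
        channels = np.arange(20)
        shape = np.exp(-channels / 6.0)             # assumed spectral shape
        n = rng.poisson(30.0 * shape + 2.0)         # observed counts (true amplitude 30, background 2)

        def cash(amp):
            m = amp * shape + 2.0                   # model counts, background held fixed
            return 2.0 * np.sum(m - n * np.log(m))

        amps = np.linspace(5.0, 60.0, 551)
        cvals = np.array([cash(a) for a in amps])
        best = amps[np.argmin(cvals)]
        ci = amps[cvals <= cvals.min() + 1.0]       # delta C = 1 brackets a ~68% interval
        print(f"amplitude ~ {best:.1f}, 68% interval ~ [{ci.min():.1f}, {ci.max():.1f}]")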

  1. Combining evidence using likelihood ratios in writer verification

    NASA Astrophysics Data System (ADS)

    Srihari, Sargur; Kovalenko, Dimitry; Tang, Yi; Ball, Gregory

    2013-01-01

    Forensic identification is the task of determining whether or not observed evidence arose from a known source. It involves determining a likelihood ratio (LR), the ratio of the joint probability of the evidence and source under the identification hypothesis (that the evidence came from the source) and under the exclusion hypothesis (that the evidence did not arise from the source). In LR-based decision methods, particularly handwriting comparison, a variable number of pieces of input evidence is used. A decision based on many pieces of evidence can result in nearly the same LR as one based on few pieces of evidence. We consider methods for distinguishing between such situations. One of these is to provide confidence intervals together with the decisions and another is to combine the inputs using weights. We propose a new method that generalizes the Bayesian approach and uses an explicitly defined discount function. Empirical evaluation with several data sets, including synthetically generated ones and handwriting comparison, shows greater flexibility of the proposed method.
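
    As a baseline, independent pieces of evidence combine in an LR framework by multiplying the per-item ratios (adding their logs); the paper's observation is that this ignores how many items contributed, which motivates weighting or discounting. The discount function below is a hypothetical example, not the authors' definition:

        import numpy as np

        lrs = np.array([3.2, 1.8, 0.9, 2.5])        # per-comparison likelihood ratios, assumed
        log_lr = np.sum(np.log(lrs))                # naive combination of independent evidence
        w = 1.0 - np.exp(-len(lrs) / 10.0)          # toy confidence discount in [0, 1)
        print(f"combined log-LR: {log_lr:.2f}, discounted: {w * log_lr:.2f}")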

  2. How much to trust the senses: Likelihood learning

    PubMed Central

    Sato, Yoshiyuki; Kording, Konrad P.

    2014-01-01

    Our brain often needs to estimate unknown variables from imperfect information. Our knowledge about the statistical distributions of quantities in our environment (called priors) and currently available information from sensory inputs (called likelihood) are the basis of all Bayesian models of perception and action. While we know that priors are learned, most studies of prior-likelihood integration simply assume that subjects know about the likelihood. However, as the quality of sensory inputs changes over time, we also need to learn about new likelihoods. Here, we show that human subjects readily learn the distribution of visual cues (likelihood function) in a way that can be predicted by models of statistically optimal learning. Using a likelihood that depended on color context, we found that a learned likelihood generalized to new priors. Thus, we conclude that subjects learn about likelihood. PMID:25398975
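
    The textbook Gaussian case that underlies such experiments: the posterior mean weights the prior and the sensory likelihood by their inverse variances, so "learning the likelihood" amounts to updating the sensory noise term from experience. Numbers below are illustrative.

        # Gaussian prior-likelihood integration (all values assumed for illustration).
        mu_prior, sigma_prior = 0.0, 2.0   # learned prior over the stimulus
        x_obs, sigma_like = 3.0, 1.0       # sensory cue and its learned reliability

        w = sigma_like**-2 / (sigma_like**-2 + sigma_prior**-2)   # weight on the cue
        mu_post = w * x_obs + (1.0 - w) * mu_prior
        sigma_post = (sigma_like**-2 + sigma_prior**-2) ** -0.5
        print(f"posterior mean {mu_post:.2f}, sd {sigma_post:.2f}, cue weight {w:.2f}")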

  3. Joint source based analysis of multiple brain structures in studying major depressive disorder

    NASA Astrophysics Data System (ADS)

    Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang

    2014-03-01

    We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and those in the healthy control group. Moreover, in a cross-validation leave-one-out experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.

  4. Do depression and anxiety reduce the likelihood of remission in rheumatoid arthritis and psoriatic arthritis? Data from the prospective multicentre NOR-DMARD study.

    PubMed

    Michelsen, Brigitte; Kristianslund, Eirik Klami; Sexton, Joseph; Hammer, Hilde Berner; Fagerli, Karen Minde; Lie, Elisabeth; Wierød, Ada; Kalstad, Synøve; Rødevand, Erik; Krøll, Frode; Haugeberg, Glenn; Kvien, Tore K

    2017-11-01

    To investigate the predictive value of baseline depression/anxiety on the likelihood of achieving joint remission in rheumatoid arthritis (RA) and psoriatic arthritis (PsA), as well as the associations between baseline depression/anxiety and the components of the remission criteria at follow-up. We included 1326 patients with RA and 728 patients with PsA from the prospective observational NOR-DMARD study starting first-time tumour necrosis factor inhibitors or methotrexate. The predictive value of depression/anxiety on remission was explored in prespecified logistic regression models, and the associations between baseline depression/anxiety and the components of the remission criteria in prespecified multiple linear regression models. Baseline depression/anxiety according to EuroQoL-5D-3L, Short Form-36 (SF-36) Mental Health subscale ≤56 and SF-36 Mental Component Summary ≤38 negatively predicted 28-joint Disease Activity Score <2.6, Simplified Disease Activity Index ≤3.3, Clinical Disease Activity Index ≤2.8, ACR/EULAR Boolean and Disease Activity Index for Psoriatic Arthritis ≤4 remission after 3 and 6 months of treatment in RA (p≤0.008) and partly in PsA (p from 0.001 to 0.73). Baseline depression/anxiety was associated with increased patient's and evaluator's global assessment, tender joint count and joint pain in RA at follow-up, but not with swollen joint count and acute-phase reactants. Depression and anxiety may reduce the likelihood of joint remission based on composite scores in RA and PsA and should be taken into account in individual patients when making a shared decision on a treatment target. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  5. Methodological factors affecting joint moments estimation in clinical gait analysis: a systematic review.

    PubMed

    Camomilla, Valentina; Cereatti, Andrea; Cutti, Andrea Giovanni; Fantozzi, Silvia; Stagni, Rita; Vannozzi, Giuseppe

    2017-08-18

    Quantitative gait analysis can provide a description of joint kinematics and dynamics, and it is recognized as a clinically useful tool for functional assessment, diagnosis and intervention planning. Clinically interpretable parameters are estimated from quantitative measures (i.e. ground reaction forces, skin marker trajectories, etc.) through biomechanical modelling. In particular, the estimation of joint moments during motion is grounded on several modelling assumptions: (1) body segmental and joint kinematics is derived from the trajectories of markers and by modelling the human body as a kinematic chain; (2) joint resultant (net) loads are, usually, derived from force plate measurements through a model of segmental dynamics. Therefore, both measurement errors and modelling assumptions can affect the results, to an extent that also depends on the characteristics of the motor task analysed (i.e. gait speed). Errors affecting the trajectories of joint centres, the orientation of joint functional axes, the joint angular velocities, the accuracy of inertial parameters and force measurements (concurring to the definition of the dynamic model), can weigh differently in the estimation of clinically interpretable joint moments. Numerous studies addressed all these methodological aspects separately, but a critical analysis of how these aspects may affect the clinical interpretation of joint dynamics is still missing. This article aims at filling this gap through a systematic review of the literature, conducted on Web of Science, Scopus and PubMed. The final objective is hence to provide clear take-home messages to guide laboratories in the estimation of joint moments for the clinical practice.

  6. Factors Associated with Young Adults’ Pregnancy Likelihood

    PubMed Central

    Kitsantas, Panagiota; Lindley, Lisa L.; Wu, Huichuan

    2014-01-01

    OBJECTIVES While progress has been made to reduce adolescent pregnancies in the United States, rates of unplanned pregnancy among young adults (18–29 years) remain high. In this study, we assessed factors associated with perceived likelihood of pregnancy (likelihood of getting pregnant/getting partner pregnant in the next year) among sexually experienced young adults who were not trying to get pregnant and had ever used contraceptives. METHODS We conducted a secondary analysis of 660 young adults, 18–29 years old in the United States, from the cross-sectional National Survey of Reproductive and Contraceptive Knowledge. Logistic regression and classification tree analyses were conducted to generate profiles of young adults most likely to report anticipating a pregnancy in the next year. RESULTS Nearly one-third (32%) of young adults indicated they believed they had at least some likelihood of becoming pregnant in the next year. Young adults who believed that avoiding pregnancy was not very important were most likely to report pregnancy likelihood (odds ratio [OR], 5.21; 95% CI, 2.80–9.69), as were young adults for whom avoiding a pregnancy was important but not satisfied with their current contraceptive method (OR, 3.93; 95% CI, 1.67–9.24), attended religious services frequently (OR, 3.0; 95% CI, 1.52–5.94), were uninsured (OR, 2.63; 95% CI, 1.31–5.26), and were likely to have unprotected sex in the next three months (OR, 1.77; 95% CI, 1.04–3.01). DISCUSSION These results may help guide future research and the development of pregnancy prevention interventions targeting sexually experienced young adults. PMID:25782849

  7. Methods for the Joint Meta-Analysis of Multiple Tests

    ERIC Educational Resources Information Center

    Trikalinos, Thomas A.; Hoaglin, David C.; Small, Kevin M.; Terrin, Norma; Schmid, Christopher H.

    2014-01-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests'…

  8. Design/Analysis of Metal/Composite Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew E.

    2004-01-01

    A major design and analysis challenge for the JWST ISM structure is the metal/composite bonded joints that will be required to survive down to an operational ultra-low temperature of 30 K (-405 F). The initial and current baseline design for the plug-type joint consists of a titanium thin-walled fitting (1-3 mm thick) bonded to the interior surface of an M555/954-6 composite truss square tube with an axially stiff biased lay-up. Metallic fittings are required at various nodes of the truss structure to accommodate instrument and lift-point bolted interfaces. Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited, and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are virtually nonexistent. Increasing the challenge is the difficulty of testing for these required tools and parameters at 30 K. A preliminary finite element analysis shows that failure due to CTE mismatch between the biased composite and titanium or aluminum is likely. Failure is less likely with Invar; however, an initial mass estimate of Invar fittings demonstrates that Invar is not an automatic alternative. In order to gain confidence in analyzing and designing the ISM joints, a comprehensive joint development testing program has been planned and is currently running. The test program is designed for the correlation of the analysis methodology, including tuning finite element model parameters, and developing a composite failure criterion for the effect of multi-axial composite stresses on the strength of a bonded joint at 30 K. The testing program will also consider stress mitigation using compliant composite layers and potential strength degradation due to multiple thermal cycles. Not only will the finite element analysis be correlated to the test data, but the FEA will be used to guide the design of the test. The first phase of the test program has been completed and the

  9. A general methodology for maximum likelihood inference from band-recovery data

    USGS Publications Warehouse

    Conroy, M.J.; Williams, B.K.

    1984-01-01

    A numerical procedure is described for obtaining maximum likelihood estimates and associated maximum likelihood inference from band-recovery data. The method is used to illustrate previously developed one-age-class band-recovery models, and is extended to new models, including the analysis with a covariate for survival rates and variable-time-period recovery models. Extensions to R-age-class band-recovery, mark-recapture models, and twice-yearly marking are discussed. A FORTRAN program provides computations for these models.
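
    A minimal one-age-class band-recovery likelihood in the spirit described above, assuming a single release cohort with constant annual survival S and recovery rate f, so that a band released in year 0 is recovered in year j with probability f * S**(j-1). Counts are synthetic; real analyses use multiple cohorts and time-varying parameters.

        import numpy as np
        from scipy.optimize import minimize

        T = 5
        recoveries = np.array([120.0, 80.0, 55.0, 30.0, 18.0])  # recoveries per year after release
        released = 2000.0
        never = released - recoveries.sum()

        def neg_log_lik(p):
            S, f = p
            if not (0 < S < 1 and 0 < f < 1):
                return np.inf
            cell = f * S ** np.arange(T)            # P(band recovered in year j), j = 1..T
            p_never = 1.0 - cell.sum()
            if p_never <= 0:
                return np.inf
            return -(recoveries @ np.log(cell) + never * np.log(p_never))

        fit = minimize(neg_log_lik, x0=[0.6, 0.08], method="Nelder-Mead")
        S_hat, f_hat = fit.x
        print(f"annual survival ~ {S_hat:.2f}, recovery rate ~ {f_hat:.3f}")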

  10. Behaviour and Analysis of Mechanically Fastened Joints in Composite Structures

    DTIC Science & Technology

    1988-03-01


  11. A Joint Model for Longitudinal Measurements and Survival Data in the Presence of Multiple Failure Types

    PubMed Central

    Elashoff, Robert M.; Li, Gang; Li, Ning

    2009-01-01

    Summary In this article we study a joint model for longitudinal measurements and competing risks survival data. Our joint model provides a flexible approach to handle possible nonignorable missing data in the longitudinal measurements due to dropout. It is also an extension of previous joint models with a single failure type, offering a possible way to model informatively censored events as a competing risk. Our model consists of a linear mixed effects submodel for the longitudinal outcome and a proportional cause-specific hazards frailty submodel (Prentice et al., 1978, Biometrics 34, 541-554) for the competing risks survival data, linked together by some latent random effects. We propose to obtain the maximum likelihood estimates of the parameters by an expectation maximization (EM) algorithm and estimate their standard errors using a profile likelihood method. The developed method works well in our simulation studies and is applied to a clinical trial for the scleroderma lung disease. PMID:18162112

  12. Neural Mechanisms for Integrating Prior Knowledge and Likelihood in Value-Based Probabilistic Inference

    PubMed Central

    Ting, Chih-Chung; Yu, Chia-Chen; Maloney, Laurence T.

    2015-01-01

    In Bayesian decision theory, knowledge about the probabilities of possible outcomes is captured by a prior distribution and a likelihood function. The prior reflects past knowledge and the likelihood summarizes current sensory information. The two combined (integrated) form a posterior distribution that allows estimation of the probability of different possible outcomes. In this study, we investigated the neural mechanisms underlying Bayesian integration using a novel lottery decision task in which both prior knowledge and likelihood information about reward probability were systematically manipulated on a trial-by-trial basis. Consistent with Bayesian integration, as sample size increased, subjects tended to weigh likelihood information more compared with prior information. Using fMRI in humans, we found that the medial prefrontal cortex (mPFC) correlated with the mean of the posterior distribution, a statistic that reflects the integration of prior knowledge and likelihood of reward probability. Subsequent analysis revealed that both prior and likelihood information were represented in mPFC and that the neural representations of prior and likelihood in mPFC reflected changes in the behaviorally estimated weights assigned to these different sources of information in response to changes in the environment. Together, these results establish the role of mPFC in prior-likelihood integration and highlight its involvement in representing and integrating these distinct sources of information. PMID:25632152

  13. Maximum Likelihood Analysis of Nonlinear Structural Equation Models with Dichotomous Variables

    ERIC Educational Resources Information Center

    Song, Xin-Yuan; Lee, Sik-Yum

    2005-01-01

    In this article, a maximum likelihood approach is developed to analyze structural equation models with dichotomous variables that are common in behavioral, psychological and social research. To assess nonlinear causal effects among the latent variables, the structural equation in the model is defined by a nonlinear function. The basic idea of the…

  14. Localized cervical facet joint kinematics under physiological and whiplash loading.

    PubMed

    Stemper, Brian D; Yoganandan, Narayan; Gennarelli, Thomas A; Pintar, Frank A

    2005-12-01

    Although facet joints have been implicated in the whiplash injury mechanism, no investigators have determined the degree to which joint motions in whiplash are nonphysiological. The purpose of this investigation was to quantify the correlation between facet joint and segmental motions under physiological and whiplash loading. Human cadaveric cervical spine specimens were tested under physiological extension loading, and intact human head-neck complexes were tested under whiplash loading, to correlate the localized component motions of the C4-5 facet joint with segmental extension. Facet joint shear and distraction kinematics demonstrated a linear correlation with segmental extension under both loading modes. Facet joints responded differently to whiplash and physiological loading, with significantly increased kinematics for the same segmental angulation. The limitations of this study include removal of superficial musculature and the limited sample size for physiological testing. The presence of increased facet joint motions indicated that synovial joint soft-tissue components (that is, the synovial membrane and capsular ligament) sustain increased distortion that may subject these tissues to a greater likelihood of injury. This finding is supported by clinical investigations in which lower cervical facet joint injury resulted in pain patterns similar to the most commonly reported whiplash symptoms.

  15. Maximum-likelihood methods in wavefront sensing: stochastic models and likelihood functions

    PubMed Central

    Barrett, Harrison H.; Dainty, Christopher; Lara, David

    2008-01-01

    Maximum-likelihood (ML) estimation in wavefront sensing requires careful attention to all noise sources and all factors that influence the sensor data. We present detailed probability density functions for the output of the image detector in a wavefront sensor, conditional not only on wavefront parameters but also on various nuisance parameters. Practical ways of dealing with nuisance parameters are described, and final expressions for likelihoods and Fisher information matrices are derived. The theory is illustrated by discussing Shack–Hartmann sensors, and computational requirements are discussed. Simulation results show that ML estimation can significantly increase the dynamic range of a Shack–Hartmann sensor with four detectors and that it can reduce the residual wavefront error when compared with traditional methods. PMID:17206255

  16. Markov Chain Monte Carlo Joint Analysis of Chandra X-Ray Imaging Spectroscopy and Sunyaev-Zel'dovich Effect Data

    NASA Technical Reports Server (NTRS)

    Bonamente, Massimillano; Joy, Marshall K.; Carlstrom, John E.; Reese, Erik D.; LaRoque, Samuel J.

    2004-01-01

    X-ray and Sunyaev-Zel'dovich effect data can be combined to determine the distance to galaxy clusters. High-resolution X-ray data are now available from Chandra, which provides both spatial and spectral information, and Sunyaev-Zel'dovich effect data were obtained from the BIMA and Owens Valley Radio Observatory (OVRO) arrays. We introduce a Markov Chain Monte Carlo procedure for the joint analysis of X-ray and Sunyaev-Zel'dovich effect data. The advantages of this method are the high computational efficiency and the ability to measure simultaneously the probability distribution of all parameters of interest, such as the spatial and spectral properties of the cluster gas, and also of derived quantities such as the distance to the cluster. We demonstrate this technique by applying it to the Chandra X-ray data and the OVRO radio data for the galaxy cluster A611. Comparisons with traditional likelihood ratio methods reveal the robustness of the method. This method will be used in a follow-up paper to determine the distances to a large sample of galaxy clusters.
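
    A generic random-walk Metropolis sketch of the kind of Markov Chain Monte Carlo procedure described above; the toy two-parameter log-posterior below is a hypothetical stand-in for the joint X-ray plus Sunyaev-Zel'dovich likelihood over cluster gas parameters.

        import numpy as np

        rng = np.random.default_rng(7)

        def log_post(theta):
            # hypothetical stand-in: independent Gaussian constraints on two parameters
            return -0.5 * (((theta[0] - 1.0) / 0.2) ** 2 + ((theta[1] - 5.0) / 1.0) ** 2)

        theta = np.array([0.5, 3.0])
        lp = log_post(theta)
        chain = []
        for _ in range(20000):
            prop = theta + rng.normal(0.0, [0.05, 0.3])        # random-walk proposal
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:           # Metropolis accept/reject
                theta, lp = prop, lp_prop
            chain.append(theta)
        chain = np.array(chain)[5000:]                         # discard burn-in
        print("posterior means:", chain.mean(axis=0).round(2), "sds:", chain.std(axis=0).round(2))

    Derived quantities such as the cluster distance are then just functions of the chain samples, which is the efficiency argument the abstract makes.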

  17. Pain anticipation: an activation likelihood estimation meta-analysis of brain imaging studies.

    PubMed

    Palermo, Sara; Benedetti, Fabrizio; Costa, Tommaso; Amanzio, Martina

    2015-05-01

    The anticipation of pain has been investigated in a variety of brain imaging studies. Importantly, there is still no clear overall picture of the areas that are involved across different studies, and the exact role of these regions in pain expectation remains largely unexplored. To address this issue, we used activation likelihood estimation meta-analysis to analyze pain anticipation across several neuroimaging studies. A total of 19 functional magnetic resonance imaging studies were included in the analysis to search for the cortical areas involved in pain anticipation in human experimental models. During anticipation, activated foci were found in the dorsolateral prefrontal, midcingulate and anterior insula cortices, medial and inferior frontal gyri, inferior parietal lobule, middle and superior temporal gyrus, thalamus, and caudate. Deactivated foci were found in the anterior cingulate, superior frontal gyrus, parahippocampal gyrus and in the claustrum. The results of the meta-analytic connectivity analysis provide an overall view of the brain responses triggered by the anticipation of a noxious stimulus. Such a highly distributed perceptual set of self-regulation may prime brain regions to process information where emotion, action and perception, as well as their related subcategories, play a central role. Not only do these findings provide important information on the neural events when anticipating pain, but they may also give a perspective into nocebo responses, whereby negative expectations may lead to pain worsening. © 2014 Wiley Periodicals, Inc.

  18. Analysis of Contraction Joint Width Influence on Load Stress of Pavement Panels

    NASA Astrophysics Data System (ADS)

    Gao, Wei; Cui, Wei; Sun, Wei

    2018-05-01

    The width of the transverse contraction joint of a cement concrete road varies with temperature, which changes load transmission among the plates of the road surface and affects the load stress of the road plates. The three-dimensional finite-element analysis software EverFE is used to address the relation between contraction joint width and road surface load stress, revealing the impact of reducing the contraction joint width. The results could be of critical value in maintaining road functions and extending the service life of cement road surfaces.

  19. Analysis of the Constraint Joint Loading in the Thumb During Pipetting.

    PubMed

    Wu, John Z; Sinsel, Erik W; Zhao, Kristin D; An, Kai-Nan; Buczek, Frank L

    2015-08-01

    Dynamic loading on articular joints is essential for evaluating the risk of articular degeneration associated with occupational activities. In the current study, we analyzed the dynamic constraint loading of the thumb during pipetting. The constraint loading is the loading that must be carried by the connective tissues of the joints (i.e., the cartilage layer and the ligaments) to maintain the kinematic constraints of the system. The joint loadings are solved using a classic free-body approach from the external loading and muscle forces, which were obtained by an inverse-dynamics approach combined with an optimization procedure in the AnyBody modeling software. The constraint forces in the thumb joints obtained in the current study are compared with those obtained in the pinch and grasp tests of a previous study (Cooney and Chao, 1977, "Biomechanical Analysis of Static Forces in the Thumb During Hand Function," J. Bone Joint Surg. Am., 59(1), pp. 27-36). The maximal compression force during pipetting is approximately 83% and 60% greater than those obtained in the tip pinch and key pinch, respectively, while substantially smaller than that obtained during grasping. The maximal lateral shear force is approximately six times, 32 times, and 90% greater than those obtained in the tip pinch, key pinch, and grasp, respectively. The maximal dorsal shear force during pipetting is approximately 3.2 and 1.4 times greater than those obtained in the tip pinch and key pinch, respectively, while substantially smaller than that obtained during grasping. Our analysis indicates that the thumb joints are subjected to repetitive, intensive loading during pipetting compared with other daily activities.

  20. Design of simplified maximum-likelihood receivers for multiuser CPM systems.

    PubMed

    Bing, Li; Bai, Baoming

    2014-01-01

    A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases) reduced complexity and marginal performance degradation.

  1. Stress analysis method for clearance-fit joints with bearing-bypass loads

    NASA Technical Reports Server (NTRS)

    Naik, R. A.; Crews, J. H., Jr.

    1989-01-01

    Within a multi-fastener joint, fastener holes may be subjected to the combined effects of bearing loads and loads that bypass the hole to be reacted elsewhere in the joint. The analysis of a joint subjected to such combined bearing and bypass loads is complicated by the usual clearance between the hole and the fastener. A simple analysis method for such clearance-fit joints subjected to bearing-bypass loading has been developed in the present study. It uses an inverse formulation with a linear elastic finite-element analysis. Conditions along the bolt-hole contact arc are specified by displacement constraint equations. The present method is simple to apply and can be implemented with most general-purpose finite-element programs since it does not use complicated iterative-incremental procedures. The method was used to study the effects of bearing-bypass loading on bolt-hole contact angles and local stresses. In this study, a rigid, frictionless bolt was used with a plate having the properties of a quasi-isotropic graphite/epoxy laminate. Results showed that the contact angle, as well as the peak stresses around the hole and their locations, were strongly influenced by the ratio of bearing and bypass loads. For single contact, tension and compression bearing-bypass loading had opposite effects on the contact angle. For some compressive bearing-bypass loads, the hole tended to close on the fastener, leading to dual contact. It was shown that dual contact reduces the stress concentration at the fastener and would, therefore, increase joint strength in compression. The results illustrate the general importance of accounting for bolt-hole clearance and contact to accurately compute local bolt-hole stresses under combined bearing and bypass loading.

  2. Maximum Likelihood Estimation with Emphasis on Aircraft Flight Data

    NASA Technical Reports Server (NTRS)

    Iliff, K. W.; Maine, R. E.

    1985-01-01

    Accurate modeling of flexible space structures is an important field that is currently under investigation. Parameter estimation, using methods such as maximum likelihood, is one of the ways that the model can be improved. The maximum likelihood estimator has been used to extract stability and control derivatives from flight data for many years. Most of the literature on aircraft estimation concentrates on new developments and applications, assuming familiarity with basic estimation concepts. Some of these basic concepts are presented. The maximum likelihood estimator and the aircraft equations of motion that the estimator uses are briefly discussed. The basic concepts of minimization and estimation are examined for a simple computed aircraft example. The cost functions that are to be minimized during estimation are defined and discussed. Graphic representations of the cost functions are given to help illustrate the minimization process. Finally, the basic concepts are generalized, and estimation from flight data is discussed. Specific examples of estimation of structural dynamics are included. Some of the major conclusions for the computed example are also developed for the analysis of flight data.
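
    To make the cost-function idea concrete, here is a hedged sketch of output-error estimation for a toy system: for Gaussian measurement noise, maximizing the likelihood reduces to minimizing a quadratic cost in the residuals between measured and simulated responses. The first-order model, input, and noise level are invented and far simpler than real aircraft equations of motion.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(2)
      dt, n = 0.05, 200
      u = np.sin(0.5 * np.arange(n) * dt)     # input signal (e.g., a control doublet)

      def simulate(a, b):
          # Euler integration of the toy model x' = a*x + b*u.
          x = np.zeros(n)
          for k in range(n - 1):
              x[k + 1] = x[k] + dt * (a * x[k] + b * u[k])
          return x

      z = simulate(-0.8, 2.0) + 0.02 * rng.standard_normal(n)   # noisy measurements

      def cost(theta):
          # Maximum likelihood with Gaussian errors: sum of squared residuals.
          return np.sum((z - simulate(*theta)) ** 2)

      est = minimize(cost, x0=[-0.5, 1.0], method="Nelder-Mead")
      print(est.x)   # recovers roughly (-0.8, 2.0)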

  3. Planck intermediate results. XVI. Profile likelihoods for cosmological parameters

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bonaldi, A.; Bond, J. R.; Bouchet, F. R.; Burigana, C.; Cardoso, J.-F.; Catalano, A.; Chamballu, A.; Chiang, H. C.; Christensen, P. R.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Couchot, F.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dupac, X.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giraud-Héraud, Y.; González-Nuevo, J.; Górski, K. M.; Gregorio, A.; Gruppuso, A.; Hansen, F. K.; Harrison, D. L.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lawrence, C. R.; Leonardi, R.; Liddle, A.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Mazzotta, P.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Pagano, L.; Pajot, F.; Paoletti, D.; Pasian, F.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski∗, S.; Pointecouteau, E.; Polenta, G.; Popa, L.; Pratt, G. W.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rouillé d'Orfeuil, B.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Savelainen, M.; Savini, G.; Spencer, L. D.; Spinelli, M.; Starck, J.-L.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; White, M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-06-01

    We explore the 2013 Planck likelihood function with a high-precision multi-dimensional minimizer (Minuit). This allows a refinement of the ΛCDM best-fit solution with respect to previously-released results, and the construction of frequentist confidence intervals using profile likelihoods. The agreement with the cosmological results from the Bayesian framework is excellent, demonstrating the robustness of the Planck results to the statistical methodology. We investigate the inclusion of neutrino masses, where more significant differences may appear due to the non-Gaussian nature of the posterior mass distribution. By applying the Feldman-Cousins prescription, we again obtain results very similar to those of the Bayesian methodology. However, the profile-likelihood analysis of the cosmic microwave background (CMB) combination (Planck+WP+highL) reveals a minimum well within the unphysical negative-mass region. We show that inclusion of the Planck CMB-lensing information regularizes this issue, and provide a robust frequentist upper limit ∑ mν ≤ 0.26 eV (95% confidence) from the CMB+lensing+BAO data combination.

  4. Likelihood-based confidence intervals for estimating floods with given return periods

    NASA Astrophysics Data System (ADS)

    Martins, Eduardo Sávio P. R.; Clarke, Robin T.

    1993-06-01

    This paper discusses aspects of the calculation of likelihood-based confidence intervals for T-year floods, with particular reference to (1) the two-parameter gamma distribution; (2) the Gumbel distribution; (3) the two-parameter log-normal distribution, and other distributions related to the normal by Box-Cox transformations. Calculation of the confidence limits is straightforward using the Nelder-Mead algorithm with a constraint incorporated, although care is necessary to ensure convergence either of the Nelder-Mead algorithm, or of the Newton-Raphson calculation of maximum-likelihood estimates. Methods are illustrated using records from 18 gauging stations in the basin of the River Itajai-Acu, State of Santa Catarina, southern Brazil. A small and restricted simulation compared likelihood-based confidence limits with those given by use of the central limit theorem; for the same confidence probability, the confidence limits of the simulation were wider than those of the central limit theorem, which failed more frequently to contain the true quantile being estimated. The paper discusses possible applications of likelihood-based confidence intervals in other areas of hydrological analysis.
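
    As an illustration of the mechanics, the sketch below computes a profile-likelihood interval for the Gumbel 100-year flood on synthetic annual maxima: the T-year quantile is made an explicit parameter, the scale parameter is maximized out at each candidate quantile, and the interval is cut at half the chi-square(1) critical value. The data, bounds, and grid are invented; the paper also treats the gamma and Box-Cox/log-normal families.

      import numpy as np
      from scipy.optimize import minimize_scalar

      rng = np.random.default_rng(1)
      data = rng.gumbel(loc=100.0, scale=25.0, size=50)   # synthetic annual maxima
      T = 100                                             # return period (years)
      yT = -np.log(-np.log(1.0 - 1.0 / T))                # Gumbel reduced variate

      def neg_loglik(mu, beta):
          if beta <= 0:
              return np.inf
          z = (data - mu) / beta
          return np.sum(np.log(beta) + z + np.exp(-z))

      def profile(qT):
          # Maximize over beta with the T-year quantile fixed: qT = mu + beta*yT.
          res = minimize_scalar(lambda b: neg_loglik(qT - b * yT, b),
                                bounds=(1e-3, 200.0), method="bounded")
          return -res.fun

      q_grid = np.linspace(120.0, 350.0, 300)
      prof = np.array([profile(q) for q in q_grid])
      cutoff = prof.max() - 0.5 * 3.841        # chi-square(1), 95% level
      inside = q_grid[prof >= cutoff]
      print("95% CI for the 100-year flood: (%.1f, %.1f)" % (inside[0], inside[-1]))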

  5. Analysis of an all-digital maximum likelihood carrier phase and clock timing synchronizer for eight phase-shift keying modulation

    NASA Astrophysics Data System (ADS)

    Degaudenzi, Riccardo; Vanghi, Vieri

    1994-02-01

    An all-digital Trellis-Coded 8PSK (TC-8PSK) demodulator well suited for VLSI implementation, including maximum-likelihood estimation decision-directed (MLE-DD) carrier phase and clock timing recovery, is introduced and analyzed. By simply removing the trellis decoder, the demodulator can efficiently cope with uncoded 8PSK signals. The proposed MLE-DD synchronization algorithm requires one sample per symbol for the phase loop and two samples per symbol for the timing loop. The joint phase and timing discriminator characteristics are analytically derived, and the numerical results are checked by means of computer simulations. An approximate expression for the steady-state carrier phase and clock timing mean square error has been derived and successfully checked against simulation findings. The synchronizer's deviation from the Cramer-Rao bound is also discussed. The mean acquisition time of the digital synchronizer has also been computed and checked using the Monte Carlo simulation technique. Finally, TC-8PSK digital demodulator performance in terms of bit error rate and mean time to lose lock, including digital interpolators and synchronization loops, is presented.

  6. Adhesive Characterization and Progressive Damage Analysis of Bonded Composite Joints

    NASA Technical Reports Server (NTRS)

    Girolamo, Donato; Davila, Carlos G.; Leone, Frank A.; Lin, Shih-Yung

    2014-01-01

    The results of an experimental/numerical campaign aimed at developing progressive damage analysis (PDA) tools for predicting the strength of a composite bonded joint under tensile loads are presented. The PDA is based on continuum damage mechanics (CDM) to account for intralaminar damage, and cohesive laws to account for interlaminar and adhesive damage. The adhesive response is characterized using standard fracture specimens and digital image correlation (DIC). The displacement fields measured by DIC are used to calculate the J-integrals, from which the associated cohesive laws of the structural adhesive can be derived. A finite element model of a sandwich conventional splice joint (CSJ) under tensile loads was developed. The simulations indicate that the model is capable of predicting the interactions of damage modes that lead to the failure of the joint.

  7. Approximate likelihood calculation on a phylogeny for Bayesian estimation of divergence times.

    PubMed

    dos Reis, Mario; Yang, Ziheng

    2011-07-01

    The molecular clock provides a powerful way to estimate species divergence times. If information on some species divergence times is available from the fossil or geological record, it can be used to calibrate a phylogeny and estimate divergence times for all nodes in the tree. The Bayesian method provides a natural framework to incorporate different sources of information concerning divergence times, such as information in the fossil and molecular data. Current models of sequence evolution are intractable in a Bayesian setting, and Markov chain Monte Carlo (MCMC) is used to generate the posterior distribution of divergence times and evolutionary rates. This method is computationally expensive, as it involves the repeated calculation of the likelihood function. Here, we explore the use of Taylor expansion to approximate the likelihood during MCMC iteration. The approximation is much faster than conventional likelihood calculation. However, the approximation is expected to be poor when the proposed parameters are far from the likelihood peak. We explore the use of parameter transforms (square root, logarithm, and arcsine) to improve the approximation to the likelihood curve. We found that the new methods, particularly the arcsine-based transform, provided very good approximations under relaxed clock models and also under the global clock model when the global clock is not seriously violated. The approximation is poorer for analysis under the global clock when the global clock is seriously wrong and should thus not be used. The results suggest that the approximate method may be useful for Bayesian dating analysis using large data sets.
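
    A toy example makes the transform idea concrete. The sketch below approximates a Bernoulli log-likelihood by a second-order Taylor expansion around the MLE, once in the raw parameter and once after an arcsine square-root transform; the transform typically keeps the quadratic approximation accurate further from the peak. The example is ours, not the authors' phylogenetic likelihood.

      import numpy as np

      k, n = 30, 100                       # successes / trials; MLE of p is 0.3
      def logl(p):
          return k * np.log(p) + (n - k) * np.log(1.0 - p)

      p_hat = k / n
      h = 1e-4                             # finite-difference step for the curvature

      # Quadratic (Taylor) approximation in the raw parameter p.
      d2 = (logl(p_hat + h) - 2 * logl(p_hat) + logl(p_hat - h)) / h**2
      def taylor_raw(p):
          return logl(p_hat) + 0.5 * d2 * (p - p_hat) ** 2

      # Same idea after the arcsine transform t = arcsin(sqrt(p)).
      t_hat = np.arcsin(np.sqrt(p_hat))
      d2t = (logl(np.sin(t_hat + h)**2) - 2 * logl(p_hat)
             + logl(np.sin(t_hat - h)**2)) / h**2
      def taylor_arcsine(p):
          t = np.arcsin(np.sqrt(p))
          return logl(p_hat) + 0.5 * d2t * (t - t_hat) ** 2

      for p in (0.20, 0.30, 0.45):         # exact vs. the two approximations
          print(p, round(logl(p), 3), round(taylor_raw(p), 3),
                round(taylor_arcsine(p), 3))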

  8. Likelihood Ratios for the Emergency Physician.

    PubMed

    Peng, Paul; Coyle, Andrew

    2018-04-26

    The concept of likelihood ratios was introduced more than 40 years ago, yet this powerful metric has still not seen wider application or discussion in the medical decision-making process. There is concern that clinicians-in-training are still being taught an over-simplified approach to diagnostic test performance, and have limited exposure to likelihood ratios. Even those familiar with likelihood ratios might perceive them as mathematically cumbersome in application, if not difficult to determine for a particular disease process. This article is protected by copyright. All rights reserved.
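
    For reference, the calculation the article advocates is short enough to write out: convert the pre-test probability to odds, multiply by the likelihood ratio, and convert back. The LR of 10 and the 20% pre-test probability below are hypothetical values chosen for illustration.

      def post_test_probability(pre_test_prob, likelihood_ratio):
          pre_odds = pre_test_prob / (1.0 - pre_test_prob)
          post_odds = pre_odds * likelihood_ratio    # Bayes' theorem in odds form
          return post_odds / (1.0 + post_odds)

      # A positive test with LR+ = 10 moves a 20% pre-test probability to ~71%.
      print(round(post_test_probability(0.20, 10.0), 2))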

  9. Prosthetic Joint Infections and Cost Analysis?

    PubMed

    Haddad, F S; Ngu, A; Negus, J J

    2017-01-01

    Prosthetic joint infection is a devastating complication of arthroplasty surgery that can lead to debilitating morbidity for the patient and significant expense for the healthcare system. With the continual rise of arthroplasty cases worldwide every year, the revision load for infection is becoming a greater financial burden on healthcare budgets. Prevention of infection has to be the key to reducing this burden. For treatment, it is critical to collect quality data that can guide future management strategies to minimise healthcare costs and morbidity/mortality for patients. There has been a management shift in many countries to a less expensive 1-stage strategy and, in selected cases, to the use of debridement, antibiotics and implant retention. These appear very attractive options on many levels, not least cost. However, with a consensus definition of joint infection only clarified in 2011, there is still a need for high-quality cost analysis data on how the use of these different methods could impact the healthcare expenditure of countries around the world. With a projected spend on revision for infection of US$1.62 billion in the US alone, these data are vital and urgently needed.

  10. Radiologic Analysis and Clinical Study of the Upper One-third Joint Technique for Fluoroscopically Guided Sacroiliac Joint Injection.

    PubMed

    Park, Junghyun; Park, Hue Jung; Moon, Dong Eon; Sa, Gye Jeol; Kim, Young Hoon

    2015-01-01

    Sacroiliac intraarticular injection by the traditional technique can be challenging to perform when the joint is covered with osteophytes or is extremely narrow. To examine whether there is enough space for the needle to be advanced from the L5-S1 interspinous space to the upper one-third sacroiliac joint (SIJ) by magnetic resonance imaging (MRI) analysis as an alternative to fluoroscopically guided SIJ injection with the lower one-third joint technique, and to determine the feasibility of this novel technique in clinical practice. MRI analysis and observational study. An interventional pain management practice at a university hospital. We analyzed 200 axial T2-weighted MRIs between the L5 and S1 vertebrae of 100 consecutive patients. The following measurements were obtained on both sides: 1) the thickness of fat in the midline; 2) the distance between the midline (Point C) and the junction (Point A) of the skin and the imaginary line that connects the SIJ and the most medial cortex of the ilium; 3) the distance between the midline (Point C) and the junction (Point B) of the skin and the imaginary line that connects the SIJ and the L5 spinous process; 4) the distance between the SIJ and midline (Point C) on the skin, or between the SIJ and the midpoint (Point C') of the line from Point A to Point B; and 5) the angle between the sagittal line and the imaginary line that connects the SIJ and the midline on the skin. The upper one-third joint technique was performed to establish the feasibility of the alternative technique in 20 patients who had unsuccessful sacroiliac intraarticular injections using the lower one-third joint technique. The mean distances from the midline to Point A and to Point B were 21.9 ± 13.7 mm and 27.8 ± 13.6 mm, respectively. The mean distance between the SIJ and Point C (or Point C') was 81.0 ± 13.3 mm. The angle between the sagittal line and the imaginary line that connects the SIJ and the midline on the skin was 42.8 ± 5.1°. The success

  11. Analysis of Knee Joint Line Obliquity after High Tibial Osteotomy.

    PubMed

    Oh, Kwang-Jun; Ko, Young Bong; Bae, Ji Hoon; Yoon, Suk Tae; Kim, Jae Gyoon

    2016-11-01

    The aim of this study was to evaluate which lower extremity alignment (knee and ankle joint) parameters affect knee joint line obliquity (KJLO) in the coronal plane after open wedge high tibial osteotomy (OWHTO). Overall, 69 knees of patients who underwent OWHTO were evaluated using radiographs obtained preoperatively and from 6 weeks to 3 months postoperatively. We measured multiple parameters of knee and ankle joint alignment (hip-knee-ankle angle [HKA], joint line height [JLH], posterior tibial slope [PS], femoral condyle-tibial plateau angle [FCTP], medial proximal tibial angle [MPTA], mechanical lateral distal femoral angle [mLDFA], KJLO, talar tilt angle [TTA], ankle joint obliquity [AJO], and the lateral distal tibial ground surface angle [LDTGA]; preoperative [-pre], postoperative [-post], and the difference between -pre and -post values [-Δ]). We categorized patients into two groups according to the KJLO-post value (the normal group [within ± 4 degrees, 56 knees] and the abnormal group [greater than ± 4 degrees, 13 knees]), and compared their -pre parameters. Multiple logistic regression analysis was used to examine the contribution of the -pre parameters to abnormal KJLO-post. The mean HKA-Δ (-9.4 ± 4.7 degrees) was larger than the mean KJLO-Δ (-2.1 ± 3.2 degrees). The knee joint alignment parameters (HKA-pre, FCTP-pre) differed significantly between the two groups (p < 0.05). In addition, the HKA-pre (odds ratio [OR] = 1.27, p = 0.006) and FCTP-pre (OR = 2.13, p = 0.006) were significant predictors of abnormal KJLO-post. However, -pre ankle joint parameters (TTA, AJO, and LDTGA) did not differ significantly between the two groups and were not significantly associated with abnormal KJLO-post. The -pre knee joint alignment and knee joint convergence angle, evaluated by the HKA-pre and FCTP-pre angles, respectively, were significant predictors of abnormal KJLO after OWHTO. However, -pre ankle joint

  12. An EM Algorithm for Maximum Likelihood Estimation of Process Factor Analysis Models

    ERIC Educational Resources Information Center

    Lee, Taehun

    2010-01-01

    In this dissertation, an Expectation-Maximization (EM) algorithm is developed and implemented to obtain maximum likelihood estimates of the parameters and the associated standard error estimates characterizing temporal flows for the latent variable time series following stationary vector ARMA processes, as well as the parameters defining the…

  13. Improvement and comparison of likelihood functions for model calibration and parameter uncertainty analysis within a Markov chain Monte Carlo scheme

    NASA Astrophysics Data System (ADS)

    Cheng, Qin-Bo; Chen, Xi; Xu, Chong-Yu; Reinhardt-Imjela, Christian; Schulte, Achim

    2014-11-01

    In this study, the likelihood functions for uncertainty analysis of hydrological models are compared and improved through the following steps: (1) the equivalence between the Nash-Sutcliffe Efficiency coefficient (NSE) and the likelihood function with Gaussian independent and identically distributed residuals is proved; (2) a new estimation method for the Box-Cox transformation (BC) parameter is developed to improve the effective elimination of the heteroscedasticity of model residuals; and (3) three likelihood functions-NSE, Generalized Error Distribution with BC (BC-GED) and Skew Generalized Error Distribution with BC (BC-SGED)-are applied for SWAT-WB-VSA (Soil and Water Assessment Tool - Water Balance - Variable Source Area) model calibration in the Baocun watershed, Eastern China. The performances of the calibrated models are compared using the observed river discharges and groundwater levels. The results show that the minimum variance constraint can effectively estimate the BC parameter. The form of the likelihood function significantly affects the calibrated parameters and the simulated results of the high- and low-flow components. SWAT-WB-VSA with the NSE approach simulates floods well but baseflow badly, owing to the assumption of a Gaussian error distribution, which assigns low probability to large errors but nearly equal probability to all small errors around zero. By contrast, SWAT-WB-VSA with the BC-GED or BC-SGED approach mimics baseflow well, as confirmed by the groundwater level simulation. The assumption of skewness of the error distribution may be unnecessary, because all the results of the BC-SGED approach are nearly the same as those of the BC-GED approach.
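
    The NSE-likelihood equivalence in step (1) is easy to verify numerically. The sketch below, on synthetic discharge data with an assumed Box-Cox exponent, shows that once the error variance is profiled out of the Gaussian log-likelihood, a ranking of models by NSE and a ranking by likelihood coincide; none of the numbers come from the Baocun case study.

      import numpy as np

      def nse(obs, sim):
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def gaussian_profile_loglik(obs, sim):
          # Gaussian i.i.d. errors with the ML variance plugged back in.
          n = len(obs)
          sigma2 = np.sum((obs - sim) ** 2) / n
          return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)

      def box_cox(x, lam):
          # Transform used in the study to reduce residual heteroscedasticity.
          return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

      rng = np.random.default_rng(3)
      obs = rng.gamma(2.0, 5.0, 365)                 # synthetic daily discharge
      sim_good = obs * rng.lognormal(0.0, 0.1, obs.size)
      sim_poor = obs * rng.lognormal(0.0, 0.4, obs.size)
      for sim in (sim_good, sim_poor):
          o, s = box_cox(obs, 0.3), box_cox(sim, 0.3)   # lam = 0.3 assumed
          print(round(nse(o, s), 3), round(gaussian_profile_loglik(o, s), 1))
      # The better simulation gets both the higher NSE and the higher
      # log-likelihood: the two criteria order models identically.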

  14. Progressive Damage Analysis of Bonded Composite Joints

    NASA Technical Reports Server (NTRS)

    Leone, Frank A., Jr.; Girolamo, Donato; Davila, Carlos G.

    2012-01-01

    The present work is related to the development and application of progressive damage modeling techniques to bonded joint technology. The joint designs studied in this work include a conventional composite splice joint and a NASA-patented durable redundant joint. Both designs involve honeycomb sandwich structures with carbon/epoxy facesheets joined using adhesively bonded doublers. Progressive damage modeling allows for the prediction of the initiation and evolution of damage within a structure. For structures that include multiple material systems, such as the joint designs under consideration, the number of potential failure mechanisms that must be accounted for drastically increases the complexity of the analyses. Potential failure mechanisms include fiber fracture, intraply matrix cracking, delamination, core crushing, adhesive failure, and their interactions. The bonded joints were modeled using highly parametric, explicitly solved finite element models, with damage modeling implemented via custom user-written subroutines. Each ply was discretely meshed using three-dimensional solid elements. Layers of cohesive elements were included between each ply to account for the possibility of delaminations and were used to model the adhesive layers forming the joint. Good correlation with experimental results was achieved both in terms of load-displacement history and the predicted failure mechanism(s).

  15. A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood.

    PubMed

    Enström, Rickard; Schmaltz, Rodney

    2017-01-01

    From a marketing perspective, there has been substantial interest in the role of risk perception in consumer behavior. Specific 'problem music' like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals' risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music while measuring risk-taking likelihood with a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. By means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in the areas of health and safety.

  16. The Joint Physics Analysis Center: Recent results

    NASA Astrophysics Data System (ADS)

    Fernández-Ramírez, César

    2016-10-01

    We review some of the recent achievements of the Joint Physics Analysis Center, a theoretical collaboration with ties to experimental collaborations that aims to provide amplitudes suitable for the analysis of current and forthcoming experimental data on hadron physics. Since its foundation in 2013, the group has focused on hadron spectroscopy in preparation for the forthcoming high-statistics and high-precision experimental data from the BELLEII, BESIII, CLAS12, COMPASS, GlueX, LHCb and (hopefully) PANDA collaborations. So far, we have developed amplitudes for πN scattering, KN scattering, pion and J/ψ photoproduction, two-kaon photoproduction and three-body decays of light mesons (η, ω, ϕ). The codes for the amplitudes are available for download from the group web page and can be straightforwardly incorporated into the analysis of experimental data.

  17. IMU-Based Joint Angle Measurement for Gait Analysis

    PubMed Central

    Seel, Thomas; Raisch, Jorg; Schauer, Thomas

    2014-01-01

    This contribution is concerned with joint angle calculation based on inertial measurement data in the context of human motion analysis. Unlike most robotic devices, the human body lacks even surfaces and right angles. Therefore, we focus on methods that avoid assuming certain orientations in which the sensors are mounted with respect to the body segments. After a review of available methods that may cope with this challenge, we present a set of new methods for: (1) joint axis and position identification; and (2) flexion/extension joint angle measurement. In particular, we propose methods that use only gyroscopes and accelerometers and, therefore, do not rely on a homogeneous magnetic field. We provide results from gait trials of a transfemoral amputee in which we compare the inertial measurement unit (IMU)-based methods to an optical 3D motion capture system. Unlike most authors, we place the optical markers on anatomical landmarks instead of attaching them to the IMUs. Root mean square errors of the knee flexion/extension angles are found to be less than 1° on the prosthesis and about 3° on the human leg. For the plantar/dorsiflexion of the ankle, both deviations are about 1°. PMID:24743160
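
    A compressed sketch of the gyroscope-only part of such a method: with the joint axis known in each sensor frame, the flexion/extension rate is the difference of the two angular rates projected onto that axis, and the angle follows by integration. The axis-identification step and the accelerometer-based drift correction described in the paper are omitted, and the signals below are synthetic.

      import numpy as np

      dt = 0.01                                   # 100 Hz IMU sampling (assumed)
      t = np.arange(0.0, 5.0, dt)
      true_angle = 30.0 * np.sin(2 * np.pi * 0.5 * t)   # synthetic knee angle (deg)
      true_rate = np.gradient(true_angle, dt)

      j1 = np.array([0.0, 0.0, 1.0])              # joint axis in sensor-1 coordinates
      j2 = np.array([0.0, 0.0, 1.0])              # joint axis in sensor-2 coordinates
      rng = np.random.default_rng(4)
      g1 = 0.3 * rng.standard_normal((t.size, 3))   # thigh gyro (segment static here)
      g2 = np.outer(true_rate, j2) + 0.3 * rng.standard_normal((t.size, 3))  # shank gyro

      rel_rate = g2 @ j2 - g1 @ j1                # relative rate about the joint axis
      angle = np.cumsum(rel_rate) * dt            # integrates; drifts without correction
      print(np.max(np.abs(angle - true_angle)))   # small over a short trial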

  18. Multivariate Analysis of High Through-Put Adhesively Bonded Single Lap Joints: Experimental and Workflow Protocols

    DTIC Science & Technology

    2016-06-01

    US Army Research Laboratory report TR-7696, June 2016, by Robert E Jensen, Daniel C DeSchepper, and David P Flanagan. Approved for public release; distribution unlimited.

  19. Section 9: Ground Water - Likelihood of Release

    EPA Pesticide Factsheets

    HRS training. The ground water pathway likelihood of release factor category reflects the likelihood that there has been, or will be, a release of hazardous substances in any of the aquifers underlying the site.

  20. Joint Multifractal Analysis of penetration resistance variability in an olive orchard.

    NASA Astrophysics Data System (ADS)

    Lopez-Herrera, Juan; Herrero-Tejedor, Tomas; Saa-Requejo, Antonio; Villeta, Maria; Tarquis, Ana M.

    2016-04-01

    Spatial variability of soil properties is relevant for identifying zones with physical degradation. We used descriptive statistics and multifractal analysis to characterize the spatial patterns of soil penetrometer resistance (PR) distributions and compared them at different soil depths and soil water contents to investigate the effect of tillage on soil compaction. The study was conducted on an Inceptisol dedicated to an olive orchard for the last 70 years. Two parallel transects of 64 m were selected as different soil management plots, conventional tillage (CT) and no tillage (NT). Penetrometer resistance readings were taken at 50 cm intervals within the first 20 cm of soil depth (López de Herrera et al., 2015a). A two-way ANOVA highlighted that the tillage system, soil depth and their interaction are statistically significant in explaining the variance of the PR data. The comparison of CT and NT results at different depths showed significant differences below 10 cm but not in the first two soil layers. The scaling properties of each PR profile were characterized by the τ(q) function, calculated for moment orders (q) between -5 and +5 in increments of 0.5. Several parameters were calculated from this function to establish different comparisons (López de Herrera et al., 2015b). While multifractal analysis characterizes the distribution of a single variable along its spatial support, joint multifractal analysis can be used to characterize the joint distribution of two or more variables along a common spatial support (Kravchenko et al., 2000; Zeleke and Si, 2004). This type of analysis was performed to study the scaling properties of the joint distribution of PR at different depths. The results showed that this type of analysis adds valuable information to describe the spatial arrangement of depth-dependent penetrometer data sets in all the soil layers. References: Kravchenko AN, Bullock DG, Boast CW (2000) Joint multifractal
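
    For readers unfamiliar with the τ(q) step mentioned above, the sketch below estimates it for a single synthetic PR transect: the readings are normalized to a measure, q-th moments are summed over boxes of increasing size, and τ(q) is the log-log slope of those moment sums. Transect length, box sizes, and data are invented; the joint (two-variable) version extends the same moment sums to pairs of measures on a common support.

      import numpy as np

      rng = np.random.default_rng(5)
      pr = rng.lognormal(0.0, 0.5, 128)        # synthetic penetrometer readings
      mu = pr / pr.sum()                       # normalized measure on the transect

      qs = np.arange(-5.0, 5.5, 0.5)           # moment orders, as in the study
      sizes = [1, 2, 4, 8, 16, 32]             # box sizes (number of readings)
      tau = []
      for q in qs:
          log_chi, log_eps = [], []
          for s in sizes:
              boxes = mu[: len(mu) // s * s].reshape(-1, s).sum(axis=1)
              boxes = boxes[boxes > 0]
              log_chi.append(np.log(np.sum(boxes ** q)))
              log_eps.append(np.log(s / len(mu)))
          tau.append(np.polyfit(log_eps, log_chi, 1)[0])   # slope = tau(q)
      print(dict(zip(qs[::4], np.round(tau[::4], 2))))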

  1. Physical constraints on the likelihood of life on exoplanets

    NASA Astrophysics Data System (ADS)

    Lingam, Manasvi; Loeb, Abraham

    2018-04-01

    One of the most fundamental questions in exoplanetology is to determine whether a given planet is habitable. We estimate the relative likelihood of a planet's propensity towards habitability by considering key physical characteristics such as the role of temperature on ecological and evolutionary processes, and atmospheric losses via hydrodynamic escape and stellar wind erosion. From our analysis, we demonstrate that Earth-sized exoplanets in the habitable zone around M-dwarfs seemingly display much lower prospects of being habitable relative to Earth, owing to the higher incident ultraviolet fluxes and closer distances to the host star. We illustrate our results by specifically computing the likelihood (of supporting life) for the recently discovered exoplanets, Proxima b and TRAPPIST-1e, which we find to be several orders of magnitude smaller than that of Earth.

  2. Experiments and kinematics analysis of a hand rehabilitation exoskeleton with circuitous joints.

    PubMed

    Zhang, Fuhai; Fu, Yili; Zhang, Qinchao; Wang, Shuguo

    2015-01-01

    Aiming at the hand rehabilitation of stroke patients, a wearable hand exoskeleton with circuitous joints is proposed. The circuitous joint adopts a symmetric pinion and rack mechanism (SPRM) with a parallel mechanism. The exoskeleton finger is a serial mechanism composed of three closed-chain SPRM joints in series. The kinematic equations of the open chain of the finger and the closed chains of the SPRM joints were built to analyze the kinematics of the hand rehabilitation exoskeleton. An experimental setup of the hand rehabilitation exoskeleton was built, and a continuous passive motion (CPM) rehabilitation experiment and a test of human-robot interaction force measurement were conducted. The experimental results show that the mechanical design of the hand rehabilitation robot is reasonable and that the kinematic analysis is correct, so the exoskeleton can be used for the hand rehabilitation of stroke patients.

  3. Neuroimaging of Reading Intervention: A Systematic Review and Activation Likelihood Estimate Meta-Analysis

    PubMed Central

    Barquero, Laura A.; Davis, Nicole; Cutting, Laurie E.

    2014-01-01

    A growing number of studies examine instructional training and brain activity. The purpose of this paper is to review the literature regarding neuroimaging of reading intervention, with a particular focus on reading difficulties (RD). To locate relevant studies, searches of peer-reviewed literature were conducted using electronic databases to search for studies from the imaging modalities of fMRI and MEG (including MSI) that explored reading intervention. Of the 96 identified studies, 22 met the inclusion criteria for descriptive analysis. A subset of these (8 fMRI experiments with post-intervention data) was subjected to activation likelihood estimate (ALE) meta-analysis to investigate differences in functional activation following reading intervention. Findings from the literature review suggest differences in functional activation of numerous brain regions associated with reading intervention, including bilateral inferior frontal, superior temporal, middle temporal, middle frontal, superior frontal, and postcentral gyri, as well as bilateral occipital cortex, inferior parietal lobules, thalami, and insulae. Findings from the meta-analysis indicate change in functional activation following reading intervention in the left thalamus, right insula/inferior frontal, left inferior frontal, right posterior cingulate, and left middle occipital gyri. Though these findings should be interpreted with caution due to the small number of studies and the disparate methodologies used, this paper is an effort to synthesize across studies and to guide future exploration of neuroimaging and reading intervention. PMID:24427278

  4. On Muthen's Maximum Likelihood for Two-Level Covariance Structure Models

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Hayashi, Kentaro

    2005-01-01

    Data in social and behavioral sciences are often hierarchically organized. Special statistical procedures that take into account the dependence of such observations have been developed. Among procedures for 2-level covariance structure analysis, Muthen's maximum likelihood (MUML) has the advantage of easier computation and faster convergence. When…

  5. Empirical likelihood inference in randomized clinical trials.

    PubMed

    Zhang, Biao

    2017-01-01

    In individually randomized controlled trials, in addition to the primary outcome, information is often available on a number of covariates prior to randomization. This information is frequently utilized to undertake adjustment for baseline characteristics in order to increase the precision of the estimation of average treatment effects; such adjustment is usually performed via covariate adjustment in outcome regression models. Although the use of covariate adjustment is widely seen as desirable for making treatment effect estimates more precise and the corresponding hypothesis tests more powerful, there are considerable concerns that objective inference in randomized clinical trials can potentially be compromised. In this paper, we study an empirical likelihood approach to covariate adjustment and propose two unbiased estimating functions that automatically decouple the evaluation of average treatment effects from regression modeling of covariate-outcome relationships. The resulting empirical likelihood estimator of the average treatment effect is as efficient as the existing efficient adjusted estimators when separate treatment-specific working regression models are correctly specified, and is at least as efficient as the existing efficient adjusted estimators for any given treatment-specific working regression models, whether or not they coincide with the true treatment-specific covariate-outcome relationships. We present a simulation study comparing the finite sample performance of various methods, along with some results on the analysis of a data set from an HIV clinical trial. The simulation results indicate that the proposed empirical likelihood approach is more efficient and powerful than its competitors when the working covariate-outcome relationships by treatment status are misspecified.

  6. Analysis of bonded joints. [shear stress and stress-strain diagrams

    NASA Technical Reports Server (NTRS)

    Srinivas, S.

    1975-01-01

    A refined elastic analysis of bonded joints which accounts for transverse shear deformation and transverse normal stress was developed to obtain the stresses and displacements in the adherends and in the bond. The displacements were expanded in terms of polynomials in the thicknesswise coordinate; the coefficients of these polynomials were functions of the axial coordinate. The stress distribution was obtained in terms of these coefficients by using strain-displacement and stress-strain relations. The governing differential equations were obtained by integrating the equations of equilibrium, and were solved. The boundary conditions (interface or support) were satisfied to complete the analysis. Single-lap, flush, and double-lap joints were analyzed, along with the effects of adhesive properties, plate thicknesses, material properties, and plate taper on maximum peel and shear stresses in the bond. The results obtained by using the thin-beam analysis available in the literature were compared with the results obtained by using the refined analysis. In general, thin-beam analysis yielded reasonably accurate results, but in certain cases the errors were high. Numerical investigations showed that the maximum peel and shear stresses in the bond can be reduced by (1) using a combination of flexible and stiff bonds, (2) using stiffer lap plates, and (3) tapering the plates.

  7. A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood

    PubMed Central

    Enström, Rickard; Schmaltz, Rodney

    2017-01-01

    From a marketing perspective, there has been substantial interest in the role of risk perception in consumer behavior. Specific ‘problem music’ like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals’ risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music while measuring risk-taking likelihood with a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. By means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in the areas of health and safety. PMID:28539908

  8. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    PubMed

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and the current shortage of, multivariate analysis of natural disasters, and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and a bivariate definition of severe dust storms, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
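
    The bivariate step can be sketched with a Gumbel copula, a common choice for positively dependent extremes (the abstract does not state which copula family was fitted, so the family, marginal probabilities, dependence parameter, and event frequency below are all assumptions). The joint "AND" return period, where both wind speed and duration exceed their thresholds, follows from the copula by inclusion-exclusion.

      import numpy as np

      def gumbel_copula(u, v, theta):
          return np.exp(-(((-np.log(u)) ** theta
                           + (-np.log(v)) ** theta) ** (1.0 / theta)))

      def joint_and_return_period(u, v, theta, events_per_year=4.0):
          # P(both thresholds exceeded) = 1 - u - v + C(u, v).
          p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
          return 1.0 / (events_per_year * p_and)

      u = v = 0.95     # marginal non-exceedance probabilities of the two thresholds
      print(round(joint_and_return_period(u, v, theta=2.0), 1))   # dependent case
      print(round(joint_and_return_period(u, v, theta=1.0), 1))   # independence
      # The dependent joint return period is much shorter, so ignoring the
      # dependence between wind speed and duration overstates how rare the
      # compound event is.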

  9. Maximum likelihood sequence estimation for optical complex direct modulation.

    PubMed

    Che, Di; Yuan, Feng; Shieh, William

    2017-04-17

    Semiconductor lasers are versatile optical transmitters in nature. Through direct modulation (DM), intensity modulation is realized by the linear mapping between the injection current and the light power, while various angle modulations are enabled by the frequency chirp. Limited by direct detection, DM lasers used to be exploited only as 1-D (intensity or angle) transmitters, with the other modulation suppressed or simply ignored. Nevertheless, with digital coherent detection, simultaneous intensity and angle modulation (namely, 2-D complex DM, CDM) can be realized by a single laser diode. The crucial technique of CDM is the joint demodulation of intensity and differential phase with maximum likelihood sequence estimation (MLSE), supported by a closed-form discrete signal approximation of the frequency chirp to characterize the MLSE transition probability. This paper proposes a statistical method for the transition probability that significantly enhances the accuracy of the chirp model. Using the statistical estimation, we demonstrate the first single-channel 100-Gb/s PAM-4 transmission over 1600-km fiber with only 10G-class DM lasers.

  10. Gray matter atrophy in narcolepsy: An activation likelihood estimation meta-analysis.

    PubMed

    Weng, Hsu-Huei; Chen, Chih-Feng; Tsai, Yuan-Hsiung; Wu, Chih-Ying; Lee, Meng; Lin, Yu-Ching; Yang, Cheng-Ta; Tsai, Ying-Huang; Yang, Chun-Yuh

    2015-12-01

    The authors reviewed the literature on the use of voxel-based morphometry (VBM) in narcolepsy magnetic resonance imaging (MRI) studies via a neuroimaging meta-analysis to identify concordant and specific structural deficits in patients with narcolepsy as compared with healthy subjects. We used PubMed to retrieve articles published between January 2000 and March 2014. The authors included all VBM research on narcolepsy and compared the findings of the studies by using gray matter volume (GMV) or gray matter concentration (GMC) to index differences in gray matter. Stereotactic data were extracted from 8 VBM studies of 149 narcoleptic patients and 162 control subjects. We applied the activation likelihood estimation (ALE) technique and found significant regional gray matter reduction in the bilateral hypothalamus, thalamus, globus pallidus, extending to the nucleus accumbens (NAcc) and anterior cingulate cortex (ACC), left mid orbital and rectal gyri (BAs 10 and 11), right inferior frontal gyrus (BA 47), and the right superior temporal gyrus (BA 41) in patients with narcolepsy. The significant gray matter deficits in narcoleptic patients occurred in the bilateral hypothalamus and frontotemporal regions, which may be related to the emotional processing abnormalities and orexin/hypocretin pathway common among populations of patients with narcolepsy. Copyright © 2015. Published by Elsevier Ltd.

  11. Empirical likelihood method for non-ignorable missing data problems.

    PubMed

    Guan, Zhong; Qin, Jing

    2017-01-01

    The missing-response problem is ubiquitous in survey sampling, medical, social science and epidemiology studies. It is well known that non-ignorable missingness is the most difficult missing data problem, in which the missingness of a response depends on its own value. In the statistical literature, unlike for the ignorable missing data problem, not many papers on non-ignorable missing data are available, except for fully parametric model-based approaches. In this paper we study a semiparametric model for non-ignorable missing data in which the missing probability is known up to some parameters, but the underlying distributions are not specified. By employing Owen's (1988) empirical likelihood method, we obtain the constrained maximum empirical likelihood estimators of the parameters in the missing probability and the mean response, which are shown to be asymptotically normal. Moreover, the likelihood ratio statistic can be used to test whether the missingness of the responses is non-ignorable or completely at random. The theoretical results are confirmed by a simulation study. As an illustration, the analysis of real AIDS trial data shows that the missingness of CD4 counts at around two years is non-ignorable and that the sample mean based on the observed data only is biased.
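
    The Owen-style machinery referenced here reduces, in its simplest complete-data form, to solving for one Lagrange multiplier. The sketch below tests a candidate mean with the empirical log-likelihood ratio against the chi-square(1) cutoff; the data are synthetic, and the paper's non-ignorable missingness weighting is omitted.

      import numpy as np
      from scipy.optimize import brentq

      def log_el_ratio(x, mu):
          # Empirical likelihood for the mean: weights p_i = 1/(n*(1+lam*(x_i-mu))).
          # mu must lie strictly inside the range of the data.
          d = x - mu
          lo = (-1.0 + 1e-8) / d.max()     # keep all 1 + lam*d_i strictly positive
          hi = (-1.0 + 1e-8) / d.min()
          lam = brentq(lambda l: np.sum(d / (1.0 + l * d)), lo, hi)
          return -np.sum(np.log(1.0 + lam * d))    # log EL ratio, always <= 0

      rng = np.random.default_rng(6)
      x = rng.exponential(2.0, 80)
      for mu in (1.6, 2.0, 2.6):           # candidate values for the mean
          stat = -2.0 * log_el_ratio(x, mu)
          print(mu, round(stat, 2), "reject" if stat > 3.841 else "keep")  # 95% test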

  12. Two‐phase designs for joint quantitative‐trait‐dependent and genotype‐dependent sampling in post‐GWAS regional sequencing

    PubMed Central

    Espin‐Garcia, Osvaldo; Craiu, Radu V.

    2017-01-01

    We evaluate two‐phase designs to follow up findings from a genome‐wide association study (GWAS) when the cost of regional sequencing in the entire cohort is prohibitive. We develop novel expectation‐maximization‐based inference under a semiparametric maximum likelihood formulation tailored for post‐GWAS inference. A GWAS‐SNP (where SNP is single nucleotide polymorphism) serves as a surrogate covariate in inferring association between a sequence variant and a normally distributed quantitative trait (QT). We assess test validity and quantify efficiency and power of joint QT‐SNP‐dependent sampling and analysis under alternative sample allocations by simulations. Joint allocation balanced on SNP genotype and extreme‐QT strata yields significant power improvements compared to marginal QT‐ or SNP‐based allocations. We illustrate the proposed method and evaluate the sensitivity of sample allocation to sampling variation using data from a sequencing study of systolic blood pressure. PMID:29239496

  13. Maintained Individual Data Distributed Likelihood Estimation (MIDDLE)

    PubMed Central

    Boker, Steven M.; Brick, Timothy R.; Pritikin, Joshua N.; Wang, Yang; von Oertzen, Timo; Brown, Donald; Lach, John; Estabrook, Ryne; Hunter, Michael D.; Maes, Hermine H.; Neale, Michael C.

    2015-01-01

    Maintained Individual Data Distributed Likelihood Estimation (MIDDLE) is a novel paradigm for research in the behavioral, social, and health sciences. The MIDDLE approach is based on the seemingly-impossible idea that data can be privately maintained by participants and never revealed to researchers, while still enabling statistical models to be fit and scientific hypotheses tested. MIDDLE rests on the assumption that participant data should belong to, be controlled by, and remain in the possession of the participants themselves. Distributed likelihood estimation refers to fitting statistical models by sending an objective function and vector of parameters to each participants’ personal device (e.g., smartphone, tablet, computer), where the likelihood of that individual’s data is calculated locally. Only the likelihood value is returned to the central optimizer. The optimizer aggregates likelihood values from responding participants and chooses new vectors of parameters until the model converges. A MIDDLE study provides significantly greater privacy for participants, automatic management of opt-in and opt-out consent, lower cost for the researcher and funding institute, and faster determination of results. Furthermore, if a participant opts into several studies simultaneously and opts into data sharing, these studies automatically have access to individual-level longitudinal data linked across all studies. PMID:26717128
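
    A toy sketch of the MIDDLE computation pattern, with a simple normal model standing in for a real behavioral model: each participant's device evaluates the negative log-likelihood of its own private data and returns only that scalar, and the central optimizer aggregates and iterates. Everything here (model, data, sizes) is invented for illustration.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(7)
      # Private data held "on device"; in MIDDLE these never leave the participant.
      devices = [rng.normal(50.0, 10.0, rng.integers(20, 40)) for _ in range(25)]

      def local_negloglik(data, theta):
          # Runs on the participant's device; returns a single number.
          mu, log_sigma = theta
          sigma = np.exp(log_sigma)              # keep sigma positive
          return np.sum(0.5 * ((data - mu) / sigma) ** 2 + np.log(sigma))

      def aggregate(theta):
          # Only these scalar likelihood values cross the network.
          return sum(local_negloglik(d, theta) for d in devices)

      fit = minimize(aggregate, x0=[40.0, 2.0], method="Nelder-Mead")
      print(fit.x[0], np.exp(fit.x[1]))          # roughly (50, 10)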

  14. Analysis of pain and painless symptoms in temporomandibular joints dysfunction in adult patients.

    PubMed

    Górecka, Małgorzata; Pihut, Małgorzata; Kulesa-Mrowiecka, Małgorzata

    2017-01-01

    Recent years have shown an increase in the number of patients presenting for treatment of pain of musculoskeletal origin associated with temporomandibular joint dysfunction. Studies were therefore undertaken to analyze the symptoms of the dysfunction for which patients seek prosthetic treatment. Aim of the thesis: The aim of the study was a retrospective analysis of the symptoms of temporomandibular joint dysfunction reported by patients diagnosed with this problem. The research material consisted of the retrospective medical records of 120 patients, aged 19 to 45 years, who underwent prosthetic treatment for temporomandibular joint dysfunction in the consulting room of the Prosthetics Department in Kraków from June 2015 to December 2016. In addition to the physician's interview, patients completed a personal survey. The material was divided into group I, patients who reported the painful form of the dysfunction, and group II, patients who had no pain symptoms within the stomatognathic system. The analysis covered the type of symptoms, the contribution of local factors (parafunctions) and systemic factors, as well as the time after which patients presented for treatment of the functional disorders since the appearance of the first symptoms. Analysis of the research material showed that the main reason patients presented was pain of significant intensity (5 to 8 on the VAS scale) in one or both temporomandibular joints, accompanied by acoustic symptoms. A large group of respondents reported problems with the range of jaw movement and head and face pain, as well as subjective symptoms involving hearing, sight, and the neck and shoulder areas.

  15. Direct Parametric Reconstruction With Joint Motion Estimation/Correction for Dynamic Brain PET Data.

    PubMed

    Jiao, Jieqing; Bousse, Alexandre; Thielemans, Kris; Burgos, Ninon; Weston, Philip S J; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Markiewicz, Pawel; Ourselin, Sebastien

    2017-01-01

    Direct reconstruction of parametric images from raw photon counts has been shown to improve the quantitative analysis of dynamic positron emission tomography (PET) data. However, it suffers from subject motion, which is inevitable during the typical acquisition time of 1-2 hours. In this work we propose a framework to jointly estimate subject head motion and reconstruct the motion-corrected parametric images directly from raw PET data, so that the effects of distorted tissue-to-voxel mapping due to subject motion can be reduced in reconstructing the parametric images with motion-compensated attenuation correction and spatially aligned temporal PET data. The proposed approach is formulated within the maximum likelihood framework, and efficient solutions are derived for estimating subject motion and kinetic parameters from raw PET photon count data. Results from evaluations on simulated [11C]raclopride data using the Zubal brain phantom and real clinical [18F]florbetapir data of a patient with Alzheimer's disease show that the proposed joint direct parametric reconstruction and motion correction approach can improve the accuracy of quantifying dynamic PET data with large subject motion.

  16. Should I Text or Call Here? A Situation-Based Analysis of Drivers' Perceived Likelihood of Engaging in Mobile Phone Multitasking.

    PubMed

    Oviedo-Trespalacios, Oscar; Haque, Md Mazharul; King, Mark; Washington, Simon

    2018-05-29

    This study investigated how situational characteristics typically encountered in the transport system influence drivers' perceived likelihood of engaging in mobile phone multitasking. The impacts of mobile phone tasks, perceived environmental complexity/risk, and drivers' individual differences were evaluated as relevant individual predictors within the behavioral adaptation framework. An innovative questionnaire, which includes randomized textual and visual scenarios, was administered to collect data from a sample of 447 drivers in South East Queensland-Australia (66% females; n = 296). The likelihood of engaging in a mobile phone task across various scenarios was modeled by a random parameters ordered probit model. Results indicated that drivers who are female, are frequent users of phones for texting/answering calls, have less favorable attitudes towards safety, and are highly disinhibited were more likely to report stronger intentions of engaging in mobile phone multitasking. However, more years with a valid driving license, self-efficacy toward self-regulation in demanding traffic conditions and police enforcement, texting tasks, and demanding traffic conditions were negatively related to self-reported likelihood of mobile phone multitasking. The unobserved heterogeneity warned of riskier groups among female drivers and participants who need a lot of convincing to believe that multitasking while driving is dangerous. This research concludes that behavioral adaptation theory is a robust framework explaining self-regulation of distracted drivers. © 2018 Society for Risk Analysis.
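
    For orientation, the fixed-parameter core of such a model can be sketched briefly; the paper's random-parameters extension additionally draws coefficients per respondent, which is omitted here. The single predictor, thresholds, and data below are invented.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      rng = np.random.default_rng(8)
      n = 500
      x = rng.standard_normal(n)                   # one scenario covariate (invented)
      y_star = 0.8 * x + rng.standard_normal(n)    # latent engagement propensity
      cuts_true = np.array([-0.5, 0.7])
      y = np.digitize(y_star, cuts_true)           # ordinal response: 0, 1, 2

      def negloglik(theta):
          beta, c1, dc = theta
          cuts = np.array([c1, c1 + np.exp(dc)])   # keep thresholds ordered
          eta = beta * x
          upper = np.append(cuts, np.inf)[y] - eta
          lower = np.append(-np.inf, cuts)[y] - eta
          p = norm.cdf(upper) - norm.cdf(lower)    # ordered-probit cell probabilities
          return -np.sum(np.log(np.clip(p, 1e-12, None)))

      fit = minimize(negloglik, x0=[0.0, -1.0, 0.0], method="Nelder-Mead")
      beta, c1, dc = fit.x
      print(beta, c1, c1 + np.exp(dc))             # roughly (0.8, -0.5, 0.7)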

  17. Fast maximum likelihood estimation of mutation rates using a birth-death process.

    PubMed

    Wu, Xiaowei; Zhu, Hongxiao

    2015-02-07

    Since fluctuation analysis was first introduced by Luria and Delbrück in 1943, it has been widely used to make inference about spontaneous mutation rates in cultured cells. Under certain model assumptions, the probability distribution of the number of mutants that appear in a fluctuation experiment can be derived explicitly, which provides the basis of mutation rate estimation. It has been shown that, among various existing estimators, the maximum likelihood estimator usually demonstrates desirable properties such as consistency and lower mean squared error. However, its application to real experimental data is often hindered by slow computation of the likelihood due to the recursive form of the mutant-count distribution. We propose a fast maximum likelihood estimator of mutation rates, MLE-BD, based on a birth-death process model under a non-differential growth assumption. Simulation studies demonstrate that, compared with the conventional maximum likelihood estimator derived from the Luria-Delbrück distribution, MLE-BD achieves substantial improvement in computational speed and is applicable to arbitrarily large numbers of mutants. In addition, it retains good accuracy in point estimation. Published by Elsevier Ltd.
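
    For orientation, the conventional estimator the authors benchmark against can be sketched as follows: the Luria-Delbrück mutant-count pmf is computed by the Ma-Sandri-Sarkar recursion (quoted here from memory, so worth checking against the original), and the expected number of mutations m is found by a one-dimensional likelihood search.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def ld_pmf(m, n_max):
    """Luria-Delbruck mutant-count pmf p_0..p_{n_max} via the
    Ma-Sandri-Sarkar recursion: p_n = (m/n) * sum_k p_k / ((n-k)(n-k+1))."""
    p = np.zeros(n_max + 1)
    p[0] = np.exp(-m)
    for n in range(1, n_max + 1):
        k = np.arange(n)
        p[n] = (m / n) * np.sum(p[k] / ((n - k) * (n - k + 1.0)))
    return p

def mle_expected_mutations(counts):
    """ML estimate of m from observed mutant counts in parallel cultures."""
    counts = np.asarray(counts)
    def nll(m):
        p = ld_pmf(m, counts.max())
        return -np.sum(np.log(np.clip(p[counts], 1e-300, None)))
    return minimize_scalar(nll, bounds=(1e-6, 50.0), method="bounded").x

m_hat = mle_expected_mutations([0, 1, 0, 3, 27, 2, 0, 1, 5, 0])  # toy data
```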

  18. Likelihood analysis of the chalcone synthase genes suggests the role of positive selection in morning glories (Ipomoea).

    PubMed

    Yang, Ji; Gu, Hongya; Yang, Ziheng

    2004-01-01

    Chalcone synthase (CHS) is a key enzyme in the biosynthesis of flavonoids, which are important for the pigmentation of flowers and act as attractants to pollinators. Genes encoding CHS constitute a multigene family in which the copy number varies among plant species and functional divergence appears to have occurred repeatedly. In morning glories (Ipomoea), five functional CHS genes (A-E) have been described. Phylogenetic analysis of the Ipomoea CHS gene family revealed that CHS A, B, and C experienced accelerated rates of amino acid substitution relative to CHS D and E. To examine whether the CHS genes of the morning glories underwent adaptive evolution, maximum-likelihood models of codon substitution were used to analyze the functional sequences in the Ipomoea CHS gene family. These models used the nonsynonymous/synonymous rate ratio (ω = dN/dS) as an indicator of selective pressure and allowed the ratio to vary among lineages or sites. Likelihood ratio tests suggested significant variation in selection pressure among amino acid sites, with a small proportion of them detected to be under positive selection along the branches ancestral to CHS A, B, and C. Positive Darwinian selection appears to have promoted the divergence of subfamily ABC and subfamily DE and is at least partially responsible for a rate increase following gene duplication.
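
    The likelihood ratio test used here has a generic recipe: twice the log-likelihood gain of the richer model is referred to a chi-squared distribution whose degrees of freedom equal the number of extra parameters. A minimal helper, with placeholder log-likelihood values rather than values from the paper:

```python
from scipy.stats import chi2

def lrt(lnl_null, lnl_alt, df):
    """Likelihood ratio test: 2*(lnL1 - lnL0) ~ chi2(df) under the null."""
    stat = 2.0 * (lnl_alt - lnl_null)
    return stat, chi2.sf(stat, df)

# e.g. a codon model allowing omega to vary among sites against a one-ratio
# null; the log-likelihoods below are invented for illustration.
stat, p_value = lrt(-2410.3, -2395.8, df=2)
```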

  19. Medical image analysis of knee joint lipoma arborescens and arthroscopic treatment.

    PubMed

    Zhu, Guangyu; Tian, Xiangdong; Du, Dongfeng; Lei, Ming; Guan, Lei; Wang, Jian; Tan, Yetong; Yang, Chen; Zheng, Xinxin

    2018-06-01

    Arthroscopy is a minimally invasive surgical procedure on a joint in which examination and treatment of knee damage are performed using a surgical device known as the arthroscope. Lipoma arborescens (LA), an infrequent intra-articular lesion, originates from mature adipose cells under subsynovial tissue. The synovial membrane is pale yellow with large villous projections. It is caused by various underlying factors. Having encountered many patients with LA, we undertook this research to investigate the therapeutic effect of semi-automated arthroscopic diagnosis and treatment for the knee joint. We used a Stryker arthroscope, 4 mm in diameter with a 30° viewing angle, in surgery. Patients were chosen by biomechanical analysis and scanning mode. All of the patients underwent radiographic imaging examination, Magnetic Resonance Imaging (MRI), the Lysholm Score, and the Visual Analogue Scale (VAS). Arthroscopic limited synovectomy was carried out on these patients. The wounds of all patients healed. Follow-up covered chief complaints, range of motion of the knee joint, VAS, and Lysholm score. No swelling or effusion of the affected knee was found in any patient during follow-up. The postoperative symptoms were markedly alleviated in fourteen patients and partially alleviated in one. All patients were satisfied with the therapeutic effect. We performed biomechanical analysis based on slight knee flexion and extension. Arthroscopy is an endoscopic technique for the diagnosis and treatment of joint diseases, and semi-automated arthroscopic debridement is effective for early and mid-term osteoarthritis with lipoma arborescens. Copyright © 2018. Published by Elsevier Ltd.

  20. Multilevel joint competing risk models

    NASA Astrophysics Data System (ADS)

    Karunarathna, G. H. S.; Sooriyarachchi, M. R.

    2017-09-01

    Joint modeling approaches are often encountered for different outcomes of competing risk time-to-event and count data in many biomedical and epidemiological studies in the presence of cluster effects. Hospital length of stay (LOS) is a widely used outcome measure of hospital utilization, as it provides a benchmark for multiple terminations such as discharge, transfer, death, and patients who have not completed the event of interest by the end of follow-up (censored) during hospitalization. Competing risk models provide a method of addressing such multiple destinations, since classical time-to-event models yield biased results when there are multiple events. In this study, the concept of joint modeling has been applied to dengue epidemiology in Sri Lanka, 2006-2008, to assess the relationship between different outcomes of LOS and the platelet count of dengue patients, with a district cluster effect. Two key approaches have been applied to build up the joint scenario. In the first approach, each competing risk is modeled separately using a binary logistic model, treating all other events as censored, under the multilevel discrete time-to-event model, while the platelet counts are assumed to follow a lognormal regression model. The second approach is based on the endogeneity effect in the multilevel competing risks and count model. Model parameters were estimated using maximum likelihood based on the Laplace approximation. Moreover, the study reveals that the joint modeling approach yields more precise results compared to fitting two separate univariate models, in terms of AIC (Akaike Information Criterion).

  1. Finite element normal mode analysis of resistance welding jointed of dissimilar plate hat structure

    NASA Astrophysics Data System (ADS)

    Nazri, N. A.; Sani, M. S. M.

    2017-10-01

    Structural joints provide the connections between structural elements (beams, plates, solids, etc.) that build up a whole assembled structure. The complex behaviour of connecting elements plays a valuable role in dynamic characteristics such as natural frequencies and mode shapes. In automotive structures, the reliability of the assembly depends strongly on its joints. In this paper, a top hat structure with a spot-welded joint of dissimilar materials, mild steel 1010 and stainless steel 304, is modelled and designed using finite element software. Different types of connector elements, such as the rigid body element (RBE2), welding joint element (CWELD), and bar element (CBAR), are applied to represent the real connection between the two dissimilar plates. Normal mode analysis is simulated with each type of joining element in order to determine the modal properties. Natural frequencies using RBE2, CBAR, and CWELD are compared to the equivalent rigid body method, and the connection giving the lowest percentage error among the three is selected as the most reliable joining model for resistance spot welds. The analysis shows that CWELD performs better than the others in terms of weld joining of dissimilar plate materials. It is expected that finite element modelling of joints plays a significant role in structural dynamics.

  2. Performance and sensitivity analysis of the generalized likelihood ratio method for failure detection. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Bueno, R. A.

    1977-01-01

    Results of the generalized likelihood ratio (GLR) technique for the detection of failures in aircraft applications are presented, and its relationship to the properties of the Kalman-Bucy filter is examined. Under the assumption that the system is perfectly modeled, the detectability and distinguishability of four failure types are investigated by means of analysis and simulations. Detection of failures is found satisfactory, but problems in correctly identifying the mode of a failure may arise. These issues are closely examined, as is the sensitivity of GLR to modeling errors. The advantages and disadvantages of this technique are discussed, and various modifications are suggested to reduce its limitations in performance and computational complexity.
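
    A scalar caricature of the GLR detector conveys the mechanics: for each hypothesized failure onset, the likelihood ratio against the no-failure model is maximized over the unknown bias, which for a constant bias in white Gaussian residuals reduces to a normalized squared cumulative sum. The sketch below assumes known innovation variance and is not the thesis's aircraft-specific implementation.

```python
import numpy as np

def glr_constant_bias(innovations, sigma2=1.0):
    """GLR for a constant bias appearing at an unknown onset k in otherwise
    zero-mean white Gaussian innovations: l(k) = S_k^2 / (sigma2 * n_k),
    with S_k the sum of innovations from k onward and n_k their count."""
    r = np.asarray(innovations, dtype=float)
    n = len(r)
    stats = np.array([r[k:].sum() ** 2 / (sigma2 * (n - k)) for k in range(n - 1)])
    k_hat = int(np.argmax(stats))
    return k_hat, stats[k_hat]   # compare the maximum against a decision threshold
```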

  3. Failure Analysis in Space: International Space Station (ISS) Starboard Solar Alpha Rotary Joint (SARJ) Debris Analysis

    NASA Technical Reports Server (NTRS)

    Long, V. S.; Wright, M. C.; McDanels, S. J.; Lubas, D.; Tucker, B.; Marciniak, P. J.

    2010-01-01

    This slide presentation reviews the debris analysis of the Starboard Solar Alpha Rotary Joint (SARJ), a mechanism that is designed to keep the solar arrays facing the sun. The goal of this analysis was to identify the failure mechanism based on surface morphology and to determine the source of debris through elemental and particle analysis.

  4. Bivariate categorical data analysis using normal linear conditional multinomial probability model.

    PubMed

    Sun, Bingrui; Sutradhar, Brajendra

    2015-02-10

    Bivariate multinomial data such as the left and right eyes retinopathy status data are analyzed either by using a joint bivariate probability model or by exploiting certain odds ratio-based association models. However, the joint bivariate probability model yields marginal probabilities, which are complicated functions of marginal and association parameters for both variables, and the odds ratio-based association model treats the odds ratios involved in the joint probabilities as 'working' parameters, which are consequently estimated through certain arbitrary 'working' regression models. Also, this latter odds ratio-based model does not provide any easy interpretation of the correlations between the two categorical variables. On the basis of pre-specified marginal probabilities, in this paper, we develop a bivariate normal type linear conditional multinomial probability model to understand the correlations between two categorical variables. The parameters involved in the model are consistently estimated using the optimal likelihood and generalized quasi-likelihood approaches. The proposed model and the inferences are illustrated through an intensive simulation study as well as an analysis of the well-known Wisconsin Diabetic Retinopathy status data. Copyright © 2014 John Wiley & Sons, Ltd.

  5. Partially linear mixed-effects joint models for skewed and missing longitudinal competing risks outcomes.

    PubMed

    Lu, Tao; Lu, Minggen; Wang, Min; Zhang, Jun; Dong, Guang-Hui; Xu, Yong

    2017-12-18

    Longitudinal competing risks data frequently arise in clinical studies. Skewness and missingness are commonly observed for these data in practice. However, most joint models do not account for these data features. In this article, we propose partially linear mixed-effects joint models to analyze skewed longitudinal competing risks data with missingness. In particular, to account for skewness, we replace the commonly assumed symmetric distributions with an asymmetric distribution for model errors. To deal with missingness, we employ an informative missing data model. Joint models that couple the partially linear mixed-effects model for the longitudinal process, the cause-specific proportional hazards model for the competing risks process, and the missing data process are developed. To estimate the parameters in the joint models, we propose a fully Bayesian approach based on the joint likelihood. To illustrate the proposed model and method, we apply them to an AIDS clinical study. Some interesting findings are reported. We also conduct simulation studies to validate the proposed method.

  6. Bounded influence function based inference in joint modelling of ordinal partial linear model and accelerated failure time model.

    PubMed

    Chakraborty, Arindom

    2016-12-01

    A common objective in longitudinal studies is to characterize the relationship between a longitudinal response process and time-to-event data. The ordinal nature of the response and possible missing information on covariates add complications to the joint model. In such circumstances, some influential observations often present in the data may upset the analysis. In this paper, a joint model based on an ordinal partial mixed model and an accelerated failure time model is used, to account for the repeated ordered response and time-to-event data, respectively. We propose an influence function-based robust estimation method. A Monte Carlo expectation-maximization-based algorithm is used for parameter estimation. A detailed simulation study has been done to evaluate the performance of the proposed method. As an application, data on muscular dystrophy among children are used. Robust estimates are then compared with classical maximum likelihood estimates. © The Author(s) 2014.

  7. Maximum-Likelihood Detection Of Noncoherent CPM

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors are proposed for use in maximum-likelihood sequence detection of symbols in an alphabet of size M transmitted by uncoded, full-response continuous phase modulation (CPM) over a radio channel with additive white Gaussian noise. The structures of the receivers are derived from a particular interpretation of the maximum-likelihood metrics. The receivers include front ends whose structures depend only on M, analogous to those in receivers of coherent CPM. The parts of the receivers following the front ends have structures whose complexity depends on N.

  8. Joint analysis of binary and quantitative traits with data sharing and outcome-dependent sampling.

    PubMed

    Zheng, Gang; Wu, Colin O; Kwak, Minjung; Jiang, Wenhua; Joo, Jungnam; Lima, Joao A C

    2012-04-01

    We study the analysis of a joint association between a genetic marker and both binary (case-control) and quantitative (continuous) traits, where the quantitative trait values are only available for the cases due to data sharing and outcome-dependent sampling. Data sharing has become common in genetic association studies, and outcome-dependent sampling is the consequence of data sharing, under which a phenotype of interest is not measured for some subgroup. The trend test (or Pearson's test) and the F-test are often used, respectively, to analyze the binary and quantitative traits. Because of the outcome-dependent sampling, the usual F-test can be applied using the subgroup with the observed quantitative traits. We propose a modified F-test that also incorporates the genotype frequencies of the subgroup whose traits are not observed. Further, a combination of this modified F-test and Pearson's test is proposed via Fisher's combination of their P-values as a joint analysis. Because of the correlation of the two analyses, we propose to use a Gamma (scaled chi-squared) distribution to fit the asymptotic null distribution for the joint analysis. The proposed modified F-test and the joint analysis can also be applied to test single-trait association (either binary or quantitative trait). Through simulations, we identify the situations under which the proposed tests are more powerful than the existing ones. Application to a real dataset of rheumatoid arthritis is presented. © 2012 Wiley Periodicals, Inc.
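
    The Gamma approximation for the correlated combination can be sketched generically: compute Fisher's statistic from the two p-values, then calibrate a Gamma null by moment matching to draws of the statistic simulated under the null. The simulation step is left abstract here and the function names are illustrative, not the authors' code.

```python
import numpy as np
from scipy.stats import gamma

def gamma_fisher_pvalue(p_binary, p_quant, null_draws):
    """Fisher's combination X = -2(ln p1 + ln p2) referred to a moment-matched
    Gamma null, which absorbs the correlation between the two tests.
    null_draws: simulated values of X under the joint null hypothesis."""
    x = -2.0 * (np.log(p_binary) + np.log(p_quant))
    m, v = null_draws.mean(), null_draws.var()
    shape, scale = m * m / v, v / m      # Gamma: mean = k*theta, var = k*theta^2
    return gamma.sf(x, a=shape, scale=scale)
```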

  9. On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood.

    PubMed

    Karabatsos, George

    2018-06-01

    This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon previous methods because it provides an omnibus test of the entire hierarchy of cancellation axioms, beyond double cancellation. It does so while accounting for the posterior uncertainty that is inherent in the empirical orderings that are implied by these axioms, together. The new method is illustrated through a test of the cancellation axioms on a classic survey data set, and through the analysis of simulated data.
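
    The synthetic-likelihood device itself is easy to state: simulate many datasets at a candidate parameter, reduce each to summary statistics, fit a Gaussian to the simulated summaries, and score the observed summaries under that Gaussian. A generic sketch in the style of Wood's synthetic likelihood, where simulate and summarize are user-supplied placeholders and summarize returns a 1-D statistic vector:

```python
import numpy as np
from scipy.stats import multivariate_normal

def synthetic_loglik(theta, obs_stats, simulate, summarize, n_sim=500, seed=None):
    """Synthetic log-likelihood: Gaussian density of the observed summary
    statistics under moments estimated from model simulations at theta."""
    rng = np.random.default_rng(seed)
    sims = np.array([summarize(simulate(theta, rng)) for _ in range(n_sim)])
    mu = sims.mean(axis=0)
    cov = np.cov(sims, rowvar=False) + 1e-8 * np.eye(sims.shape[1])  # ridge for stability
    return multivariate_normal.logpdf(obs_stats, mean=mu, cov=cov)
```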

  10. The Dud-Alternative Effect in Likelihood Judgment

    ERIC Educational Resources Information Center

    Windschitl, Paul D.; Chambers, John R.

    2004-01-01

    The judged likelihood of a focal outcome should generally decrease as the list of alternative possibilities increases. For example, the likelihood that a runner will win a race goes down when 2 new entries are added to the field. However, 6 experiments demonstrate that the presence of implausible alternatives (duds) often increases the judged…

  11. Likelihood Ratio Tests for Special Rasch Models

    ERIC Educational Resources Information Center

    Hessen, David J.

    2010-01-01

    In this article, a general class of special Rasch models for dichotomous item scores is considered. Although Andersen's likelihood ratio test can be used to test whether a Rasch model fits to the data, the test does not differentiate between special Rasch models. Therefore, in this article, new likelihood ratio tests are proposed for testing…

  12. Joint sparsity based heterogeneous data-level fusion for target detection and estimation

    NASA Astrophysics Data System (ADS)

    Niu, Ruixin; Zulch, Peter; Distasio, Marcello; Blasch, Erik; Shen, Dan; Chen, Genshe

    2017-05-01

    Typical surveillance systems employ decision- or feature-level fusion approaches to integrate heterogeneous sensor data, which are sub-optimal and incur information loss. In this paper, we investigate data-level heterogeneous sensor fusion. Since the sensors monitor the common targets of interest, whose states can be determined by only a few parameters, it is reasonable to assume that the measurement domain has a low intrinsic dimensionality. For heterogeneous sensor data, we develop a joint-sparse data-level fusion (JSDLF) approach based on the emerging joint sparse signal recovery techniques by discretizing the target state space. This approach is applied to fuse signals from multiple distributed radio frequency (RF) signal sensors and a video camera for joint target detection and state estimation. The JSDLF approach is data-driven and requires minimum prior information, since there is no need to know the time-varying RF signal amplitudes, or the image intensity of the targets. It can handle non-linearity in the sensor data due to state space discretization and the use of frequency/pixel selection matrices. Furthermore, for a multi-target case with J targets, the JSDLF approach only requires discretization in a single-target state space, instead of discretization in a J-target state space, as in the case of the generalized likelihood ratio test (GLRT) or the maximum likelihood estimator (MLE). Numerical examples are provided to demonstrate that the proposed JSDLF approach achieves excellent performance with near real-time accurate target position and velocity estimates.

  13. Sustainability likelihood of remediation options for metal-contaminated soil/sediment.

    PubMed

    Chen, Season S; Taylor, Jessica S; Baek, Kitae; Khan, Eakalak; Tsang, Daniel C W; Ok, Yong Sik

    2017-05-01

    Multi-criteria analysis and detailed impact analysis were carried out to assess the sustainability of four remedial alternatives for metal-contaminated soil/sediment at former timber treatment sites and a harbour sediment site of different scales. Sustainability was evaluated in the aspects of human health and safety, environment, stakeholder concern, and land use, under four different scenarios with varying weighting factors. Monte Carlo simulation was performed to reveal the likelihood of accomplishing sustainable remediation with different treatment options at different sites. The results showed that in-situ remedial technologies were more sustainable than ex-situ ones; in-situ containment demonstrated both the most sustainable result and the highest probability of achieving sustainability amongst the four remedial alternatives in this study, reflecting the lesser extent of off-site and on-site impacts. Concerns associated with ex-situ options were adverse impacts tied to all four aspects and caused by excavation, extraction, and off-site disposal. The results of this study suggest the importance of considering the uncertainties resulting from the remedial options (i.e., stochastic analysis) in addition to the overall sustainability scores (i.e., deterministic analysis). The developed framework and model simulation could serve as an assessment of the sustainability likelihood of remedial options to ensure sustainable remediation of contaminated sites. Copyright © 2017 Elsevier Ltd. All rights reserved.
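
    The stochastic part of such an assessment can be miniaturized as follows: draw uncertain aspect weights, score each remedial option, and estimate the probability that one option dominates. All numbers below are invented for illustration; they are not the study's scores.

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows: remedial options; columns: the four aspects (health/safety, environment,
# stakeholder concern, land use). Scores in [0, 1] are purely hypothetical.
scores = np.array([[0.8, 0.7, 0.6, 0.9],    # in-situ containment
                   [0.5, 0.4, 0.6, 0.5]])   # ex-situ disposal
n_draws = 10_000
weights = rng.dirichlet(np.ones(4), size=n_draws)  # uncertain weighting scenarios
overall = weights @ scores.T                       # (n_draws, n_options)
p_in_situ_better = (overall[:, 0] > overall[:, 1]).mean()
```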

  14. Population Synthesis of Radio and Gamma-ray Pulsars using the Maximum Likelihood Approach

    NASA Astrophysics Data System (ADS)

    Billman, Caleb; Gonthier, P. L.; Harding, A. K.

    2012-01-01

    We present the results of a pulsar population synthesis of normal pulsars from the Galactic disk using a maximum likelihood method. We seek to maximize the likelihood of a set of parameters in a Monte Carlo population statistics code to better understand their uncertainties and the confidence region of the model's parameter space. The maximum likelihood method allows for the use of more applicable Poisson statistics in the comparison of distributions of small numbers of detected gamma-ray and radio pulsars. Our code simulates pulsars at birth using Monte Carlo techniques and evolves them to the present assuming initial spatial, kick velocity, magnetic field, and period distributions. Pulsars are spun down to the present and given radio and gamma-ray emission characteristics. We select measured distributions of radio pulsars from the Parkes Multibeam survey and Fermi gamma-ray pulsars to perform a likelihood analysis of the assumed model parameters such as initial period and magnetic field, and radio luminosity. We present the results of a grid search of the parameter space as well as a search for the maximum likelihood using a Markov Chain Monte Carlo method. We express our gratitude for the generous support of the Michigan Space Grant Consortium, of the National Science Foundation (REU and RUI), the NASA Astrophysics Theory and Fundamental Program and the NASA Fermi Guest Investigator Program.
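
    The Poisson comparison at the heart of the method is compact: with model-predicted expected counts per bin and small observed counts of detected pulsars, the log-likelihood is summed over bins. A minimal helper, with the binning scheme left to the user:

```python
import numpy as np
from scipy.special import gammaln

def poisson_loglik(observed, expected):
    """Log-likelihood of observed per-bin pulsar counts given model-predicted
    expected counts; appropriate when counts per bin are small."""
    lam = np.clip(np.asarray(expected, dtype=float), 1e-12, None)
    k = np.asarray(observed, dtype=float)
    return float(np.sum(k * np.log(lam) - lam - gammaln(k + 1.0)))
```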

  15. Two-phase designs for joint quantitative-trait-dependent and genotype-dependent sampling in post-GWAS regional sequencing.

    PubMed

    Espin-Garcia, Osvaldo; Craiu, Radu V; Bull, Shelley B

    2018-02-01

    We evaluate two-phase designs to follow-up findings from genome-wide association study (GWAS) when the cost of regional sequencing in the entire cohort is prohibitive. We develop novel expectation-maximization-based inference under a semiparametric maximum likelihood formulation tailored for post-GWAS inference. A GWAS-SNP (where SNP is single nucleotide polymorphism) serves as a surrogate covariate in inferring association between a sequence variant and a normally distributed quantitative trait (QT). We assess test validity and quantify efficiency and power of joint QT-SNP-dependent sampling and analysis under alternative sample allocations by simulations. Joint allocation balanced on SNP genotype and extreme-QT strata yields significant power improvements compared to marginal QT- or SNP-based allocations. We illustrate the proposed method and evaluate the sensitivity of sample allocation to sampling variation using data from a sequencing study of systolic blood pressure. © 2017 The Authors. Genetic Epidemiology Published by Wiley Periodicals, Inc.

  16. Electromigration analysis of solder joints under ac load: A mean time to failure model

    NASA Astrophysics Data System (ADS)

    Yao, Wei; Basaran, Cemal

    2012-03-01

    In this study, alternating current (ac) electromigration (EM) degradation simulations were carried out for Sn-95.5% Ag-4.0% Cu-0.5% (SAC405, by weight) solder joints. Mass transport analysis was conducted with viscoplastic material properties to quantify the damage mechanism in solder joints. Square, sine, and triangle ac current wave forms were used as input signals, and dc and pulsed dc (PDC) electromigration analyses were conducted for comparison purposes. The maximum current density ranged from 2.2×10^6 A/cm^2 to 5.0×10^6 A/cm^2, the frequency ranged from 0.05 Hz to 5 Hz, and the ambient temperature varied from 350 K to 450 K. Because room temperature is nearly two-thirds of the SAC solder joint's melting point on the absolute temperature scale (494.15 K), a viscoplastic material model is essential. An entropy-based damage evolution model was used to investigate the mean time to failure (MTF) behavior of solder joints subjected to ac stressing. It was observed that MTF scales as T^-1.1 (ambient temperature in Celsius) and as j^-0.27 (current density in A/cm^2). Higher frequency leads to a shorter lifetime within the frequency range studied, with the proposed relationship MTF ∝ f^-0.41. The lifetime of a solder joint subjected to ac is longer than under dc and PDC loading conditions. By introducing frequency, ambient temperature, and current density dependency terms, a modified MTF equation was proposed for solder joints subjected to ac current stressing.
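
    Collecting the reported exponents into one expression gives a convenient screening formula. The prefactor is not given by the abstract and would have to be fit to data, so the function below is only the scaling skeleton.

```python
def mtf_scaling(j, T, f, c=1.0):
    """Mean time to failure implied by the reported power laws,
    MTF ∝ T^-1.1 * j^-0.27 * f^-0.41, with T the ambient temperature in
    Celsius, j the current density in A/cm^2, f the ac frequency in Hz,
    and c an unknown material/geometry prefactor (hypothetical here)."""
    return c * T**-1.1 * j**-0.27 * f**-0.41

# Relative comparison, where the prefactor cancels: halving the frequency
# at fixed j and T lengthens the predicted lifetime by a factor of 2^0.41.
ratio = mtf_scaling(3e6, 100.0, 0.5) / mtf_scaling(3e6, 100.0, 1.0)
```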

  17. Vector quantizer designs for joint compression and terrain categorization of multispectral imagery

    NASA Technical Reports Server (NTRS)

    Gorman, John D.; Lyons, Daniel F.

    1994-01-01

    Two vector quantizer designs for compression of multispectral imagery and their impact on terrain categorization performance are evaluated. The mean-squared error (MSE) and classification performance of the two quantizers are compared, and it is shown that a simple two-stage design minimizing MSE subject to a constraint on classification performance has significantly better classification performance than a standard MSE-based tree-structured vector quantizer followed by maximum likelihood classification. This improvement in classification performance is obtained with minimal loss in MSE performance. The results show that it is advantageous to tailor compression algorithm designs to the required data exploitation tasks. Applications of joint compression/classification include compression for the archiving or transmission of Landsat imagery that is later used for land utility surveys and/or radiometric analysis.

  18. Global-Local Finite Element Analysis for Thermo-Mechanical Stresses in Bonded Joints

    NASA Technical Reports Server (NTRS)

    Shkarayev, S.; Madenci, Erdogan; Camarda, C. J.

    1997-01-01

    An analysis of adhesively bonded joints using conventional finite elements does not capture the singular behavior of the stress field in regions where two or three dissimilar materials form a junction with or without free edges. However, these regions are characteristic of the bonded joints and are prone to failure initiation. This study presents a method to capture the singular stress field arising from the geometric and material discontinuities in bonded composites. It is achieved by coupling the local (conventional) elements with global (special) elements whose interpolation functions are constructed from the asymptotic solution.

  19. Analysis of EDZ Development of Columnar Jointed Rock Mass in the Baihetan Diversion Tunnel

    NASA Astrophysics Data System (ADS)

    Hao, Xian-Jie; Feng, Xia-Ting; Yang, Cheng-Xiang; Jiang, Quan; Li, Shao-Jun

    2016-04-01

    Due to the time dependency of crack propagation, columnar jointed rock masses exhibit marked time-dependent behaviour. In this study, in situ measurements, scanning electron microscopy (SEM), a back-analysis method, and numerical simulations are used to study the time-dependent development of the excavation damaged zone (EDZ) around underground diversion tunnels in a columnar jointed rock mass. In situ measurements of crack propagation and EDZ development show that their extent increased over time, despite the fact that the advancing face had passed. Similar to creep behaviour, the time-dependent EDZ development curve also consists of three stages: a deceleration stage, a stabilization stage, and an acceleration stage. A corresponding constitutive model of columnar jointed rock mass considering time-dependent behaviour is proposed. The time-dependent degradation coefficients of the roughness coefficient and residual friction angle in the Barton-Bandis strength criterion are taken into account. An intelligent back-analysis method is adopted to obtain the unknown time-dependent degradation coefficients for the proposed constitutive model. The numerical modelling results are in good agreement with the measured EDZ. Moreover, the failure pattern simulated by this time-dependent constitutive model is consistent with SEM and in situ observations, indicating that the model can accurately simulate the failure pattern and time-dependent EDZ development of columnar joints. The effects of the support system provided and the in situ stress on the time-dependent coefficients are also studied. Finally, a long-term stability analysis of diversion tunnels excavated in columnar jointed rock masses is performed.

  20. Joint Symbol Timing and CFO Estimation for OFDM/OQAM Systems in Multipath Channels

    NASA Astrophysics Data System (ADS)

    Fusco, Tilde; Petrella, Angelo; Tanda, Mario

    2009-12-01

    The problem of data-aided synchronization for orthogonal frequency division multiplexing (OFDM) systems based on offset quadrature amplitude modulation (OQAM) in multipath channels is considered. In particular, the joint maximum-likelihood (ML) estimator for carrier-frequency offset (CFO), amplitudes, phases, and delays, exploiting a short known preamble, is derived. The ML estimators for phases and amplitudes are in closed form. Moreover, under the assumption that the CFO is sufficiently small, a closed form approximate ML (AML) CFO estimator is obtained. By exploiting the obtained closed form solutions a cost function whose peaks provide an estimate of the delays is derived. In particular, the symbol timing (i.e., the delay of the first multipath component) is obtained by considering the smallest estimated delay. The performance of the proposed joint AML estimator is assessed via computer simulations and compared with that achieved by the joint AML estimator designed for AWGN channel and that achieved by a previously derived joint estimator for OFDM systems.

  1. Fractal Analysis of Rock Joint Profiles

    NASA Astrophysics Data System (ADS)

    Audy, Ondřej; Ficker, Tomáš

    2017-10-01

    Surface reliefs of rock joints are analyzed in geotechnics when shear strength of rocky slopes is estimated. The rock joint profiles actually are self-affine fractal curves and computations of their fractal dimensions require special methods. Many papers devoted to the fractal properties of these profiles were published in the past but only a few of those papers employed a convenient computational method that would have guaranteed a sound value of that dimension. As a consequence, anomalously low dimensions were presented. This contribution deals with two computational modifications that lead to sound fractal dimensions of the self-affine rock joint profiles. These are the modified box-counting method and the modified yard-stick method sometimes called the compass method. Both these methods are frequently applied to self-similar fractal curves but the self-affine profile curves due to their self-affine nature require modified computational procedures implemented in computer programs.
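
    One simple estimator suited to self-affine profiles, offered here as a generic example rather than either of the authors' two modified methods, is the roughness-length construction: regress the RMS of detrended window residuals on window size to get the Hurst exponent H, from which the profile dimension is D = 2 − H. Windows should be large enough (say, 8 points or more) that the detrended residual is not trivially zero.

```python
import numpy as np

def hurst_roughness_length(z, window_sizes):
    """Hurst exponent of a 1-D self-affine profile z via the roughness-length
    method; the fractal dimension of the profile is D = 2 - H."""
    sizes, rough = [], []
    for w in window_sizes:
        rms = []
        for start in range(0, len(z) - w + 1, w):
            seg = z[start:start + w]
            x = np.arange(w)
            trend = np.polyval(np.polyfit(x, seg, 1), x)  # detrend each window
            rms.append(np.std(seg - trend))
        sizes.append(w)
        rough.append(np.mean(rms))
    H, _ = np.polyfit(np.log(sizes), np.log(rough), 1)
    return H, 2.0 - H

# Usage on a digitized profile z: H, D = hurst_roughness_length(z, [8, 16, 32, 64])
```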

  2. Joint independent component analysis for simultaneous EEG-fMRI: principle and simulation.

    PubMed

    Moosmann, Matthias; Eichele, Tom; Nordby, Helge; Hugdahl, Kenneth; Calhoun, Vince D

    2008-03-01

    An optimized scheme for the fusion of electroencephalography and event related potentials with functional magnetic resonance imaging (BOLD-fMRI) data should simultaneously assess all available electrophysiologic and hemodynamic information in a common data space. In doing so, it should be possible to identify features of latent neural sources whose trial-to-trial dynamics are jointly reflected in both modalities. We present a joint independent component analysis (jICA) model for analysis of simultaneous single trial EEG-fMRI measurements from multiple subjects. We outline the general idea underlying the jICA approach and present results from simulated data under realistic noise conditions. Our results indicate that this approach is a feasible and physiologically plausible data-driven way to achieve spatiotemporal mapping of event related responses in the human brain.
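
    The core of jICA is modality concatenation: per-trial EEG and fMRI features are stacked into one row per trial, so that ICA is forced to find components with a shared trial-to-trial loading and a joint EEG-fMRI signature. A toy sketch with random data and hypothetical dimensions, using scikit-learn's FastICA as a stand-in for the authors' ICA implementation:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_trials, n_eeg, n_fmri = 120, 64, 500          # hypothetical feature dimensions
eeg = rng.standard_normal((n_trials, n_eeg))    # trial x EEG features (e.g. ERP samples)
fmri = rng.standard_normal((n_trials, n_fmri))  # trial x fMRI features (e.g. voxel betas)

joint = np.hstack([eeg, fmri])                  # one fused row per trial
ica = FastICA(n_components=10, max_iter=500, random_state=0)
loadings = ica.fit_transform(joint)             # shared trial-to-trial dynamics
sources = ica.components_                       # joint component signatures
eeg_maps, fmri_maps = sources[:, :n_eeg], sources[:, n_eeg:]
```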

  3. Generalizing Terwilliger's likelihood approach: a new score statistic to test for genetic association.

    PubMed

    el Galta, Rachid; Uitte de Willige, Shirley; de Visser, Marieke C H; Helmer, Quinta; Hsu, Li; Houwing-Duistermaat, Jeanine J

    2007-09-24

    In this paper, we propose a one-degree-of-freedom test for association between a candidate gene and a binary trait. This method is a generalization of Terwilliger's likelihood ratio statistic and is especially powerful for the situation of one associated haplotype. As an alternative to the likelihood ratio statistic, we derive a score statistic, which has a tractable expression. For haplotype analysis, we assume that phase is known. By means of a simulation study, we compare the performance of the score statistic to Pearson's chi-square statistic and the likelihood ratio statistic proposed by Terwilliger. We illustrate the method on three candidate genes studied in the Leiden Thrombophilia Study. We conclude that the statistic follows a chi-square distribution under the null hypothesis and that the score statistic is more powerful than Terwilliger's likelihood ratio statistic when the associated haplotype has a frequency between 0.1 and 0.4 and has a small impact on the studied disorder. With regard to Pearson's chi-square statistic, the score statistic has more power when the associated haplotype has a frequency above 0.2 and the number of variants is above five.

  4. Mechanical performance and parameter sensitivity analysis of 3D braided composites joints.

    PubMed

    Wu, Yue; Nan, Bo; Chen, Liang

    2014-01-01

    3D braided composite joints are important components in CFRP trusses, with significant influence on the reliability and light weight of structures. To investigate the mechanical performance of 3D braided composite joints, a numerical method based on microscopic mechanics is put forward; the modeling technologies, including material constants selection, element type, grid size, and boundary conditions, are discussed in detail. Secondly, a method for determining the ultimate bearing capacity is established, which accounts for strength failure. Finally, the effects of load parameters, geometric parameters, and process parameters on the ultimate bearing capacity of the joints are analyzed by a global sensitivity analysis method. The results show that the main pipe diameter-to-thickness ratio γ, the main pipe diameter D, and the braiding angle α are the parameters to which the ultimate bearing capacity N is most sensitive.

  5. Multiple robustness in factorized likelihood models.

    PubMed

    Molina, J; Rotnitzky, A; Sued, M; Robins, J M

    2017-09-01

    We consider inference under a nonparametric or semiparametric model with likelihood that factorizes as the product of two or more variation-independent factors. We are interested in a finite-dimensional parameter that depends on only one of the likelihood factors and whose estimation requires the auxiliary estimation of one or several nuisance functions. We investigate general structures conducive to the construction of so-called multiply robust estimating functions, whose computation requires postulating several dimension-reducing models but which have mean zero at the true parameter value provided one of these models is correct.

  6. A maximum likelihood map of chromosome 1.

    PubMed Central

    Rao, D C; Keats, B J; Lalouel, J M; Morton, N E; Yee, S

    1979-01-01

    Thirteen loci are mapped on chromosome 1 from genetic evidence. The maximum likelihood map presented permits confirmation that Scianna (SC) and a fourteenth locus, phenylketonuria (PKU), are on chromosome 1, although the location of the latter on the PGM1-AMY segment is uncertain. Eight other controversial genetic assignments are rejected, providing a practical demonstration of the resolution which maximum likelihood theory brings to mapping. PMID:293128

  7. Event-related fMRI studies of false memory: An Activation Likelihood Estimation meta-analysis.

    PubMed

    Kurkela, Kyle A; Dennis, Nancy A

    2016-01-29

    Over the last two decades, a wealth of research in the domain of episodic memory has focused on understanding the neural correlates mediating false memories, or memories for events that never happened. While several recent qualitative reviews have attempted to synthesize this literature, methodological differences amongst the empirical studies and a focus on only a sub-set of the findings has limited broader conclusions regarding the neural mechanisms underlying false memories. The current study performed a voxel-wise quantitative meta-analysis using activation likelihood estimation to investigate commonalities within the functional magnetic resonance imaging (fMRI) literature studying false memory. The results were broken down by memory phase (encoding, retrieval), as well as sub-analyses looking at differences in baseline (hit, correct rejection), memoranda (verbal, semantic), and experimental paradigm (e.g., semantic relatedness and perceptual relatedness) within retrieval. Concordance maps identified significant overlap across studies for each analysis. Several regions were identified in the general false retrieval analysis as well as multiple sub-analyses, indicating their ubiquitous, yet critical role in false retrieval (medial superior frontal gyrus, left precentral gyrus, left inferior parietal cortex). Additionally, several regions showed baseline- and paradigm-specific effects (hit/perceptual relatedness: inferior and middle occipital gyrus; CRs: bilateral inferior parietal cortex, precuneus, left caudate). With respect to encoding, analyses showed common activity in the left middle temporal gyrus and anterior cingulate cortex. No analysis identified a common cluster of activation in the medial temporal lobe. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Analysis of Commercially Available Firefighting Helmet and Boot Options for the Joint Firefighter Integrated Response Ensemble (JFIRE)

    DTIC Science & Technology

    2012-02-01

    AFRL-RX-TY-TR-2012-0022. Interim Technical Report, 01-SEP-2010 to 31-JAN-2011: Analysis of Commercially Available Firefighting Helmet and Boot Options for the Joint Firefighter Integrated Response Ensemble (JFIRE). A requirements correlation matrix was generated and sent to industry detailing objective and threshold measurements for both the helmet...

  9. Statistical inferences with jointly type-II censored samples from two Pareto distributions

    NASA Astrophysics Data System (ADS)

    Abu-Zinadah, Hanaa H.

    2017-08-01

    In several industries the product comes from more than one production line, which makes comparative life tests necessary. This problem requires sampling from the different production lines, from which the joint censoring scheme arises. In this article we consider the lifetime Pareto distribution under a jointly type-II censoring scheme. The maximum likelihood estimators (MLE) and the corresponding approximate confidence intervals, as well as the bootstrap confidence intervals of the model parameters, are obtained. Bayesian point and credible intervals of the model parameters are also presented. A lifetime data set is analyzed for illustrative purposes, and Monte Carlo results from simulation studies are presented to assess the performance of the proposed method.
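
    For the complete-sample case, a simpler setting than the paper's joint type-II censoring scheme, the Pareto shape MLE and a percentile bootstrap interval take only a few lines, which conveys the flavor of the MLE/bootstrap comparison:

```python
import numpy as np

def pareto_shape_mle(x, x_min):
    """MLE of the Pareto shape alpha for a complete sample with known scale."""
    x = np.asarray(x)
    return len(x) / np.sum(np.log(x / x_min))

rng = np.random.default_rng(1)
x_min, alpha = 1.0, 2.5
x = x_min * (1.0 - rng.random(200)) ** (-1.0 / alpha)   # inverse-CDF Pareto draws
alpha_hat = pareto_shape_mle(x, x_min)
boot = [pareto_shape_mle(rng.choice(x, size=len(x), replace=True), x_min)
        for _ in range(2000)]
ci_95 = np.percentile(boot, [2.5, 97.5])                # percentile bootstrap interval
```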

  10. Maximal likelihood correspondence estimation for face recognition across pose.

    PubMed

    Li, Shaoxin; Liu, Xin; Chai, Xiujuan; Zhang, Haihong; Lao, Shihong; Shan, Shiguang

    2014-10-01

    Due to the misalignment of image features, the performance of many conventional face recognition methods degrades considerably in the across-pose scenario. To address this problem, many image matching-based methods have been proposed to estimate semantic correspondence between faces in different poses. In this paper, we aim to solve two critical problems in previous image matching-based correspondence learning methods: 1) failure to fully exploit face-specific structure information in correspondence estimation and 2) failure to learn personalized correspondence for each probe image. To this end, we first build a model, termed morphable displacement field (MDF), to encode face-specific structure information of semantic correspondence from a set of real samples of correspondences calculated from 3D face models. Then, we propose a maximal likelihood correspondence estimation (MLCE) method to learn personalized correspondence based on a maximal-likelihood frontal face assumption. After obtaining the semantic correspondence encoded in the learned displacement, we can synthesize virtual frontal images of the profile faces for subsequent recognition. Using the linear discriminant analysis method with pixel-intensity features, state-of-the-art performance is achieved on three multipose benchmarks, i.e., the CMU-PIE, FERET, and MultiPIE databases. Owing to the rational MDF regularization and the use of a novel maximal likelihood objective, the proposed MLCE method can reliably learn correspondence between faces in different poses even in complex wild environments, i.e., the Labeled Faces in the Wild database.

  11. Inferior or double joint spaces injection versus superior joint space injection for temporomandibular disorders: a systematic review and meta-analysis.

    PubMed

    Li, Chunjie; Zhang, Yifan; Lv, Jun; Shi, Zongdao

    2012-01-01

    To compare the effect and safety of inferior or double temporomandibular joint spaces drug injection versus superior temporomandibular joint space injection in the treatment of temporomandibular disorders. MEDLINE (via Ovid, 1948 to March 2011), CENTRAL (Issue 1, 2011), Embase (1984 to March 2011), CBM (1978 to March 2011), and the World Health Organization International Clinical Trials Registry Platform were searched electronically; relevant journals as well as references of included studies were hand-searched for randomized controlled trials comparing the effect or safety of inferior or double joint spaces drug injection techniques with those of the superior space injection technique. Risk of bias assessment with the tool recommended by the Cochrane Collaboration, reporting quality assessment with CONSORT, and data extraction were carried out independently by 2 reviewers. Meta-analysis was performed with RevMan 5.0.23. Four trials with 349 participants were included. All the included studies had moderate risk of bias. Meta-analysis showed that the inferior or double spaces injection technique could significantly increase maximal mouth opening by an additional 2.88 mm (P = .0001) and alleviate pain intensity in the temporomandibular area by an average of 9.01 mm more on the visual analog scale (P = .0001) compared with the superior space injection technique, but could not markedly change the synthesized clinical index (P = .05) in the short term; nevertheless, it showed greater benefit in maximal mouth opening (P = .002), pain relief (P < .0001), and the synthesized clinical variable (P < .0001) in the long term than superior space injection. No serious adverse events were reported. The inferior or double temporomandibular joint spaces drug injection technique shows a better effect than the superior space injection technique, and its safety is affirmed. However, more high-quality studies are still needed to test and verify the evidence. Crown Copyright © 2012. Published by Elsevier Inc. All rights reserved.

  12. Algorithms of maximum likelihood data clustering with applications

    NASA Astrophysics Data System (ADS)

    Giada, Lorenzo; Marsili, Matteo

    2002-12-01

    We address the problem of data clustering by introducing an unsupervised, parameter-free approach based on the maximum likelihood principle. Starting from the observation that data sets belonging to the same cluster share common information, we construct an expression for the likelihood of any possible cluster structure. The likelihood in turn depends only on the Pearson correlation coefficients of the data. We discuss clustering algorithms that provide a fast and reliable approximation to maximum likelihood configurations. Compared to standard clustering methods, our approach has the advantages that (i) it is parameter free, (ii) the number of clusters need not be fixed in advance, and (iii) the interpretation of the results is transparent. In order to test our approach and compare it with standard clustering algorithms, we analyze two very different data sets: time series of financial market returns and gene expression data. We find that different maximization algorithms produce similar cluster structures, whereas the outcome of standard algorithms has a much wider variability.
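
    The likelihood the method maximizes depends only on cluster sizes and internal correlation sums. My reading of the paper's expression, which is worth verifying against the original, is sketched below; any search heuristic, e.g. greedy merging, can then be used to maximize it over partitions.

```python
import numpy as np

def cluster_log_likelihood(C, labels):
    """Cluster log-likelihood from a Pearson correlation matrix C and integer
    cluster labels, per my reading of the Giada-Marsili expression: a cluster s
    with n_s > 1 members and internal correlation sum c_s = sum_{i,j in s} C_ij
    contributes 0.5*[ln(n_s/c_s) + (n_s-1)*ln((n_s^2-n_s)/(n_s^2-c_s))]."""
    total = 0.0
    for s in np.unique(labels):
        idx = np.flatnonzero(labels == s)
        n_s = len(idx)
        c_s = C[np.ix_(idx, idx)].sum()
        if n_s > 1 and 0.0 < c_s < n_s**2:   # singletons contribute zero
            total += 0.5 * (np.log(n_s / c_s)
                            + (n_s - 1) * np.log((n_s**2 - n_s) / (n_s**2 - c_s)))
    return total
```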

  13. Likelihood-based modification of experimental crystal structure electron density maps

    DOEpatents

    Terwilliger, Thomas C [Santa Fe, NM]

    2005-04-16

    A maximum-likelihood method improves an electron density map of an experimental crystal structure. A likelihood of a set of structure factors {F_h} is formed for the experimental crystal structure as (1) the likelihood of having obtained an observed set of structure factors {F_h^OBS} if the structure factor set {F_h} was correct, and (2) the likelihood that an electron density map resulting from {F_h} is consistent with selected prior knowledge about the experimental crystal structure. The set of structure factors {F_h} is then adjusted to maximize its likelihood for the experimental crystal structure. An improved electron density map is constructed with the maximized structure factors.

  14. Effect of radiance-to-reflectance transformation and atmosphere removal on maximum likelihood classification accuracy of high-dimensional remote sensing data

    NASA Technical Reports Server (NTRS)

    Hoffbeck, Joseph P.; Landgrebe, David A.

    1994-01-01

    Many analysis algorithms for high-dimensional remote sensing data require that the remotely sensed radiance spectra be transformed to approximate reflectance to allow comparison with a library of laboratory reflectance spectra. In maximum likelihood classification, however, the remotely sensed spectra are compared to training samples, thus a transformation to reflectance may or may not be helpful. The effect of several radiance-to-reflectance transformations on maximum likelihood classification accuracy is investigated in this paper. We show that the empirical line approach, LOWTRAN7, flat-field correction, single spectrum method, and internal average reflectance are all non-singular affine transformations, and that non-singular affine transformations have no effect on discriminant analysis feature extraction and maximum likelihood classification accuracy. (An affine transformation is a linear transformation with an optional offset.) Since the Atmosphere Removal Program (ATREM) and the log residue method are not affine transformations, experiments with Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data were conducted to determine the effect of these transformations on maximum likelihood classification accuracy. The average classification accuracy of the data transformed by ATREM and the log residue method was slightly less than the accuracy of the original radiance data. Since the radiance-to-reflectance transformations allow direct comparison of remotely sensed spectra with laboratory reflectance spectra, they can be quite useful in labeling the training samples required by maximum likelihood classification, but these transformations have only a slight effect or no effect at all on discriminant analysis and maximum likelihood classification accuracy.
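
    The affine-invariance claim is easy to verify numerically: train Gaussian maximum-likelihood classifiers on original and affinely transformed data and confirm the labels coincide. A self-contained check with synthetic "bands"; all data here are hypothetical, not AVIRIS spectra.

```python
import numpy as np

rng = np.random.default_rng(0)
X0 = rng.multivariate_normal(np.zeros(5), np.eye(5), 200)        # class 0 training
X1 = rng.multivariate_normal(np.ones(5), 1.5 * np.eye(5), 200)   # class 1 training

def ml_labels(x, classes):
    """Gaussian maximum-likelihood labels from per-class training samples."""
    scores = []
    for D in classes:
        mu, cov = D.mean(axis=0), np.cov(D, rowvar=False)
        d = x - mu
        maha = np.einsum('ij,jk,ik->i', d, np.linalg.inv(cov), d)
        scores.append(-0.5 * (np.log(np.linalg.det(cov)) + maha))
    return np.argmax(scores, axis=0)

A = rng.standard_normal((5, 5))      # almost surely nonsingular
b = rng.standard_normal(5)           # affine map y = x A^T + b
x = rng.standard_normal((50, 5))
same = np.array_equal(ml_labels(x, [X0, X1]),
                      ml_labels(x @ A.T + b, [X0 @ A.T + b, X1 @ A.T + b]))
# same is expected to be True: the transform shifts every class score equally.
```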

  15. Joint Procrustes Analysis for Simultaneous Nonsingular Transformation of Component Score and Loading Matrices

    ERIC Educational Resources Information Center

    Adachi, Kohei

    2009-01-01

    In component analysis solutions, post-multiplying a component score matrix by a nonsingular matrix can be compensated by applying its inverse to the corresponding loading matrix. To eliminate this indeterminacy on nonsingular transformation, we propose Joint Procrustes Analysis (JPA) in which component score and loading matrices are simultaneously…

  16. The double universal joint wrist on a manipulator: Solution of inverse position kinematics and singularity analysis

    NASA Technical Reports Server (NTRS)

    Williams, Robert L., III

    1992-01-01

    This paper presents three methods to solve the inverse position kinematics position problem of the double universal joint attached to a manipulator: (1) an analytical solution for two specific cases; (2) an approximate closed form solution based on ignoring the wrist offset; and (3) an iterative method which repeats closed form position and orientation calculations until the solution is achieved. Several manipulators are used to demonstrate the solution methods: cartesian, cylindrical, spherical, and an anthropomorphic articulated arm, based on the Flight Telerobotic Servicer (FTS) arm. A singularity analysis is presented for the double universal joint wrist attached to the above manipulator arms. While the double universal joint wrist standing alone is singularity-free in orientation, the singularity analysis indicates the presence of coupled position/orientation singularities of the spherical and articulated manipulators with the wrist. The cartesian and cylindrical manipulators with the double universal joint wrist were found to be singularity-free. The methods of this paper can be implemented in a real-time controller for manipulators with the double universal joint wrist. Such mechanically dextrous systems could be used in telerobotic and industrial applications, but further work is required to avoid the singularities.

  17. Hypnosis and pain perception: An Activation Likelihood Estimation (ALE) meta-analysis of functional neuroimaging studies.

    PubMed

    Del Casale, Antonio; Ferracuti, Stefano; Rapinesi, Chiara; De Rossi, Pietro; Angeletti, Gloria; Sani, Gabriele; Kotzalidis, Georgios D; Girardi, Paolo

    2015-12-01

    Several studies reported that hypnosis can modulate pain perception and tolerance by affecting cortical and subcortical activity in brain regions involved in these processes. We conducted an Activation Likelihood Estimation (ALE) meta-analysis on functional neuroimaging studies of pain perception under hypnosis to identify brain activation-deactivation patterns occurring during hypnotic suggestions aiming at pain reduction, including hypnotic analgesic, pleasant, or depersonalization suggestions (HASs). We searched the PubMed, Embase and PsycInfo databases; we included papers published in peer-reviewed journals dealing with functional neuroimaging and hypnosis-modulated pain perception. The ALE meta-analysis encompassed data from 75 healthy volunteers reported in 8 functional neuroimaging studies. HASs during experimentally-induced pain compared to control conditions correlated with significant activations of the right anterior cingulate cortex (Brodmann's Area [BA] 32), left superior frontal gyrus (BA 6), and right insula, and deactivation of right midline nuclei of the thalamus. HASs during experimental pain impact both cortical and subcortical brain activity. The anterior cingulate, left superior frontal, and right insular cortices activation increases could induce a thalamic deactivation (top-down inhibition), which may correlate with reductions in pain intensity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Test and analysis of Celion 3000/PMR-15, graphite/polyimide bonded composite joints: Data report

    NASA Technical Reports Server (NTRS)

    Cushman, J. B.; Mccleskey, S. F.; Ward, S. H.

    1982-01-01

    Standard single lap, double lap and symmetric step lap bonded joints of Celion 3000/PMR-15 graphite/polyimide composite were evaluated. Composite to composite and composite to titanium joints were tested at 116 K (-250 F), 294 K (70 F) and 561 K (550 F). Joint parameters evaluated are lap length, adherend thickness, adherend axial stiffness, lamina stacking sequence and adherend tapering. Advanced joint concepts were examined to establish the change in performance of preformed adherends, scalloped adherends and hybrid systems. The material properties of the high temperature adhesive, designated A7F, used for bonding were established. The bonded joint tests resulted in interlaminar shear or peel failures of the composite and there were very few adhesive failures. Average test results agree with expected performance trends for the various test parameters. Results of finite element analyses and of test/analysis correlations are also presented.

  19. Determination of Parachute Joint Factors using Seam and Joint Testing

    NASA Technical Reports Server (NTRS)

    Mollmann, Catherine

    2015-01-01

    This paper details the methodology for determining the joint factor for all parachute components. This method has been successfully implemented on the Capsule Parachute Assembly System (CPAS) for the NASA Orion crew module for use in determining the margin of safety for each component under peak loads. Also discussed are concepts behind the joint factor and what drives the loss of material strength at joints. The joint factor is defined as a "loss in joint strength...relative to the basic material strength" that occurs when "textiles are connected to each other or to metals." During the CPAS engineering development phase, a conservative joint factor of 0.80 was assumed for each parachute component. In order to refine this factor and eliminate excess conservatism, a seam and joint testing program was implemented as part of the structural validation. This method split each of the parachute structural joints into discrete tensile tests designed to duplicate the loading of each joint. Breaking strength data collected from destructive pull testing were then used to calculate the joint factor in the form of an efficiency. Joint efficiency is the percentage of the base material strength that remains after degradation due to sewing or interaction with other components; it is used interchangeably with joint factor in this paper. Parachute materials vary in type (mainly cord, tape, webbing, and cloth), which requires different test fixtures and joint sample construction methods. This paper defines guidelines for designing and testing samples based on materials and test goals. Using the test methodology and analysis approach detailed in this paper, the minimum joint factor for each parachute component can be formulated. The joint factors can then be used to calculate the design factor and margin of safety for that component, a critical part of the design verification process.

  20. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics.

    PubMed

    Arampatzis, Georgios; Katsoulakis, Markos A; Rey-Bellet, Luc

    2016-03-14

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.
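
    The centered likelihood ratio idea can be illustrated in a simple static setting (the paper itself treats stochastic dynamics): because the score function has zero mean, the parameter sensitivity of an observable equals the covariance of the observable with the score, d/dθ E[f(X)] = Cov(f(X), ∂θ log p_θ(X)). A minimal sketch for an exponential model, where the answer is known analytically:

      import numpy as np

      rng = np.random.default_rng(0)
      theta = 2.0                    # rate parameter of an exponential model
      x = rng.exponential(1.0 / theta, size=200_000)

      f = x                          # observable f(X) = X, with E[f] = 1/theta
      score = 1.0 / theta - x        # d/dtheta log p_theta(x) for Exp(theta)

      # Centered likelihood ratio estimator: since E[score] = 0,
      # d/dtheta E[f(X)] = Cov(f(X), score); centering lowers the variance
      # of the plain product estimator when f has a large mean.
      sensitivity = np.cov(f, score)[0, 1]

      print(sensitivity, -1.0 / theta**2)   # estimate vs analytic -0.25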

  1. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics

    NASA Astrophysics Data System (ADS)

    Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc

    2016-03-01

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.

  2. Additive hazards regression and partial likelihood estimation for ecological monitoring data across space.

    PubMed

    Lin, Feng-Chang; Zhu, Jun

    2012-01-01

    We develop continuous-time models for the analysis of environmental or ecological monitoring data in which subjects are observed at multiple monitoring time points across space. Of particular interest are additive hazards regression models where the baseline hazard function can take on flexible forms. We consider time-varying covariates and take into account spatial dependence via autoregression in space and time. We develop statistical inference for the regression coefficients via partial likelihood. Asymptotic properties, including consistency and asymptotic normality, are established for parameter estimates under suitable regularity conditions. Feasible algorithms utilizing existing statistical software packages are developed for computation. We also consider a simpler additive hazards model with homogeneous baseline hazard and develop hypothesis testing for homogeneity. A simulation study demonstrates that the statistical inference using partial likelihood has sound finite-sample properties and offers a viable alternative to maximum likelihood estimation. For illustration, we analyze data from an ecological study that monitors bark beetle colonization of red pines in a Wisconsin plantation.

  3. Expressed Likelihood as Motivator: Creating Value through Engaging What’s Real

    PubMed Central

    Higgins, E. Tory; Franks, Becca; Pavarini, Dana; Sehnert, Steen; Manley, Katie

    2012-01-01

    Our research tested two predictions regarding how likelihood can have motivational effects as a function of how a probability is expressed. We predicted that describing the probability of a future event that could be either A or B using the language of high likelihood (“80% A”) rather than low likelihood (“20% B”), i.e., high rather than low expressed likelihood, would make a present activity more real and engaging, as long as the future event had properties relevant to the present activity. We also predicted that strengthening engagement from the high (vs. low) expressed likelihood of a future event would intensify the value of present positive and negative objects (in opposite directions). Both predictions were supported. There was also evidence that this intensification effect from expressed likelihood was independent of the actual probability or valence of the future event. What mattered was whether high versus low likelihood language was used to describe the future event. PMID:23940411

  4. Earthquake likelihood model testing

    USGS Publications Warehouse

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a …
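
    A common likelihood score for a bin-based forecast of this kind is the joint Poisson log-likelihood over all bins, sketched below; the rates and counts here are invented for illustration, not RELM data.

      import numpy as np
      from scipy.special import gammaln

      def forecast_log_likelihood(rates, counts):
          """Joint Poisson log-likelihood of observed earthquake counts,
          given forecast expected rates in space-magnitude-time bins."""
          rates = np.asarray(rates, dtype=float)
          counts = np.asarray(counts, dtype=float)
          return np.sum(-rates + counts * np.log(rates) - gammaln(counts + 1.0))

      # Illustrative forecast (expected events per bin) and observed catalog.
      rates = [0.1, 0.5, 2.0, 0.05]
      counts = [0, 1, 3, 0]
      print(forecast_log_likelihood(rates, counts))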

  5. Maximum Likelihood Analysis of a Two-Level Nonlinear Structural Equation Model with Fixed Covariates

    ERIC Educational Resources Information Center

    Lee, Sik-Yum; Song, Xin-Yuan

    2005-01-01

    In this article, a maximum likelihood (ML) approach for analyzing a rather general two-level structural equation model is developed for hierarchically structured data that are very common in educational and/or behavioral research. The proposed two-level model can accommodate nonlinear causal relations among latent variables as well as effects…

  6. CFHTLenS: a Gaussian likelihood is a sufficient approximation for a cosmological analysis of third-order cosmic shear statistics

    NASA Astrophysics Data System (ADS)

    Simon, P.; Semboloni, E.; van Waerbeke, L.; Hoekstra, H.; Erben, T.; Fu, L.; Harnois-Déraps, J.; Heymans, C.; Hildebrandt, H.; Kilbinger, M.; Kitching, T. D.; Miller, L.; Schrabback, T.

    2015-05-01

    We study the correlations of the shear signal between triplets of sources in the Canada-France-Hawaii Telescope Lensing Survey (CFHTLenS) to probe cosmological parameters via the matter bispectrum. In contrast to previous studies, we adopt a non-Gaussian model of the data likelihood which is supported by our simulations of the survey. We find that for state-of-the-art surveys, similar to CFHTLenS, a Gaussian likelihood analysis is a reasonable approximation, although small differences in the parameter constraints are already visible. For future surveys we expect that a Gaussian model will become inaccurate. Our algorithm for a refined non-Gaussian analysis and data compression is then of great utility, especially because it is not much more elaborate if simulated data are available. Applying this algorithm to the third-order correlations of shear alone in a blind analysis, we find good agreement with the standard cosmological model: Σ_8 = σ_8(Ω_m/0.27)^{0.64} = 0.79^{+0.08}_{-0.11} for a flat Λ cold dark matter cosmology with h = 0.7 ± 0.04 (68 per cent credible interval). Nevertheless, our models provide only moderately good fits, as indicated by χ²/dof = 2.9, including a 20 per cent rms uncertainty in the predicted signal amplitude. The models cannot explain a signal drop on scales around 15 arcmin, which may be caused by systematics. It is unclear whether the discrepancy can be fully explained by residual point spread function systematics, of which we find evidence at least on scales of a few arcmin. Therefore we need a better understanding of higher order correlations of cosmic shear and their systematics to confidently apply them as cosmological probes.
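
    For context, the Gaussian baseline that such an analysis compares against is just a multivariate normal log-likelihood of the measured statistics given a model vector and covariance. A minimal sketch with toy numbers (not survey data):

      import numpy as np

      def gaussian_log_likelihood(data, model, cov):
          """Multivariate Gaussian log-likelihood, up to an additive constant:
          -0.5 * [(d - m)^T C^{-1} (d - m) + log det C]."""
          r = data - model
          _, logdet = np.linalg.slogdet(cov)
          return -0.5 * (r @ np.linalg.solve(cov, r) + logdet)

      # Toy example: three correlated shear statistics.
      d = np.array([1.2, 0.8, 1.1])
      m = np.array([1.0, 1.0, 1.0])
      C = np.array([[0.04, 0.01, 0.00],
                    [0.01, 0.04, 0.01],
                    [0.00, 0.01, 0.04]])
      print(gaussian_log_likelihood(d, m, C))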

  7. The implementation, interpretation, and justification of likelihoods in cosmology

    NASA Astrophysics Data System (ADS)

    McCoy, C. D.

    2018-05-01

    I discuss the formal implementation, interpretation, and justification of likelihood attributions in cosmology. I show that likelihood arguments in cosmology suffer from significant conceptual and formal problems that undermine their applicability in this context.

  8. Joint Inference of Population Assignment and Demographic History

    PubMed Central

    Choi, Sang Chul; Hey, Jody

    2011-01-01

    A new approach to assigning individuals to populations using genetic data is described. Most existing methods work by maximizing Hardy–Weinberg and linkage equilibrium within populations, neither of which will apply for many demographic histories. By including a demographic model, within a likelihood framework based on coalescent theory, we can jointly study demographic history and population assignment. Genealogies and population assignments are sampled from a posterior distribution using a general isolation-with-migration model for multiple populations. A measure of partition distance between assignments facilitates not only the summary of a posterior sample of assignments, but also the estimation of the posterior density for the demographic history. It is shown that joint estimates of assignment and demographic history are possible, including estimation of population phylogeny for samples from three populations. The new method is compared to results of a widely used assignment method, using simulated and published empirical data sets. PMID:21775468

  9. A New Monte Carlo Method for Estimating Marginal Likelihoods.

    PubMed

    Wang, Yu-Bo; Chen, Ming-Hui; Kuo, Lynn; Lewis, Paul O

    2018-06-01

    Evaluating the marginal likelihood in Bayesian analysis is essential for model selection. Estimators based on a single Markov chain Monte Carlo sample from the posterior distribution include the harmonic mean estimator and the inflated density ratio estimator. We propose a new class of Monte Carlo estimators based on this single Markov chain Monte Carlo sample. This class can be thought of as a generalization of the harmonic mean and inflated density ratio estimators using a partition weighted kernel (likelihood times prior). We show that our estimator is consistent and has better theoretical properties than the harmonic mean and inflated density ratio estimators. In addition, we provide guidelines on choosing optimal weights. Simulation studies were conducted to examine the empirical performance of the proposed estimator. We further demonstrate the desirable features of the proposed estimator with two real data sets: one is from a prostate cancer study using an ordinal probit regression model with latent variables; the other is for the power prior construction from two Eastern Cooperative Oncology Group phase III clinical trials using the cure rate survival model with similar objectives.
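
    For context, the harmonic mean estimator that the proposed class generalizes can be written in a few lines: the marginal likelihood is approximated by the harmonic mean of the likelihood values at posterior draws. A minimal sketch with synthetic log-likelihood values (this estimator is known to be unstable, which is part of the motivation for generalizing it):

      import numpy as np
      from scipy.special import logsumexp

      def log_marginal_harmonic_mean(log_likelihoods):
          """Harmonic mean estimate of the log marginal likelihood from the
          log-likelihoods of N posterior samples:
          log m = log N - logsumexp(-log L_i)."""
          ll = np.asarray(log_likelihoods)
          return np.log(len(ll)) - logsumexp(-ll)

      # Illustrative stand-in for log-likelihoods evaluated at MCMC draws.
      rng = np.random.default_rng(1)
      fake_loglik = -0.5 * rng.chisquare(3, size=5000) - 10.0
      print(log_marginal_harmonic_mean(fake_loglik))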

  10. Pushover analysis of reinforced concrete frames considering shear failure at beam-column joints

    NASA Astrophysics Data System (ADS)

    Sung, Y. C.; Lin, T. K.; Hsiao, C. C.; Lai, M. C.

    2013-09-01

    Since most current seismic capacity evaluations of reinforced concrete (RC) frame structures are implemented by either static pushover analysis (PA) or dynamic time history analysis, with diverse settings of the plastic hinges (PHs) on such main structural components as columns, beams and walls, the complex behavior of shear failure at beam-column joints (BCJs) during major earthquakes is commonly neglected. This study proposes new nonlinear PA procedures that consider shear failure at BCJs and seek to assess the actual damage to RC structures. Based on the specifications of FEMA-356, a simplified joint model composed of two nonlinear cross struts placed diagonally over the location of the plastic hinge is established, allowing a sophisticated PA to be performed. To verify the validity of this method, the analytical results for the capacity curves and the failure mechanism derived from three different full-size RC frames are compared with the experimental measurements. By considering shear failure at BCJs, the proposed nonlinear analytical procedures can be used to estimate the structural behavior of RC frames, including seismic capacity and the progressive failure sequence of joints, in a precise and effective manner.

  11. A quantum framework for likelihood ratios

    NASA Astrophysics Data System (ADS)

    Bond, Rachael L.; He, Yang-Hui; Ormerod, Thomas C.

    The ability to calculate precise likelihood ratios is fundamental to science, from Quantum Information Theory through to Quantum State Estimation. However, there is no assumption-free statistical methodology to achieve this. For instance, in the absence of data relating to covariate overlap, the widely used Bayes’ theorem either defaults to the marginal probability driven “naive Bayes’ classifier”, or requires the use of compensatory expectation-maximization techniques. This paper takes an information-theoretic approach in developing a new statistical formula for the calculation of likelihood ratios based on the principles of quantum entanglement, and demonstrates that Bayes’ theorem is a special case of a more general quantum mechanical expression.

  12. In-vivo analysis of ankle joint movement for patient-specific kinematic characterization.

    PubMed

    Ferraresi, Carlo; De Benedictis, Carlo; Franco, Walter; Maffiodo, Daniela; Leardini, Alberto

    2017-09-01

    In this article, a method for the experimental in-vivo characterization of ankle kinematics is proposed. The method is intended to improve the personalization of various ankle joint treatments, such as surgical decision-making or the design and application of an orthosis, possibly increasing their effectiveness. Such a characterization would make the treatments more compatible with the specific patient's joint physiological conditions. This article describes the experimental procedure and the analytical method adopted, based on the instantaneous and mean helical axis theories. The results obtained in this experimental analysis reveal that more accurate techniques are necessary for a robust in-vivo assessment of the tibio-talar axis of rotation.

  13. Approximate likelihood approaches for detecting the influence of primordial gravitational waves in cosmic microwave background polarization

    NASA Astrophysics Data System (ADS)

    Pan, Zhen; Anderes, Ethan; Knox, Lloyd

    2018-05-01

    One of the major targets for next-generation cosmic microwave background (CMB) experiments is the detection of the primordial B-mode signal. Planning is under way for Stage-IV experiments that are projected to have instrumental noise small enough to make lensing and foregrounds the dominant source of uncertainty for estimating the tensor-to-scalar ratio r from polarization maps. This makes delensing a crucial part of future CMB polarization science. In this paper we present a likelihood method for estimating the tensor-to-scalar ratio r from CMB polarization observations, which combines the benefits of a full-scale likelihood approach with the tractability of the quadratic delensing technique. This method is a pixel-space, all-order likelihood analysis of the quadratically delensed B modes, and it essentially builds upon the quadratic delenser by taking into account all-order lensing and pixel-space anomalies. Its tractability relies on a crucial factorization of the pixel-space covariance matrix of the polarization observations, which allows one to compute the full Gaussian approximate likelihood profile, as a function of r, at the same computational cost as a single likelihood evaluation.

  14. Analysis and optimization of the active rigidity joint

    NASA Astrophysics Data System (ADS)

    Manzo, Justin; Garcia, Ephrahim

    2009-12-01

    The active rigidity joint is a composite mechanism using shape memory alloy and shape memory polymer to create a passively rigid joint with thermally activated deflection. A new model for the active rigidity joint relaxes constraints of earlier methods and allows for more accurate deflection predictions compared to finite element results. Using an iterative process to determine the strain distribution and deflection, the method demonstrates accurate results for both surface bonded and embedded actuators with and without external loading. Deflection capabilities are explored through simulated annealing heuristic optimization using a variety of cost functions to explore actuator performance. A family of responses presents actuator characteristics in terms of load bearing and deflection capabilities given material and thermal constraints. Optimization greatly expands the available workspace of the active rigidity joint from the initial configuration, demonstrating specific work capabilities comparable to those of muscle tissue.
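
    The simulated annealing heuristic mentioned above can be sketched generically in a few lines; the cost function below is only a stand-in for the study's deflection/load objectives, and all parameters are illustrative.

      import math
      import random

      def simulated_annealing(cost, x0, step, t0=1.0, cooling=0.995, iters=5000):
          """Generic simulated annealing: accept uphill moves with probability
          exp(-delta/T) so the search can escape local minima."""
          x, c = x0, cost(x0)
          best_x, best_c = x, c
          t = t0
          for _ in range(iters):
              cand = x + random.uniform(-step, step)
              delta = cost(cand) - c
              if delta < 0 or random.random() < math.exp(-delta / t):
                  x, c = cand, c + delta
                  if c < best_c:
                      best_x, best_c = x, c
              t *= cooling
          return best_x, best_c

      # Toy cost with local minima, standing in for an actuator objective.
      print(simulated_annealing(lambda x: (x**2 - 4)**2 + x, x0=5.0, step=0.5))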

  15. Restricted maximum likelihood estimation of genetic principal components and smoothed covariance matrices

    PubMed Central

    Meyer, Karin; Kirkpatrick, Mark

    2005-01-01

    Principal component analysis is a widely used 'dimension reduction' technique, albeit generally at a phenotypic level. It is shown that we can estimate genetic principal components directly through a simple reparameterisation of the usual linear, mixed model. This is applicable to any analysis fitting multiple, correlated genetic effects, whether effects for individual traits or sets of random regression coefficients to model trajectories. Depending on the magnitude of genetic correlation, a subset of the principal components generally suffices to capture the bulk of genetic variation. Corresponding estimates of genetic covariance matrices are more parsimonious, have reduced rank and are smoothed, with the number of parameters required to model the dispersion structure reduced from k(k + 1)/2 to m(2k - m + 1)/2 for k effects and m principal components. Estimation of these parameters, the largest eigenvalues and pertaining eigenvectors of the genetic covariance matrix, via restricted maximum likelihood using derivatives of the likelihood, is described. It is shown that reduced rank estimation can reduce computational requirements of multivariate analyses substantially. An application to the analysis of eight traits recorded via live ultrasound scanning of beef cattle is given. PMID:15588566
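
    The parameter savings quoted above are easy to make concrete. A small sketch evaluating k(k + 1)/2 against m(2k - m + 1)/2 for the eight-trait application mentioned in the abstract:

      def full_rank_params(k):
          """Parameters in an unstructured k x k genetic covariance matrix."""
          return k * (k + 1) // 2

      def reduced_rank_params(k, m):
          """Parameters when only the m leading genetic principal components
          (eigenvalues plus pertaining eigenvectors) are fitted."""
          return m * (2 * k - m + 1) // 2

      k = 8  # eight traits, as in the beef cattle application
      for m in (8, 4, 2):
          print(m, reduced_rank_params(k, m), "vs full", full_rank_params(k))
      # m = 8 recovers the full 36 parameters; m = 4 needs 26, m = 2 only 15.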

  16. The Construct Validity of Higher Order Structure-of-Intellect Abilities in a Battery of Tests Emphasizing the Product of Transformations: A Confirmatory Maximum Likelihood Factor Analysis.

    ERIC Educational Resources Information Center

    Khattab, Ali-Maher; And Others

    1982-01-01

    A causal modeling system, using confirmatory maximum likelihood factor analysis with the LISREL IV computer program, evaluated the construct validity underlying the higher order factor structure of a given correlation matrix of 46 structure-of-intellect tests emphasizing the product of transformations. (Author/PN)

  17. Joint Utility of Event-Dependent and Environmental Crime Analysis Techniques for Violent Crime Forecasting

    ERIC Educational Resources Information Center

    Caplan, Joel M.; Kennedy, Leslie W.; Piza, Eric L.

    2013-01-01

    Violent crime incidents occurring in Irvington, New Jersey, in 2007 and 2008 are used to assess the joint analytical capabilities of point pattern analysis, hotspot mapping, near-repeat analysis, and risk terrain modeling. One approach to crime analysis suggests that the best way to predict future crime occurrence is to use past behavior, such as…

  18. Investigation of 2-stage meta-analysis methods for joint longitudinal and time-to-event data through simulation and real data application.

    PubMed

    Sudell, Maria; Tudur Smith, Catrin; Gueyffier, François; Kolamunnage-Dona, Ruwanthi

    2018-04-15

    Joint modelling of longitudinal and time-to-event data is often preferred over separate longitudinal or time-to-event analyses as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time-to-event outcomes. The joint modelling literature focuses mainly on the analysis of single studies, with no methods currently available for the meta-analysis of joint model estimates from multiple studies. We propose a 2-stage method for meta-analysis of joint model estimates. The method is applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta-analyses of separate longitudinal or time-to-event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Using the real dataset, similar results were obtained by using the separate and joint analyses. However, the simulation study indicated a benefit of using joint rather than separate methods in a meta-analytic setting where association exists between the longitudinal and time-to-event outcomes. Where evidence of association between longitudinal and time-to-event outcomes exists, results from joint models, rather than from standalone analyses, should be pooled in 2-stage meta-analyses. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
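
    The second stage of such a 2-stage approach is ordinary inverse-variance pooling of the per-study joint-model estimates. A minimal fixed-effect sketch (the paper also considers other pooling variants; all numbers below are hypothetical):

      import numpy as np

      def fixed_effect_meta(estimates, std_errors):
          """Inverse-variance pooling of per-study estimates of the same
          joint-model parameter (second stage of a 2-stage meta-analysis)."""
          est = np.asarray(estimates)
          w = 1.0 / np.asarray(std_errors) ** 2
          pooled = np.sum(w * est) / np.sum(w)
          pooled_se = np.sqrt(1.0 / np.sum(w))
          return pooled, pooled_se

      # Hypothetical association parameters from joint models fitted per study.
      pooled, se = fixed_effect_meta([0.21, 0.35, 0.28], [0.08, 0.12, 0.10])
      print(f"pooled = {pooled:.3f} +/- {1.96 * se:.3f}")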

  19. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc

    2016-03-14

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.

  20. Likelihood ratios for glaucoma diagnosis using spectral-domain optical coherence tomography.

    PubMed

    Lisboa, Renato; Mansouri, Kaweh; Zangwill, Linda M; Weinreb, Robert N; Medeiros, Felipe A

    2013-11-01

    To present a methodology for calculating likelihood ratios for glaucoma diagnosis for continuous retinal nerve fiber layer (RNFL) thickness measurements from spectral-domain optical coherence tomography (spectral-domain OCT). Observational cohort study. A total of 262 eyes of 187 patients with glaucoma and 190 eyes of 100 control subjects were included in the study. Subjects were recruited from the Diagnostic Innovations Glaucoma Study. Eyes with preperimetric and perimetric glaucomatous damage were included in the glaucoma group. The control group was composed of healthy eyes with normal visual fields from subjects recruited from the general population. All eyes underwent RNFL imaging with Spectralis spectral-domain OCT. Likelihood ratios for glaucoma diagnosis were estimated for specific global RNFL thickness measurements using a methodology based on estimating the tangents to the receiver operating characteristic (ROC) curve. Likelihood ratios could be determined for continuous values of average RNFL thickness. Average RNFL thickness values lower than 86 μm were associated with positive likelihood ratios (ie, likelihood ratios greater than 1), whereas RNFL thickness values higher than 86 μm were associated with negative likelihood ratios (ie, likelihood ratios smaller than 1). A modified Fagan nomogram was provided to assist calculation of posttest probability of disease from the calculated likelihood ratios and pretest probability of disease. The methodology allowed calculation of likelihood ratios for specific RNFL thickness values. By avoiding arbitrary categorization of test results, it potentially allows for an improved integration of test results into diagnostic clinical decision making. Copyright © 2013. Published by Elsevier Inc.
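
    The post-test probability computation that the modified Fagan nomogram supports is simple odds arithmetic: posttest odds = pretest odds × likelihood ratio. A minimal sketch (the LR value below is invented; the study derives LRs from tangents to the ROC curve for each RNFL thickness):

      def posttest_probability(pretest_p, likelihood_ratio):
          """Fagan-nomogram arithmetic: posttest odds = pretest odds * LR."""
          pretest_odds = pretest_p / (1.0 - pretest_p)
          posttest_odds = pretest_odds * likelihood_ratio
          return posttest_odds / (1.0 + posttest_odds)

      # Illustrative: RNFL thickness below ~86 um gives LR > 1 in this study.
      # Suppose a measurement maps to LR = 5 and pretest probability is 20%.
      print(posttest_probability(0.20, 5.0))   # -> 0.556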

  1. Deconstructing the power resistance relationship for squats: A joint-level analysis.

    PubMed

    Farris, D J; Lichtwark, G A; Brown, N A T; Cresswell, A G

    2016-07-01

    Generating high leg power outputs is important for executing rapid movements. Squats are commonly used to increase leg strength and power. Therefore, it is useful to understand factors affecting power output in squatting. We aimed to deconstruct the mechanisms behind why power is maximized at certain resistances in squatting. Ten male rowers (age = 20 ± 2.2 years; height = 1.82 ± 0.03 m; mass = 86 ± 11 kg) performed maximal power squats with resistances ranging from body weight to 80% of their one repetition maximum (1RM). Three-dimensional kinematics was combined with ground reaction force (GRF) data in an inverse dynamics analysis to calculate leg joint moments and powers. System center of mass (COM) velocity and power were computed from GRF data. COM power was maximized across a range of resistances from 40% to 60% 1RM. This range was identified because a trade-off in hip and knee joint powers existed across this range, with maximal knee joint power occurring at 40% 1RM and maximal hip joint power at 60% 1RM. A non-linear system force-velocity relationship was observed that dictated large reductions in COM power below 20% 1RM and above 60% 1RM. These reductions were due to constraints on the control of the movement. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. Analysis of Interrelationships among Voluntary and Prosthetic Leg Joint Parameters Using Cyclograms.

    PubMed

    Jasni, Farahiyah; Hamzaid, Nur Azah; Mohd Syah, Nor Elleeiana; Chung, Tze Y; Abu Osman, Noor Azuan

    2017-01-01

    The walking mechanism of a prosthetic leg user is a tightly coordinated movement of several joints and limb segments. The interaction among the voluntary and mechanical joints and segments requires particular biomechanical insight. This study aims to analyze the inter-relationships between amputees' voluntary and mechanical (prosthetic) leg joint variables using cyclograms. From this analysis, the critical gait parameters in each gait phase were determined and examined for their potential contribution to a better powered prosthetic knee control design. To develop the cyclogram model, 20 healthy able-bodied subjects and 25 prosthesis and orthosis users (10 transtibial amputees, 5 transfemoral amputees, and 10 orthosis users with different pathological profiles) walked at their comfortable speed in a 3D motion analysis lab setting. The gait parameters (i.e., angle, moment and power for the ankle, knee and hip joints) were coupled to form 36 cyclogram relationships. The model was validated by quantifying the gait disparities of all the pathological walking patterns, analyzing each cyclogram pair using a feed-forward neural network with backpropagation. Subsequently, the cyclogram pairs that contributed to the highest gait disparity in each gait phase were manipulated by replacing them with normal values and re-analyzing. The manipulated cyclogram relationships that showed the highest improvement in the gait disparity calculation were taken to indicate the most dominant parameters for powered-knee control. In the case of transfemoral amputee walking, this approach identified the knee variables most responsible for closest-to-normal walking at each gait sub-phase: knee power during loading response and mid-stance, knee moment and knee angle during terminal stance, knee angle and knee power during pre-swing, knee angle at initial swing, and knee power at terminal swing. No variable was dominant during mid-swing, implying a natural pendulum effect of the lower limb …

  3. Measuring coherence of computer-assisted likelihood ratio methods.

    PubMed

    Haraksim, Rudolf; Ramos, Daniel; Meuwly, Didier; Berger, Charles E H

    2015-04-01

    Measuring the performance of forensic evaluation methods that compute likelihood ratios (LRs) is relevant for both the development and the validation of such methods. A framework of performance characteristics categorized as primary and secondary is introduced in this study to help achieve such development and validation. Ground-truth labelled fingerprint data is used to assess the performance of an example likelihood ratio method in terms of those performance characteristics. Discrimination, calibration, and especially the coherence of this LR method are assessed as a function of the quantity and quality of the trace fingerprint specimen. Assessment of the coherence revealed a weakness of the comparison algorithm in the computer-assisted likelihood ratio method used. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. An inelastic analysis of a welded aluminum joint

    NASA Astrophysics Data System (ADS)

    Vaughan, Robert E.; Schonberg, William P.

    1995-02-01

    Butt weld joints are most commonly designed into pressure vessels by using weld material properties that are determined from a tensile test. These properties are provided to the stress analyst in the form of a stress vs strain diagram. Variations in properties through the thickness of the weld and along the width of the weld have been suspect but not explored because of inaccessibility and cost. The purpose of this study is to investigate analytical and computational methods used for analysis of multiple pass aluminum 2219-T87 butt welds. The weld specimens are analyzed using classical plasticity theory to provide a basis for modeling the inelastic properties in a finite element solution. The results of the analysis are compared to experimental data to determine the weld behavior and the accuracy of currently available numerical prediction methods.

  5. Quantifying the Establishment Likelihood of Invasive Alien Species Introductions Through Ports with Application to Honeybees in Australia.

    PubMed

    Heersink, Daniel K; Caley, Peter; Paini, Dean R; Barry, Simon C

    2016-05-01

    The cost of an uncontrolled incursion of invasive alien species (IAS) arising from undetected entry through ports can be substantial, and knowledge of port-specific risks is needed to help allocate limited surveillance resources. Quantifying the establishment likelihood of such an incursion requires quantifying the ability of a species to enter, establish, and spread. Estimation of the approach rate of IAS into ports provides a measure of likelihood of entry. Data on the approach rate of IAS are typically sparse, and the combinations of risk factors relating to country of origin and port of arrival diverse. This presents challenges to making formal statistical inference on establishment likelihood. Here we demonstrate how these challenges can be overcome with judicious use of mixed-effects models when estimating the incursion likelihood into Australia of the European (Apis mellifera) and Asian (A. cerana) honeybees, along with the invasive parasites of biosecurity concern they host (e.g., Varroa destructor). Our results demonstrate how skewed the establishment likelihood is, with one-tenth of the ports accounting for 80% or more of the likelihood for both species. These results have been utilized by biosecurity agencies in the allocation of resources to the surveillance of maritime ports. © 2015 Society for Risk Analysis.

  6. Constrained Maximum Likelihood Estimation for Two-Level Mean and Covariance Structure Models

    ERIC Educational Resources Information Center

    Bentler, Peter M.; Liang, Jiajuan; Tang, Man-Lai; Yuan, Ke-Hai

    2011-01-01

    Maximum likelihood is commonly used for the estimation of model parameters in the analysis of two-level structural equation models. Constraints on model parameters could be encountered in some situations such as equal factor loadings for different factors. Linear constraints are the most common ones and they are relatively easy to handle in…

  7. DEMES rotary joint: theories and applications

    NASA Astrophysics Data System (ADS)

    Wang, Shu; Hao, Zhaogang; Li, Mingyu; Huang, Bo; Sun, Lining; Zhao, Jianwen

    2017-04-01

    As a kind of dielectric elastomer actuator, the dielectric elastomer minimum energy structure (DEMES) can realize large angular deformations from small voltage-induced strains, which makes it an attractive candidate for biomimetic robotics. Considering that the rotary joint is a basic and common component of many biomimetic robots, we have fabricated rotary joints based on DEMES and characterized their performance over the past two years. In this paper, we discuss the static analysis, dynamic analysis and some characteristics of the DEMES rotary joint. Based on theoretical analysis, several applications of the DEMES rotary joint are presented, such as a flapping wing, a biomimetic fish and a two-legged walker. All of the robots are built from DEMES rotary joints and can realize some basic biomimetic motions. Compared with a traditional rigid robot, a robot based on DEMES is soft and light, so it has an advantage in collision resistance.

  8. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  9. Helical Axis Data Visualization and Analysis of the Knee Joint Articulation.

    PubMed

    Millán Vaquero, Ricardo Manuel; Vais, Alexander; Dean Lynch, Sean; Rzepecki, Jan; Friese, Karl-Ingo; Hurschler, Christof; Wolter, Franz-Erich

    2016-09-01

    We present processing methods and visualization techniques for accurately characterizing and interpreting kinematical data of flexion-extension motion of the knee joint based on helical axes. We make use of the Lie group of rigid body motions and particularly its Lie algebra for a natural representation of motion sequences. This allows us to analyze and compute the finite helical axis (FHA) and instantaneous helical axis (IHA) in a unified way without redundant degrees of freedom or singularities. A polynomial fitting based on Legendre polynomials within the Lie algebra is applied to provide a smooth description of a given discrete knee motion sequence, which is essential for obtaining stable instantaneous helical axes for further analysis. Moreover, this allows for an efficient overall similarity comparison across several motion sequences in order to differentiate among several cases. Our approach combines a specifically designed, patient-specific three-dimensional visualization based on the processed helical axis information with computed tomography (CT) scans for an intuitive interpretation of the axes and their geometrical relation with respect to the knee joint anatomy. In addition, in the context of the study of diseases affecting the musculoskeletal articulation, we propose to integrate the above tools into a multiscale framework for exploring related data sets distributed across multiple spatial scales. We demonstrate the utility of our methods by processing a collection of motion sequences acquired from experimental data involving several surgery techniques. Our approach enables an accurate analysis, visualization and comparison of knee joint articulation, contributing to evaluation and diagnosis in medical applications.
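
    A finite helical axis can be extracted from a single rigid displacement (rotation R, translation t) with elementary linear algebra; the entry's Lie-algebra machinery generalizes this to whole motion sequences. A minimal sketch, assuming the rotation angle is not close to 0 or π:

      import numpy as np

      def finite_helical_axis(R, t):
          """Finite helical axis of the rigid displacement x -> R @ x + t.
          Returns axis direction n, rotation angle phi, translation d along
          the axis, and a point s on the axis."""
          phi = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
          n = np.array([R[2, 1] - R[1, 2],
                        R[0, 2] - R[2, 0],
                        R[1, 0] - R[0, 1]]) / (2.0 * np.sin(phi))
          d = float(n @ t)                      # screw translation along n
          # Axis location: minimum-norm solution of (I - R) s = t - d * n,
          # which is the axis point perpendicular to n.
          s, *_ = np.linalg.lstsq(np.eye(3) - R, t - d * n, rcond=None)
          return n, phi, d, s

      # Example: 30 degree rotation about z, displaced along z (a pure screw).
      a = np.deg2rad(30.0)
      R = np.array([[np.cos(a), -np.sin(a), 0.0],
                    [np.sin(a),  np.cos(a), 0.0],
                    [0.0,        0.0,       1.0]])
      print(finite_helical_axis(R, np.array([0.0, 0.0, 5.0])))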

  10. Experimental Investigation of Composite Pressure Vessel Performance and Joint Stiffness for Pyramid and Inverted Pyramid Joints

    NASA Technical Reports Server (NTRS)

    Verhage, Joseph M.; Bower, Mark V.; Gilbert, Paul A. (Technical Monitor)

    2001-01-01

    The focus of this study is the suitability of classical laminate theory analysis tools for filament wound pressure vessels with adhesive laminated joints, in particular for pressure vessel wall performance, joint stiffness and failure prediction. Two 18-inch diameter 12-ply filament wound pressure vessels were fabricated. One vessel was fabricated with a 24-ply pyramid laminated adhesive double strap butt joint. The second vessel was fabricated with the same number of plies in an inverted pyramid joint. Results from hydrostatic tests are presented. Experimental results were used as input to the computer programs GENLAM and Laminate, and the output compared to test. Using the axial stress resultant, the classical laminate theory results agree within 1% with the experimental results in predicting the pressure vessel wall performance. The prediction of joint stiffness for the two adhesive joints in the axial direction is within 1% of the experimental results. The calculated hoop direction joint stress resultant is 25% less than the measured resultant for both joint configurations. A correction factor, derived from the hoop stress resultant of the tank wall performance investigation, is used in the joint analysis. The vessel with the pyramid joint was determined to have failed in the joint area at a hydrostatic pressure 33% below the predicted failure pressure. The vessel with the inverted pyramid joint failed in the wall acreage at a hydrostatic pressure within 10% of the predicted failure pressure.
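
    For reference, the membrane stress resultants that load the wall of a thin cylindrical pressure vessel follow from equilibrium alone (hoop N = pr, axial N = pr/2); these are the kinds of inputs a laminate analysis of the wall works from. A sketch with an invented pressure, not a value from the study:

      def thin_wall_resultants(pressure, radius):
          """Membrane stress resultants (force per unit width) in the wall of
          a thin cylindrical pressure vessel: hoop N = p*r, axial N = p*r/2."""
          n_hoop = pressure * radius
          n_axial = pressure * radius / 2.0
          return n_axial, n_hoop

      # Illustrative: an 18-inch diameter (0.2286 m radius) vessel at 2 MPa.
      n_axial, n_hoop = thin_wall_resultants(2.0e6, 0.2286)
      print(f"N_axial = {n_axial/1e3:.0f} kN/m, N_hoop = {n_hoop/1e3:.0f} kN/m")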

  11. Analysis of postmarket complaints database for the iFuse SI Joint Fusion System®: a minimally invasive treatment for degenerative sacroiliitis and sacroiliac joint disruption

    PubMed Central

    Miller, Larry E; Reckling, W Carlton; Block, Jon E

    2013-01-01

    Background The sacroiliac joint is a common but under-recognized source of low back and gluteal pain. Patients with degenerative sacroiliitis or sacroiliac joint disruption resistant to nonsurgical treatments may undergo open surgery with sacroiliac joint arthrodesis, although outcomes are mixed and risks are significant. Minimally invasive sacroiliac joint arthrodesis was developed to minimize the risk of iatrogenic injury and to improve patient outcomes compared with open surgery. Methods Between April 2009 and January 2013, 5319 patients were treated with the iFuse SI Joint Fusion System® for conditions including sacroiliac joint disruption and degenerative sacroiliitis. A database was prospectively developed to record all complaints reported to the manufacturer in patients treated with the iFuse device. Complaints were collected through spontaneous reporting mechanisms in support of ongoing mandatory postmarket surveillance efforts. Results Complaints were reported in 204 (3.8%) patients treated with the iFuse system. Pain was the most commonly reported clinical complaint (n = 119, 2.2%), with nerve impingement (n = 48, 0.9%) and recurrent sacroiliac joint pain (n = 43, 0.8%) most frequently cited. All other clinical complaints were rare (≤0.2%). Ninety-six revision surgeries were performed in 94 (1.8%) patients at a median follow-up of four (range 0–30) months. Revisions were typically performed in the early postoperative period for treatment of a symptomatic malpositioned implant (n = 46, 0.9%) or to correct an improperly sized implant in an asymptomatic patient (n = 10, 0.2%). Revisions in the late postoperative period were performed to treat symptom recurrence (n = 34, 0.6%) or for continued pain of undetermined etiology (n = 6, 0.1%). Conclusion Analysis of a postmarket product complaints database demonstrates an overall low risk of complaints with the iFuse SI Joint Fusion System in patients with degenerative sacroiliitis or sacroiliac joint disruption.

  12. Analysis of postmarket complaints database for the iFuse SI Joint Fusion System®: a minimally invasive treatment for degenerative sacroiliitis and sacroiliac joint disruption.

    PubMed

    Miller, Larry E; Reckling, W Carlton; Block, Jon E

    2013-01-01

    The sacroiliac joint is a common but under-recognized source of low back and gluteal pain. Patients with degenerative sacroiliitis or sacroiliac joint disruption resistant to nonsurgical treatments may undergo open surgery with sacroiliac joint arthrodesis, although outcomes are mixed and risks are significant. Minimally invasive sacroiliac joint arthrodesis was developed to minimize the risk of iatrogenic injury and to improve patient outcomes compared with open surgery. Between April 2009 and January 2013, 5319 patients were treated with the iFuse SI Joint Fusion System® for conditions including sacroiliac joint disruption and degenerative sacroiliitis. A database was prospectively developed to record all complaints reported to the manufacturer in patients treated with the iFuse device. Complaints were collected through spontaneous reporting mechanisms in support of ongoing mandatory postmarket surveillance efforts. Complaints were reported in 204 (3.8%) patients treated with the iFuse system. Pain was the most commonly reported clinical complaint (n = 119, 2.2%), with nerve impingement (n = 48, 0.9%) and recurrent sacroiliac joint pain (n = 43, 0.8%) most frequently cited. All other clinical complaints were rare (≤0.2%). Ninety-six revision surgeries were performed in 94 (1.8%) patients at a median follow-up of four (range 0-30) months. Revisions were typically performed in the early postoperative period for treatment of a symptomatic malpositioned implant (n = 46, 0.9%) or to correct an improperly sized implant in an asymptomatic patient (n = 10, 0.2%). Revisions in the late postoperative period were performed to treat symptom recurrence (n = 34, 0.6%) or for continued pain of undetermined etiology (n = 6, 0.1%). Analysis of a postmarket product complaints database demonstrates an overall low risk of complaints with the iFuse SI Joint Fusion System in patients with degenerative sacroiliitis or sacroiliac joint disruption.

  13. Maximum likelihood convolutional decoding (MCD) performance due to system losses

    NASA Technical Reports Server (NTRS)

    Webster, L.

    1976-01-01

    A model for predicting the computational performance of a maximum likelihood convolutional decoder (MCD) operating in a noisy carrier reference environment is described. This model is used to develop a subroutine that will be utilized by the Telemetry Analysis Program to compute the MCD bit error rate. When this computational model is averaged over noisy reference phase errors using a high-rate interpolation scheme, the results are found to agree quite favorably with experimental measurements.

  14. Ego involvement increases doping likelihood.

    PubMed

    Ring, Christopher; Kavussanu, Maria

    2018-08-01

    Achievement goal theory provides a framework to help understand how individuals behave in achievement contexts, such as sport. Evidence concerning the role of motivation in the decision to use banned performance enhancing substances (i.e., doping) is equivocal. The extant literature shows that dispositional goal orientation has been weakly and inconsistently associated with doping intention and use. It is possible that goal involvement, which describes the situational motivational state, is a stronger determinant of doping intention. Accordingly, the current study used an experimental design to examine the effects of goal involvement, manipulated using direct instructions and reflective writing, on doping likelihood in hypothetical situations among college athletes. The ego-involving goal increased doping likelihood compared to no goal and a task-involving goal. The present findings provide the first evidence that ego involvement can sway the decision to use doping to improve athletic performance.

  15. cosmoabc: Likelihood-free inference for cosmology

    NASA Astrophysics Data System (ADS)

    Ishida, Emille E. O.; Vitenti, Sandro D. P.; Penna-Lima, Mariana; Trindade, Arlindo M.; Cisewski, Jessi; de Souza, Rafael; Cameron, Ewan; Busti, Vinicius C.

    2015-05-01

    Approximate Bayesian Computation (ABC) enables parameter inference for complex physical systems in cases where the true likelihood function is unknown, unavailable, or computationally too expensive. It relies on the forward simulation of mock data and comparison between observed and synthetic catalogs. cosmoabc is a Python Approximate Bayesian Computation (ABC) sampler featuring a Population Monte Carlo variation of the original ABC algorithm, which uses an adaptive importance sampling scheme. The code can be coupled to an external simulator to allow incorporation of arbitrary distance and prior functions. When coupled with the numcosmo library, it has been used to estimate posterior probability distributions over cosmological parameters based on measurements of galaxy clusters number counts without computing the likelihood function.
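
    The core ABC idea described above can be sketched in a few lines of Python. This is the basic rejection variant, not cosmoabc's Population Monte Carlo scheme, and the toy simulator, distance, and tolerance are all invented for illustration:

      import numpy as np

      rng = np.random.default_rng(42)

      def simulator(theta, n=100):
          """Forward-simulate mock data given parameter theta (a stand-in
          for, e.g., a galaxy cluster number-count simulation)."""
          return rng.poisson(theta, size=n)

      def distance(sim, obs):
          return abs(sim.mean() - obs.mean())

      # Basic ABC rejection: draw from the prior, keep draws whose mock
      # data land within a tolerance of the observed catalog.
      obs = simulator(4.0)
      prior_draws = rng.uniform(0.0, 10.0, size=20_000)
      accepted = [th for th in prior_draws
                  if distance(simulator(th), obs) < 0.2]
      print(np.mean(accepted), np.std(accepted))  # approx. posterior moments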

  16. Gait Analysis in Rats with Single Joint Inflammation: Influence of Experimental Factors

    PubMed Central

    Ängeby Möller, Kristina; Kinert, Susanne; Størkson, Rolf; Berge, Odd-Geir

    2012-01-01

    Disability and movement-related pain are major symptoms of joint disease, motivating the development of methods to quantify motor behaviour in rodent joint pain models. We used observational scoring and automated methods to compare weight bearing during locomotion and during standing after single joint inflammation induced by Freund's complete adjuvant (0.12–8.0 mg/mL) or carrageenan (0.47–30 mg/mL). Automated gait analysis was based on video capture of prints generated by light projected into the long edge of the floor of a walkway, producing an illuminated image of the contact area of each paw with light intensity reflecting the contact pressure. Weight bearing was calculated as an area-integrated paw pressure, that is, the light intensity of all pixels activated during the contact phase of a paw placement. Automated static weight bearing was measured with the Incapacitance tester. Pharmacological sensitivity of weight-bearing during locomotion was tested in carrageenan-induced monoarthritis by administration of the commonly used analgesics diclofenac, ibuprofen, and naproxen, as well as oxycodone and paracetamol. Observational scoring and automated quantification yielded similar results. We found that the window between control rats and monoarthritic rats was greater during locomotion. The response was more pronounced for inflammation in the ankle as compared to the knee, suggesting a methodological advantage of using this injection site. The effects of both Freund's complete adjuvant and carrageenan were concentration related, but Freund's incomplete adjuvant was found to be as effective as lower, commonly used concentrations of the complete adjuvant. The results show that gait analysis can be an effective method to quantify behavioural effects of single joint inflammation in the rat, sensitive to analgesic treatment. PMID:23071540

  17. Maximum likelihood estimation and EM algorithm of Copas-like selection model for publication bias correction.

    PubMed

    Ning, Jing; Chen, Yong; Piao, Jin

    2017-07-01

    Publication bias occurs when the published research results are systematically unrepresentative of the population of studies that have been conducted, and is a potential threat to meaningful meta-analysis. The Copas selection model provides a flexible framework for correcting estimates and offers considerable insight into the publication bias. However, maximizing the observed likelihood under the Copas selection model is challenging because the observed data contain very little information on the latent variable. In this article, we study a Copas-like selection model and propose an expectation-maximization (EM) algorithm for estimation based on the full likelihood. Empirical simulation studies show that the EM algorithm and its associated inferential procedure perform well and avoid the non-convergence problem when maximizing the observed likelihood. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Interventions for increasing ankle joint dorsiflexion: a systematic review and meta-analysis.

    PubMed

    Young, Rebekah; Nix, Sheree; Wholohan, Aaron; Bradhurst, Rachael; Reed, Lloyd

    2013-11-14

    Ankle joint equinus, or restricted dorsiflexion range of motion (ROM), has been linked to a range of pathologies of relevance to clinical practitioners. This systematic review and meta-analysis investigated the effects of conservative interventions on ankle joint ROM in healthy individuals and athletic populations. Keyword searches of Embase, Medline, Cochrane and CINAHL databases were performed with the final search being run in August 2013. Studies were eligible for inclusion if they assessed the effect of a non-surgical intervention on ankle joint dorsiflexion in healthy populations. Studies were quality rated using a standard quality assessment scale. Standardised mean differences (SMDs) and 95% confidence intervals (CIs) were calculated and results were pooled where study methods were homogenous. Twenty-three studies met eligibility criteria, with a total of 734 study participants. Results suggest that there is some evidence to support the efficacy of static stretching alone (SMDs: range 0.70 to 1.69) and static stretching in combination with ultrasound (SMDs: range 0.91 to 0.95), diathermy (SMD 1.12), diathermy and ice (SMD 1.16), heel raise exercises (SMDs: range 0.70 to 0.77), superficial moist heat (SMDs: range 0.65 to 0.84) and warm up (SMD 0.87) in improving ankle joint dorsiflexion ROM. Some evidence exists to support the efficacy of stretching alone and stretching in combination with other therapies in increasing ankle joint ROM in healthy individuals. There is a paucity of quality evidence to support the efficacy of other non-surgical interventions, thus further research in this area is warranted.
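
    The standardised mean differences pooled in such a review are computed per study from group means and SDs. A minimal sketch of one common SMD variant (Cohen's d with an approximate 95% CI; meta-analyses often use Hedges' g instead, and the ROM numbers below are hypothetical):

      import numpy as np

      def standardized_mean_difference(m1, sd1, n1, m2, sd2, n2):
          """Cohen's d form of the SMD with an approximate 95% CI."""
          sd_pooled = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                              / (n1 + n2 - 2))
          d = (m1 - m2) / sd_pooled
          se = np.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
          return d, (d - 1.96 * se, d + 1.96 * se)

      # Hypothetical dorsiflexion ROM (degrees): stretching vs control group.
      print(standardized_mean_difference(14.0, 4.0, 20, 10.5, 4.5, 20))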

  19. Interventions for increasing ankle joint dorsiflexion: a systematic review and meta-analysis

    PubMed Central

    2013-01-01

    Background Ankle joint equinus, or restricted dorsiflexion range of motion (ROM), has been linked to a range of pathologies of relevance to clinical practitioners. This systematic review and meta-analysis investigated the effects of conservative interventions on ankle joint ROM in healthy individuals and athletic populations. Methods Keyword searches of Embase, Medline, Cochrane and CINAHL databases were performed with the final search being run in August 2013. Studies were eligible for inclusion if they assessed the effect of a non-surgical intervention on ankle joint dorsiflexion in healthy populations. Studies were quality rated using a standard quality assessment scale. Standardised mean differences (SMDs) and 95% confidence intervals (CIs) were calculated and results were pooled where study methods were homogenous. Results Twenty-three studies met eligibility criteria, with a total of 734 study participants. Results suggest that there is some evidence to support the efficacy of static stretching alone (SMDs: range 0.70 to 1.69) and static stretching in combination with ultrasound (SMDs: range 0.91 to 0.95), diathermy (SMD 1.12), diathermy and ice (SMD 1.16), heel raise exercises (SMDs: range 0.70 to 0.77), superficial moist heat (SMDs: range 0.65 to 0.84) and warm up (SMD 0.87) in improving ankle joint dorsiflexion ROM. Conclusions Some evidence exists to support the efficacy of stretching alone and stretching in combination with other therapies in increasing ankle joint ROM in healthy individuals. There is a paucity of quality evidence to support the efficacy of other non-surgical interventions, thus further research in this area is warranted. PMID:24225348

  20. Soldier-relevant body borne loads increase knee joint contact force during a run-to-stop maneuver.

    PubMed

    Ramsay, John W; Hancock, Clifford L; O'Donovan, Meghan P; Brown, Tyler N

    2016-12-08

    The purpose of this study was to understand the effects of load carriage on human performance, specifically during a run-to-stop (RTS) task. Using OpenSim analysis tools, knee joint contact force, ground reaction force, leg stiffness and lower extremity joint angles and moments were determined for nine male military personnel performing a RTS under three load configurations (light, ~6 kg; medium, ~20 kg; and heavy, ~40 kg). Subject-based means for each biomechanical variable were submitted to repeated measures ANOVA to test the effects of load. During the RTS, body borne load significantly increased peak knee joint contact force by 1.2 BW (p<0.001) and peak vertical (p<0.001) and anterior-posterior (p=0.002) ground reaction forces by 0.6 BW and 0.3 BW, respectively. Body borne load also had a significant effect on hip posture (p=0.026) with the medium load and knee posture (p=0.046) with the heavy load. With the heavy load, participants exhibited a substantial, albeit non-significant, increase in leg stiffness (p=0.073, d=0.615). Increases in joint contact force exhibited during the RTS were primarily due to greater GRFs that impact the soldier with each incremental addition of body borne load. The stiff leg, extended knee and large braking force the soldiers exhibited with the heavy load suggest their injury risk may be greatest with that specific load configuration. Further work is needed to determine if the biomechanical profile exhibited with the heavy load configuration translates to unsafe shear forces at the knee joint and, consequently, a higher likelihood of injury. Published by Elsevier Ltd.

  1. Minimizing pre- and post-defibrillation pauses increases the likelihood of return of spontaneous circulation (ROSC).

    PubMed

    Sell, Rebecca E; Sarno, Renee; Lawrence, Brenna; Castillo, Edward M; Fisher, Roger; Brainard, Criss; Dunford, James V; Davis, Daniel P

    2010-07-01

    The three-phase model of ventricular fibrillation (VF) arrest suggests a period of compressions to "prime" the heart prior to defibrillation attempts. In addition, post-shock compressions may increase the likelihood of return of spontaneous circulation (ROSC). The optimal intervals for shock delivery following cessation of compressions (pre-shock interval) and resumption of compressions following a shock (post-shock interval) remain unclear. To define optimal pre- and post-defibrillation compression pauses for out-of-hospital cardiac arrest (OOHCA). All patients suffering OOHCA from VF were identified over a 1-month period. Defibrillator data were abstracted and analyzed using the combination of ECG, impedance, and audio recording. Receiver-operator curve (ROC) analysis was used to define the optimal pre- and post-shock compression intervals. Multiple logistic regression analysis was used to quantify the relationship between these intervals and ROSC. Covariates included cumulative number of defibrillation attempts, intubation status, and administration of epinephrine in the immediate pre-shock compression cycle. Cluster adjustment was performed due to the possibility of multiple defibrillation attempts for each patient. A total of 36 patients with 96 defibrillation attempts were included. The ROC analysis identified an optimal pre-shock interval of <3s and an optimal post-shock interval of <6s. Increased likelihood of ROSC was observed with a pre-shock interval <3s (adjusted OR 6.7, 95% CI 2.0-22.3, p=0.002) and a post-shock interval of <6s (adjusted OR 10.7, 95% CI 2.8-41.4, p=0.001). Likelihood of ROSC was substantially increased with the optimization of both pre- and post-shock intervals (adjusted OR 13.1, 95% CI 3.4-49.9, p<0.001). Decreasing pre- and post-shock compression intervals increases the likelihood of ROSC in OOHCA from VF.

  2. JCDSA: a joint covariate detection tool for survival analysis on tumor expression profiles.

    PubMed

    Wu, Yiming; Liu, Yanan; Wang, Yueming; Shi, Yan; Zhao, Xudong

    2018-05-29

    Survival analysis on tumor expression profiles has always been a key issue for subsequent biological experimental validation. A crucial question is how to select features that closely correspond to survival time; it is equally important to select features that best discriminate between low-risk and high-risk groups of patients. Common features derived from the two aspects may provide candidate variables for the prognosis of cancer. Based on the provided two-step feature selection strategy, we develop a joint covariate detection tool for survival analysis on tumor expression profiles. Significant features, which are not only consistent with survival time but also associated with the categories of patients with different survival risks, are chosen. Using the miRNA expression data (Level 3) of 548 patients with glioblastoma multiforme (GBM) as an example, miRNA candidates for the prognosis of cancer are selected. The reliability of the miRNAs selected using this tool is demonstrated by 100 simulations. Furthermore, it is discovered that significant covariates are not directly composed of individually significant variables. Joint covariate detection provides a viewpoint for selecting variables which are not individually but jointly significant. Besides, it helps to select features which are not only consistent with survival time but also associated with prognosis risk. The software is available at http://bio-nefu.com/resource/jcdsa .

  3. Multimodal Likelihoods in Educational Assessment: Will the Real Maximum Likelihood Score Please Stand up?

    ERIC Educational Resources Information Center

    Wothke, Werner; Burket, George; Chen, Li-Sue; Gao, Furong; Shu, Lianghua; Chia, Mike

    2011-01-01

    It has been known for some time that item response theory (IRT) models may exhibit a likelihood function of a respondent's ability which may have multiple modes, flat modes, or both. These conditions, often associated with guessing of multiple-choice (MC) questions, can introduce uncertainty and bias to ability estimation by maximum likelihood…
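
    The phenomenon is straightforward to reproduce numerically: under a three-parameter logistic (3PL) model with guessing, aberrant response patterns can yield an ability likelihood with several interior modes. This sketch (illustrative item parameters, not taken from the article) scans a grid of ability values and reports every local maximum:

    ```python
    # Sketch: scanning a 3PL ability likelihood for multiple modes.
    import numpy as np

    a = np.array([2.0, 1.8, 2.2, 1.6, 2.0, 1.9])      # discrimination
    b = np.array([-2.0, -1.5, -1.0, 1.0, 1.5, 2.0])   # difficulty
    c = np.array([0.25] * 6)                          # guessing (MC items)
    x = np.array([0, 0, 1, 1, 0, 1])                  # aberrant response pattern

    def loglik(theta):
        p = c + (1 - c) / (1 + np.exp(-a * (theta - b)))  # P(correct) under 3PL
        return np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))

    grid = np.linspace(-4, 4, 801)
    ll = np.array([loglik(t) for t in grid])

    # Interior local maxima: grid points higher than both neighbours.
    modes = [grid[i] for i in range(1, len(grid) - 1)
             if ll[i] > ll[i - 1] and ll[i] > ll[i + 1]]
    print("local maxima of the ability likelihood at theta =", np.round(modes, 2))
    ```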

  4. Joint Stability in Total Knee Arthroplasty: What Is the Target for a Stable Knee?

    PubMed

    Wright, Timothy M

    2017-02-01

    Instability remains a common cause of failure in total knee arthroplasty. Although approaches for surgical treatment of instability exist, the target for initial stability remains elusive, increasing the likelihood that failures will persist because adequate stability is not restored when performing the primary arthroplasty. Although the mechanisms that stabilize the knee joint (contact between the articular surfaces, ligamentous constraints, and muscle forces) are well-defined, their relative importance and the interplay among them throughout functions of daily living are poorly understood. The problem is exacerbated by the complex multiplanar motions that occur across the joint and the large variations in these motions across the population, suggesting that stability targets may need to be patient-specific.

  5. Gaussian copula as a likelihood function for environmental models

    NASA Astrophysics Data System (ADS)

    Wani, O.; Espadas, G.; Cecinati, F.; Rieckermann, J.

    2017-12-01

    Parameter estimation of environmental models always comes with uncertainty. To formally quantify this parametric uncertainty, a likelihood function needs to be formulated, defined as the probability of the observations given fixed values of the parameter set. A likelihood function allows us to infer parameter values from observations using Bayes' theorem. The challenge is to formulate a likelihood function that reliably describes the error-generating processes which lead to the observed monitoring data, such as rainfall and runoff. If the likelihood function is not representative of the error statistics, the parameter inference will give biased parameter values. Several uncertainty estimation methods currently in use employ Gaussian processes as a likelihood function because of their favourable analytical properties. A Box-Cox transformation is suggested to deal with non-symmetric and heteroscedastic errors, e.g. for flow data, which are typically more uncertain in high flows than in periods with low flows. The problem with transformations is that the results are conditional on hyper-parameters, for which it is difficult to formulate the analyst's belief a priori. In an attempt to address this problem, in this research work we suggest learning the nature of the error distribution from the errors made by the model in "past" forecasts. We use a Gaussian copula to generate semiparametric error distributions. (1) We show that this copula can then be used as a likelihood function to infer parameters, breaking away from the practice of using multivariate normal distributions. Based on the results from a didactical example of predicting rainfall runoff, (2) we demonstrate that the copula captures the predictive uncertainty of the model. (3) Finally, we find that the properties of autocorrelation and heteroscedasticity of errors are captured well by the copula, eliminating the need to use transforms. In summary, our findings suggest that copulas are an…
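
    A minimal sketch of the proposed construction, assuming a stored series of past model errors e_hist: learn a semiparametric marginal from the past errors, map them to normal scores, and combine an AR(1) Gaussian copula with the marginal density into a joint log-likelihood:

    ```python
    # Sketch of a Gaussian-copula error likelihood; all data are synthetic.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    e_hist = rng.gamma(2.0, 1.0, 2000) - 2.0          # skewed "past" errors (toy)

    # 1) Semiparametric marginal: KDE density and its numerical CDF.
    kde = stats.gaussian_kde(e_hist)
    xs = np.linspace(e_hist.min() - 1, e_hist.max() + 1, 2048)
    pdf = kde(xs)
    cdf = np.cumsum(pdf); cdf /= cdf[-1]

    def to_normal_scores(e):
        u = np.clip(np.interp(e, xs, cdf), 1e-6, 1 - 1e-6)  # marginal CDF value
        return stats.norm.ppf(u)

    # 2) Dependence: lag-1 correlation of past normal scores -> AR(1) copula.
    z_hist = to_normal_scores(e_hist)
    rho = np.corrcoef(z_hist[:-1], z_hist[1:])[0, 1]

    def copula_loglik(e):
        """Joint log-density of an error series under the Gaussian copula."""
        z = to_normal_scores(e)
        lp = stats.norm.logpdf(z[0])
        # AR(1) copula: z_t | z_{t-1} ~ N(rho * z_{t-1}, 1 - rho^2)
        lp += stats.norm.logpdf(z[1:], rho * z[:-1], np.sqrt(1 - rho**2)).sum()
        # Jacobian of the transform back to the error scale.
        lp += np.sum(np.log(np.interp(e, xs, pdf)) - stats.norm.logpdf(z))
        return lp

    print(copula_loglik(e_hist[:100]))
    ```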

  6. Comparative study of joint analysis of microarray gene expression data in survival prediction and risk assessment of breast cancer patients

    PubMed Central

    2016-01-01

    Abstract Microarray gene expression data sets are jointly analyzed to increase statistical power. They can either be merged together or analyzed by meta-analysis. For a given ensemble of data sets, it cannot be foreseen which of these paradigms, merging or meta-analysis, works better. In this article, three joint analysis methods, Z-score normalization, ComBat, and the inverse normal method (meta-analysis), were selected for survival prognosis and risk assessment of breast cancer patients. The methods were applied to eight microarray gene expression data sets, totaling 1324 patients with two clinical endpoints, overall survival and relapse-free survival. The performance of the joint analysis methods was evaluated using Cox regression for survival analysis, with independent validation used for bias estimation. Overall, Z-score normalization performed better than ComBat and meta-analysis. A higher area under the receiver operating characteristic curve and hazard ratio were also obtained when independent validation was used for bias estimation. With lower time and memory complexity, Z-score normalization is a simple method for joint analysis of microarray gene expression data sets. The derived findings suggest further assessment of this method in future survival prediction and cancer classification applications. PMID:26504096
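
    The Z-score merging paradigm amounts to per-gene standardization within each dataset before concatenating patients; a toy sketch with invented expression matrices:

    ```python
    # Sketch: Z-score merging of two expression datasets (patients x genes).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    cols = [f"g{i}" for i in range(5)]
    d1 = pd.DataFrame(rng.normal(8, 2, (100, 5)), columns=cols)    # batch 1
    d2 = pd.DataFrame(rng.normal(12, 4, (80, 5)), columns=cols)    # batch 2

    def zscore(df):
        return (df - df.mean()) / df.std(ddof=0)   # per-gene standardization

    merged = pd.concat([zscore(d1), zscore(d2)], ignore_index=True)
    # `merged` can now feed a single Cox regression across all 180 patients.
    print(merged.describe().loc[["mean", "std"]].round(2))
    ```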

  7. Objective function analysis for electric soundings (VES), transient electromagnetic soundings (TEM) and joint inversion VES/TEM

    NASA Astrophysics Data System (ADS)

    Bortolozo, Cassiano Antonio; Bokhonok, Oleg; Porsani, Jorge Luís; Monteiro dos Santos, Fernando Acácio; Diogo, Liliana Alcazar; Slob, Evert

    2017-11-01

    Ambiguities in geophysical inversion results are always present. How these ambiguities appear is in most cases open to interpretation. It is interesting to investigate ambiguities with regard to the parameters of the models under study. The Residual Function Dispersion Map (RFDM) can be used to differentiate between global ambiguities and local minima in the objective function. We apply RFDM to Vertical Electrical Sounding (VES) and TEM sounding inversion results. Through topographic analysis of the objective function, we evaluate the advantages and limitations of electrical sounding data compared with TEM sounding data, and the benefits of joint inversion in comparison with the individual methods. The RFDM analysis proved to be a very interesting tool for understanding the joint VES/TEM inversion method. The applicability of RFDM analysis to real data is also explored, demonstrating both how the objective function of real data behaves and how the approach performs in real cases. With the analysis of the results, it is possible to understand how the joint inversion can reduce the ambiguity of the methods.

  8. Radiostereometric analysis of movement in the sacroiliac joint during a single-leg stance in patients with long-lasting pelvic girdle pain.

    PubMed

    Kibsgård, Thomas J; Røise, Olav; Sturesson, Bengt; Röhrl, Stephan M; Stuge, Britt

    2014-04-01

    Chamberlain's projections (anterior-posterior X-rays of the pubic symphysis) have been used to diagnose sacroiliac joint mobility during the single-leg stance test. This study examined the movement in the sacroiliac joint during the single-leg stance test with precise radiostereometric analysis. Under general anesthesia, tantalum markers were inserted into the dorsal sacrum and the ilium of 11 patients with long-lasting and severe pelvic girdle pain. After two to three weeks, a radiostereometric analysis was conducted while the subjects performed a single-leg stance. Small movements were detected in the sacroiliac joint during the single-leg stance. In both the standing- and hanging-leg sacroiliac joint, a total rotation of 0.5 degrees was observed; however, no translations were detected. There were no differences in total movement between the standing- and hanging-leg sacroiliac joint. The movement in the sacroiliac joint during the single-leg stance is small and almost undetectable by precise radiostereometric analysis. A complex movement pattern was seen during the test, with a combination of movements in the two joints. The interpretation of these results is that the Chamberlain examination is likely inadequate for examining sacroiliac joint movement in patients with pelvic girdle pain. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Investigation of 2‐stage meta‐analysis methods for joint longitudinal and time‐to‐event data through simulation and real data application

    PubMed Central

    Tudur Smith, Catrin; Gueyffier, François; Kolamunnage‐Dona, Ruwanthi

    2017-01-01

    Background Joint modelling of longitudinal and time-to-event data is often preferred over separate longitudinal or time-to-event analyses as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time-to-event outcomes. The joint modelling literature focuses mainly on the analysis of single studies, with no methods currently available for the meta-analysis of joint model estimates from multiple studies. Methods We propose a 2-stage method for meta-analysis of joint model estimates. The method is applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta-analyses of separate longitudinal or time-to-event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Results Using the real dataset, similar results were obtained from the separate and joint analyses. However, the simulation study indicated a benefit of joint rather than separate methods in a meta-analytic setting where association exists between the longitudinal and time-to-event outcomes. Conclusions Where evidence of association between longitudinal and time-to-event outcomes exists, results from joint models, rather than from standalone analyses, should be pooled in 2-stage meta-analyses. PMID:29250814
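
    The second stage of such a method reduces to ordinary inverse-variance pooling of the association parameters returned by the per-study joint models; a fixed-effect sketch with invented estimates:

    ```python
    # Stage-2 sketch: fixed-effect pooling of per-study association estimates.
    import numpy as np

    beta = np.array([0.42, 0.35, 0.51, 0.29])   # per-study estimates (invented)
    se   = np.array([0.10, 0.15, 0.12, 0.09])   # their standard errors

    w = 1.0 / se**2                             # inverse-variance weights
    beta_pooled = np.sum(w * beta) / np.sum(w)
    se_pooled = np.sqrt(1.0 / np.sum(w))
    print(f"pooled association: {beta_pooled:.3f} "
          f"(95% CI {beta_pooled - 1.96*se_pooled:.3f} "
          f"to {beta_pooled + 1.96*se_pooled:.3f})")
    ```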

  10. Spectral likelihood expansions for Bayesian inference

    NASA Astrophysics Data System (ADS)

    Nagel, Joseph B.; Sudret, Bruno

    2016-03-01

    A spectral approach to Bayesian inference is presented. It pursues the emulation of the posterior probability density. The starting point is a series expansion of the likelihood function in terms of orthogonal polynomials. From this spectral likelihood expansion all statistical quantities of interest can be calculated semi-analytically. The posterior is formally represented as the product of a reference density and a linear combination of polynomial basis functions. Both the model evidence and the posterior moments are related to the expansion coefficients. This formulation avoids Markov chain Monte Carlo simulation and allows one to make use of linear least squares instead. The pros and cons of spectral Bayesian inference are discussed and demonstrated on the basis of simple applications from classical statistics and inverse modeling.
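
    A one-parameter sketch of the approach, assuming a uniform prior on [-1, 1] and a toy Gaussian likelihood: the Legendre coefficients are fitted by linear least squares, and the evidence and posterior mean then follow from orthogonality (Z = c0 and E[theta] = c1/(3*c0) for this prior):

    ```python
    # Sketch of a spectral likelihood expansion for a scalar parameter.
    import numpy as np
    from numpy.polynomial import legendre

    def likelihood(theta):                      # toy Gaussian likelihood
        return np.exp(-0.5 * ((0.3 - theta) / 0.2) ** 2)

    theta_s = np.random.default_rng(0).uniform(-1, 1, 2000)  # prior samples
    V = legendre.legvander(theta_s, 20)                      # P_0..P_20 at samples
    c, *_ = np.linalg.lstsq(V, likelihood(theta_s), rcond=None)

    # Semi-analytic posterior quantities read off the coefficients:
    print("evidence  ≈", c[0])          # ≈ 0.2*sqrt(2*pi)/2 for this toy case
    print("post mean ≈", c[1] / (3 * c[0]))   # close to 0.3 here
    ```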

  11. Gait analysis and weight bearing in pre-clinical joint pain research.

    PubMed

    Ängeby Möller, Kristina; Svärd, Heta; Suominen, Anni; Immonen, Jarmo; Holappa, Johanna; Stenfors, Carina

    2018-04-15

    There is a need for better joint pain treatment, but development of new medication has not been successful. Pre-clinical models with readouts that better reflect the clinical situation are needed. In patients with joint pain, pain at rest and pain when walking are two major complaints. We describe a new way of calculating results from gait analysis using the CatWalk™ setup. Rats with monoarthritis induced by injection of Complete Freund's Adjuvant (CFA) intra-articularly into the ankle joint of one hind limb were used to assess gait and dynamic weight bearing. The results show that dynamic weight bearing was markedly reduced for the injected paw. Gait parameters such as the number of normal step sequences, walking speed, and duration of step placement were also affected. Treatment with naproxen (an NSAID commonly used for inflammatory pain) attenuated the CFA-induced effects. Pregabalin, which is used for neuropathic pain, had no effect. Reduced dynamic weight bearing during locomotion, assessed and calculated in the way we present here, showed a dose-dependent and lasting normalization after naproxen treatment. In contrast, static weight bearing while standing (Incapacitance tester) showed a significant effect for a limited time only. Mechanical sensitivity (von Frey Optihairs) was completely normalized by naproxen, and the window for testing pharmacological effect disappeared. Objective and reproducible effects, with an endpoint showing face validity compared to pain while walking in patients with joint pain, are achieved by this new way of calculating dynamic weight bearing in monoarthritic rats. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Joint Motion Quality in Chondromalacia Progression Assessed by Vibroacoustic Signal Analysis.

    PubMed

    Bączkowicz, Dawid; Majorczyk, Edyta

    2016-11-01

    Because of the specific biomechanical environment of the patellofemoral joint, chondral disorders, including chondromalacia, are often observed in this articulation. Chondromalacia, via pathologic changes in cartilage, may lead to qualitative impairment of knee joint motion. To determine the patellofemoral joint motion quality in particular chondromalacia stages and to compare with controls. Retrospective, comparative study. Voivodship hospitals; university biomechanical laboratory. A total of 89 knees with chondromalacia (25 with stage I, 30 with stage II, and 34 with stage III) from 50 patients and 64 healthy control knees (from 32 individuals). Vibroacoustic signal pattern analysis of joint motion quality. Vibroacoustic signals were recorded for all knees. Each signal was described by the variation of the mean square, the mean range (R4), and the power spectral density in the 50-250 Hz (P1) and 250-450 Hz (P2) frequency bands. Differences between healthy controls and all chondromalacic knees, as well as between chondromalacia patellae groups, were observed as an increase in the analyzed parameters (P < .001), with only one exception: no statistically significant difference between the control group and stage I chondromalacia patellae was found. All chondromalacia groups were differentiated by all analyzed parameters (P < .01), whose values correspond to the progress of chondromalacia. Chondromalacia generates abnormal vibroacoustic signals, and there seems to be a relationship between the level of signal amplitude and frequency and cartilage destruction from the superficial layer to the subchondral bone. Level of evidence: IV. Copyright © 2016 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
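
    The two band-power parameters can be computed from a recorded signal with Welch's spectral estimate; the sketch below uses a synthetic signal and an assumed 2 kHz sampling rate:

    ```python
    # Sketch: band powers P1 (50-250 Hz) and P2 (250-450 Hz) via Welch's PSD.
    import numpy as np
    from scipy.signal import welch
    from scipy.integrate import trapezoid

    fs = 2000                                    # assumed sampling rate, Hz
    t = np.arange(0, 4, 1 / fs)
    rng = np.random.default_rng(0)
    sig = np.sin(2 * np.pi * 120 * t) + 0.5 * rng.standard_normal(t.size)

    f, psd = welch(sig, fs=fs, nperseg=1024)

    def band_power(f, psd, lo, hi):
        m = (f >= lo) & (f < hi)
        return trapezoid(psd[m], f[m])           # integrate PSD over the band

    print("P1 (50-250 Hz): ", band_power(f, psd, 50, 250))
    print("P2 (250-450 Hz):", band_power(f, psd, 250, 450))
    ```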

  13. Maximum-likelihood estimation of parameterized wavefronts from multifocal data

    PubMed Central

    Sakamoto, Julia A.; Barrett, Harrison H.

    2012-01-01

    A method for determining the pupil phase distribution of an optical system is demonstrated. Coefficients in a wavefront expansion were estimated using likelihood methods, where the data consisted of multiple irradiance patterns near focus. Proof-of-principle results were obtained in both simulation and experiment. Large-aberration wavefronts were handled in the numerical study. Experimentally, we discuss the handling of nuisance parameters. Fisher information matrices, Cramér-Rao bounds, and likelihood surfaces are examined. ML estimates were obtained by simulated annealing to deal with numerous local extrema in the likelihood function. Rapid processing techniques were employed to reduce the computational time. PMID:22772282
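
    The estimation strategy, global minimization of a Gaussian-noise negative log-likelihood by simulated annealing, can be sketched with a toy forward model standing in for the paper's multifocal irradiance model (coefficients and noise level invented):

    ```python
    # Sketch: ML coefficient estimation by simulated annealing on a toy
    # "wavefront" model (cosine-squared of a polynomial phase).
    import numpy as np
    from scipy.optimize import dual_annealing

    rng = np.random.default_rng(0)
    x = np.linspace(-1, 1, 200)
    true_c = np.array([1.5, -2.0, 0.8])              # wavefront coefficients

    def forward(c):
        phase = c[0]*x + c[1]*x**2 + c[2]*x**3       # polynomial wavefront
        return np.cos(phase)**2                      # toy irradiance pattern

    data = forward(true_c) + 0.05 * rng.standard_normal(x.size)

    def neg_loglik(c):                               # Gaussian noise => least squares
        return np.sum((data - forward(c))**2)

    res = dual_annealing(neg_loglik, bounds=[(-5, 5)] * 3, seed=1)
    # cos^2 is even in the phase, so c and -c fit equally well: a sign
    # ambiguity playing the role of a nuisance degree of freedom.
    print("estimate:", np.round(res.x, 2), "| true:", true_c)
    ```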

  14. Speech perception in autism spectrum disorder: An activation likelihood estimation meta-analysis.

    PubMed

    Tryfon, Ana; Foster, Nicholas E V; Sharda, Megha; Hyde, Krista L

    2018-02-15

    Autism spectrum disorder (ASD) is often characterized by atypical language profiles and auditory and speech processing. These can contribute to aberrant language and social communication skills in ASD. The study of the neural basis of speech perception in ASD can serve as a potential neurobiological marker of ASD early on, but mixed results across studies renders it difficult to find a reliable neural characterization of speech processing in ASD. To this aim, the present study examined the functional neural basis of speech perception in ASD versus typical development (TD) using an activation likelihood estimation (ALE) meta-analysis of 18 qualifying studies. The present study included separate analyses for TD and ASD, which allowed us to examine patterns of within-group brain activation as well as both common and distinct patterns of brain activation across the ASD and TD groups. Overall, ASD and TD showed mostly common brain activation of speech processing in bilateral superior temporal gyrus (STG) and left inferior frontal gyrus (IFG). However, the results revealed trends for some distinct activation in the TD group showing additional activation in higher-order brain areas including left superior frontal gyrus (SFG), left medial frontal gyrus (MFG), and right IFG. These results provide a more reliable neural characterization of speech processing in ASD relative to previous single neuroimaging studies and motivate future work to investigate how these brain signatures relate to behavioral measures of speech processing in ASD. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. The failure analysis and lifetime prediction for the solder joint of the magnetic head

    NASA Astrophysics Data System (ADS)

    Xiao, Xianghui; Peng, Minfang; Cardoso, Jaime S.; Tang, Rongjun; Zhou, YingLiang

    2015-02-01

    Micro-solder joint (MSJ) lifetime prediction and failure analysis (FA) assess reliability through a fatigue model supported by theoretical calculations, numerical simulation, and experiment. Because accelerated high-temperature testing shortens the time solder joints spend under realistic conditions, sampling errors that would not be tolerated in production, including round-off error, may appear in the various models. Combining intermetallic compound (IMC) growth theory with FA of the magnetic head in actual production, this paper puts forward a new growth model to predict the life expectancy of the magnetic-head solder joint. The impact of the IMC, which forms by interface reaction between the slider (the magnetic head is usually called the slider) and the bonding pad, on mechanical performance during aging is also analyzed. Based on further failure analysis of the solder-ball bond, the AuSn4 growth model, which affects the mechanical properties of the solder joint least, is chosen to show that the IMC methodology is suitable for forecasting solder lifetime. Under a working condition of 60 °C, the fitted diffusion constant is 0.015354 and the predicted solder lifetime t is 14.46 years.
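
    If the chosen growth model is diffusion-controlled (parabolic), the lifetime prediction reduces to inverting x(t) = x0 + sqrt(D*t) at a critical IMC thickness. Only D = 0.015354 and t ≈ 14.46 years come from the abstract; the critical thickness below is back-calculated from them and purely illustrative:

    ```python
    # Hedged sketch of a parabolic IMC-growth lifetime estimate.
    D = 0.015354        # fitted diffusion constant at 60 °C (units per the paper)
    x0 = 0.0            # initial IMC thickness (assumed)
    x_crit = 0.4712     # assumed critical AuSn4 thickness on the same unit scale

    t_life = (x_crit - x0) ** 2 / D      # invert the parabolic growth law
    print(f"predicted lifetime: {t_life:.2f} years")   # ≈ 14.46 yr here
    ```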

  16. Multivariate normal maximum likelihood with both ordinal and continuous variables, and data missing at random.

    PubMed

    Pritikin, Joshua N; Brick, Timothy R; Neale, Michael C

    2018-04-01

    A novel method for the maximum likelihood estimation of structural equation models (SEM) with both ordinal and continuous indicators is introduced using a flexible multivariate probit model for the ordinal indicators. A full information approach ensures unbiased estimates for data missing at random. Exceeding the capability of prior methods, up to 13 ordinal variables can be included before integration time increases beyond 1 s per row. The method relies on the axiom of conditional probability to split apart the distribution of continuous and ordinal variables. Due to the symmetry of the axiom, two similar methods are available. A simulation study provides evidence that the two similar approaches offer equal accuracy. A further simulation is used to develop a heuristic to automatically select the most computationally efficient approach. Joint ordinal continuous SEM is implemented in OpenMx, free and open-source software.

  17. Clinical Benefits of Joint Mobilization on Ankle Sprains: A Systematic Review and Meta-Analysis.

    PubMed

    Weerasekara, Ishanka; Osmotherly, Peter; Snodgrass, Suzanne; Marquez, Jodie; de Zoete, Rutger; Rivett, Darren A

    2018-07-01

    To assess the clinical benefits of joint mobilization for ankle sprains. MEDLINE, MEDLINE In-Process, Embase, AMED, PsycINFO, CINAHL, Cochrane Library, PEDro, Scopus, SPORTDiscus, and Dissertations and Theses were searched from inception to June 2017. Studies investigating humans with grade I or II lateral or medial sprains of the ankle in any pathologic state from acute to chronic, who had been treated with joint mobilization, were considered for inclusion. Any conservative intervention was considered as a comparator. Commonly reported clinical outcomes were considered, such as ankle range of movement, pain, and function. After screening of 1530 abstracts, 56 studies were selected for full-text screening, and 23 were eligible for inclusion. Eleven studies on chronic sprains reported sufficient data for meta-analysis. Data were extracted using the participants, interventions, comparison, outcomes, and study design approach. Clinically relevant outcomes (dorsiflexion range, proprioception, balance, function, pain threshold, pain intensity) were assessed at immediate, short-term, and long-term follow-up points. Methodological quality was assessed independently by 2 reviewers, and most studies were found to be of moderate quality, with no studies rated as poor. Meta-analysis revealed significant immediate benefits of joint mobilization compared with comparators on improving posteromedial dynamic balance (P=.0004), but not for improving dorsiflexion range (P=.16), static balance (P=.96), or pain intensity (P=.45). Joint mobilization was beneficial in the short-term for improving weight-bearing dorsiflexion range (P=.003) compared with a control. Joint mobilization appears to be beneficial for improving dynamic balance immediately after application, and dorsiflexion range in the short-term. Long-term benefits have not been adequately investigated. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  18. Generalized Likelihood Uncertainty Estimation (GLUE) Using Multi-Optimization Algorithm as Sampling Method

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    2015-12-01

    For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of large-scale, high-precision hydrological simulation has refined spatial descriptions of hydrological behavior. This trend, however, is accompanied by growing model complexity and parameter counts, which bring new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), a Monte Carlo method coupled with Bayesian estimation, has been widely used in uncertainty analysis for hydrological models. However, the stochastic sampling of prior parameters adopted by GLUE is inefficient, especially in high-dimensional parameter spaces. Heuristic optimization algorithms, which evolve a population iteratively, show better convergence speed and optimality-searching performance. In light of these features, this study adopted a genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets of large likelihood. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method tends to be efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
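
    Downstream of whichever sampler supplies the parameter sets, the GLUE step itself is compact: score each set with a likelihood measure (Nash-Sutcliffe here), retain the behavioral ones, and form likelihood-weighted prediction bounds. A toy sketch with a synthetic linear-reservoir model:

    ```python
    # Sketch of the GLUE acceptance/weighting step; all data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    rain = rng.gamma(2.0, 2.0, 100)

    def model(k):                                    # toy linear-reservoir response
        q = np.zeros_like(rain)
        for t in range(1, rain.size):
            q[t] = q[t-1] + (rain[t] - q[t-1]) / k
        return q

    obs = model(5.0) + rng.normal(0, 0.3, rain.size) # synthetic observations

    ks = rng.uniform(1, 20, 2000)                    # prior parameter samples
    sims = np.array([model(k) for k in ks])
    nse = 1 - ((sims - obs)**2).sum(1) / ((obs - obs.mean())**2).sum()

    keep = nse > 0.5                                 # behavioral threshold
    w = nse[keep]                                    # likelihood weights

    def wquantile(v, w, q):                          # weighted quantile helper
        i = np.argsort(v)
        cw = np.cumsum(w[i]) / w.sum()
        return v[i][np.searchsorted(cw, q)]

    band = np.array([[wquantile(sims[keep][:, t], w, q) for q in (0.05, 0.95)]
                     for t in range(obs.size)])
    print("behavioral sets:", keep.sum(),
          "| 5-95% band coverage:",
          np.mean((obs >= band[:, 0]) & (obs <= band[:, 1])))
    ```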

  19. Design, analysis and verification of a knee joint oncological prosthesis finite element model.

    PubMed

    Zach, Lukáš; Kunčická, Lenka; Růžička, Pavel; Kocich, Radim

    2014-11-01

    The aim of this paper was to design a finite element model of a hinged PROSPON oncological knee endoprosthesis and to verify the model by comparing ankle flexion angles with knee-bending experimental data obtained previously. Visible Human Project CT scans were used to create a general lower extremity bone model and to compose a 3D CAD knee joint model, to which muscles and ligaments were added. The designed PROSPON prosthesis finite element model was integrated into this assembly, and an analysis focused on the stress state of the PEEK-OPTIMA hinge pin bushing was carried out. To confirm the stress state analysis results, the contact pressure was investigated. The analysis was performed in the knee-bending position within a 15.4-69.4° hip joint flexion range. The results showed that the maximum stress achieved during the analysis (46.6 MPa) did not exceed the yield strength of the material (90 MPa); the condition of plastic stability was therefore met. The stress state analysis results were confirmed by the distribution of contact pressure during knee-bending. The applicability of the designed finite element model for predicting real implant behaviour was proven on the basis of the good correlation between the analytical and experimental ankle flexion angle data. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. MXLKID: a maximum likelihood parameter identifier. [In LRLTRAN for CDC 7600

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gavel, D.T.

    MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC 7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables.

  1. Maximum likelihood clustering with dependent feature trees

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B. (Principal Investigator)

    1981-01-01

    The decomposition of the mixture density of the data into its normal component densities is considered. The densities are approximated with first-order dependent feature trees using criteria of mutual information and distance measures. Expressions are presented for the criteria when the densities are Gaussian. By defining different types of nodes in a general dependent feature tree, maximum likelihood equations are developed for the estimation of parameters using fixed point iterations. The field structure of the data is also taken into account in developing maximum likelihood equations. Experimental results from the processing of remotely sensed multispectral scanner imagery data are included.

  2. Nonlinear Analysis of Bonded Composite Single-LAP Joints

    NASA Technical Reports Server (NTRS)

    Oterkus, E.; Barut, A.; Madenci, E.; Smeltzer, S. S.; Ambur, D. R.

    2004-01-01

    This study presents a semi-analytical solution method to analyze the geometrically nonlinear response of bonded composite single-lap joints with tapered adherend edges under uniaxial tension. The solution method provides the transverse shear and normal stresses in the adhesive and the in-plane stress resultants and bending moments in the adherends. The method utilizes the principle of virtual work in conjunction with von Karman's nonlinear plate theory to model the adherends and the shear lag model to represent the kinematics of the thin adhesive layer between the adherends. Furthermore, the method accounts for the bilinear elastic material behavior of the adhesive while maintaining a linear stress-strain relationship in the adherends. In order to account for the stiffness changes due to thickness variation of the adherends along the tapered edges, their in-plane and bending stiffness matrices are varied as a function of thickness along the tapered region. The combination of these complexities results in a system of nonlinear governing equilibrium equations. This approach represents a computationally efficient alternative to the finite element method. Comparisons are made with corresponding results obtained from finite-element analysis. The results confirm the validity of the solution method. The numerical results present the effects of taper angle, adherend overlap length, and the bilinear adhesive material on the stress fields in the adherends, as well as the adhesive, of a single-lap joint.

  3. Likelihood of Suicidality at Varying Levels of Depression Severity: A Re-Analysis of NESARC Data

    ERIC Educational Resources Information Center

    Uebelacker, Lisa A.; Strong, David; Weinstock, Lauren M.; Miller, Ivan W.

    2010-01-01

    Although it is clear that increasing depression severity is associated with more risk for suicidality, less is known about at what levels of depression severity the risk for different suicide symptoms increases. We used item response theory to estimate the likelihood of endorsing suicide symptoms across levels of depression severity in an…

  4. Multiple Cognitive Control Effects of Error Likelihood and Conflict

    PubMed Central

    Brown, Joshua W.

    2010-01-01

    Recent work on cognitive control has suggested a variety of performance monitoring functions of the anterior cingulate cortex, such as errors, conflict, error likelihood, and others. Given the variety of monitoring effects, a corresponding variety of control effects on behavior might be expected. This paper explores whether conflict and error likelihood produce distinct cognitive control effects on behavior, as measured by response time. A change signal task (Brown & Braver, 2005) was modified to include conditions of likely errors due to tardy as well as premature responses, in conditions with and without conflict. The results discriminate between competing hypotheses of independent vs. interacting conflict and error likelihood control effects. Specifically, the results suggest that the likelihood of premature vs. tardy response errors can lead to multiple distinct control effects, which are independent of cognitive control effects driven by response conflict. As a whole, the results point to the existence of multiple distinct cognitive control mechanisms and challenge existing models of cognitive control that incorporate only a single control signal. PMID:19030873

  5. Joint Analysis of X-Ray and Sunyaev-Zel'Dovich Observations of Galaxy Clusters Using an Analytic Model of the Intracluster Medium

    NASA Technical Reports Server (NTRS)

    Hasler, Nicole; Bulbul, Esra; Bonamente, Massimiliano; Carlstrom, John E.; Culverhouse, Thomas L.; Gralla, Megan; Greer, Christopher; Lamb, James W.; Hawkins, David; Hennessy, Ryan

    2012-01-01

    We perform a joint analysis of X-ray and Sunyaev-Zel'dovich effect data using an analytic model that describes the gas properties of galaxy clusters. The joint analysis allows the measurement of the cluster gas mass fraction profile and Hubble constant independent of cosmological parameters. Weak cosmological priors are used to calculate the overdensity radius within which the gas mass fractions are reported. Such an analysis can provide direct constraints on the evolution of the cluster gas mass fraction with redshift. We validate the model and the joint analysis on high signal-to-noise data from the Chandra X-ray Observatory and the Sunyaev-Zel'dovich Array for two clusters, A2631 and A2204.

  6. Joint models for longitudinal and time-to-event data: a review of reporting quality with a view to meta-analysis.

    PubMed

    Sudell, Maria; Kolamunnage-Dona, Ruwanthi; Tudur-Smith, Catrin

    2016-12-05

    Joint models for longitudinal and time-to-event data are commonly used to simultaneously analyse correlated data in single study cases. Synthesis of evidence from multiple studies using meta-analysis is a natural next step but its feasibility depends heavily on the standard of reporting of joint models in the medical literature. During this review we aim to assess the current standard of reporting of joint models applied in the literature, and to determine whether current reporting standards would allow or hinder future aggregate data meta-analyses of model results. We undertook a literature review of non-methodological studies that involved joint modelling of longitudinal and time-to-event medical data. Study characteristics were extracted and an assessment of whether separate meta-analyses for longitudinal, time-to-event and association parameters were possible was made. The 65 studies identified used a wide range of joint modelling methods in a selection of software. Identified studies concerned a variety of disease areas. The majority of studies reported adequate information to conduct a meta-analysis (67.7% for longitudinal parameter aggregate data meta-analysis, 69.2% for time-to-event parameter aggregate data meta-analysis, 76.9% for association parameter aggregate data meta-analysis). In some cases model structure was difficult to ascertain from the published reports. Whilst extraction of sufficient information to permit meta-analyses was possible in a majority of cases, the standard of reporting of joint models should be maintained and improved. Recommendations for future practice include clear statement of model structure, of values of estimated parameters, of software used and of statistical methods applied.

  7. A stochastic Iwan-type model for joint behavior variability modeling

    NASA Astrophysics Data System (ADS)

    Mignolet, Marc P.; Song, Pengchao; Wang, X. Q.

    2015-08-01

    This paper focuses on the development and validation of a stochastic model to describe the dissipation and stiffness properties of a bolted joint for which experimental data are available and exhibit a large scatter. An extension of the deterministic parallel-series Iwan model for the characterization of the force-displacement behavior of joints is first carried out. This new model involves dynamic and static coefficients of friction differing from each other and a broadly defined distribution of Jenkins elements. Its applicability is next investigated using the experimental data, i.e. stiffness and dissipation measurements obtained in harmonic testing of 9 nominally identical bolted joints. The model is found to provide a very good fit of the experimental data for each bolted joint notwithstanding the significant variability of their behavior. This finding suggests that this variability can be simulated through the randomization of only the parameters of the proposed Iwan-type model. The distribution of these parameters is next selected based on maximum entropy concepts, and their corresponding parameters, i.e. the hyperparameters of the model, are identified using a maximum likelihood strategy. A Monte Carlo simulation of this stochastic Iwan model demonstrates that the experimental data fit well within the uncertainty band corresponding to the 5th and 95th percentiles of the model predictions, which supports the adequacy of the modeling effort.

  8. Numerical analysis of dynamic behavior of pre-stressed shape memory alloy concrete beam-column joints

    NASA Astrophysics Data System (ADS)

    Yan, S.; Xiao, Z. F.; Lin, M. Y.; Niu, J.

    2018-04-01

    Beam-column joints are important parts of a main frame structure. The mechanical properties of beam-column joints have a great influence on the dynamic performance of the frame structure. Shape memory alloy (SMA), a new type of intelligent metal material, has wide applications in civil engineering. This paper proposes a novel beam-column joint reinforced with pre-stressed SMA tendons to increase its dynamic performance. Based on the finite element analysis (FEA) software ABAQUS, a numerical simulation of 6 scaled beam-column models considering different SMA reinforcement ratios and pre-stress levels was performed, focusing on bearing capacity, energy dissipation, and self-centering capacity. These models were numerically tested under a pseudo-static load at the beam end, accompanied by a constant vertical compressive load on the top of the column. The numerical results show that the proposed SMA-reinforced joint has a significantly increased bearing capacity and a good self-centering capability after unloading, even though the energy-dissipation capacity becomes smaller due to the smaller residual deformation. The concept and mechanism of the novel joint can serve as an important reference for civil engineering applications.

  9. Factors associated with the likelihood of Giardia spp. and Cryptosporidium spp. in soil from dairy farms.

    PubMed

    Barwick, R S; Mohammed, H O; White, M E; Bryant, R B

    2003-03-01

    A study was conducted to identify factors associated with the likelihood of detecting Giardia spp. and Cryptosporidium spp. in the soil of dairy farms in a watershed area. A total of 37 farms were visited, and 782 soil samples were collected from targeted areas on these farms. The samples were analyzed for the presence of Cryptosporidium spp. oocysts, Giardia spp. cysts, percent moisture content, and pH. Logistic regression analysis was used to identify risk factors associated with the likelihood of the presence of these organisms. The use of the land at the sampling site was associated with the likelihood of environmental contamination with Cryptosporidium spp.: barn cleaner equipment areas and agricultural fields were associated with an increased likelihood of contamination. The risk of environmental contamination decreased with the pH of the soil and with the score of the potential likelihood of Cryptosporidium spp. The size of the sampling site, as determined by the sampling design (in square feet), was associated nonlinearly with the risk of detecting Cryptosporidium spp. The likelihood of Giardia cysts in the soil increased with the prevalence of Giardia spp. in animals (i.e., 18 to 39%). As the size of the farm increased, there was a decreased risk of Giardia spp. in the soil, and sampling sites covered with brush or bare soil showed a decreased likelihood of detecting Giardia spp. compared to land with managed grass. The number of cattle on the farm less than 6 mo of age was negatively associated with the risk of detecting Giardia spp. in the soil, and the percent moisture content was positively associated with the risk of detecting Giardia spp. Our study showed that these two protozoa occur in dairy farm soil at different rates, and this risk could be modified by manipulating the pH of the soil.

  10. Improving and Evaluating Nested Sampling Algorithm for Marginal Likelihood Estimation

    NASA Astrophysics Data System (ADS)

    Ye, M.; Zeng, X.; Wu, J.; Wang, D.; Liu, J.

    2016-12-01

    With the growing impacts of climate change and human activities on the water cycle, an increasing amount of research focuses on the quantification of modeling uncertainty. Bayesian model averaging (BMA) provides a popular framework for quantifying conceptual model and parameter uncertainty. The ensemble prediction is generated by combining each plausible model's prediction, and each model carries a weight determined by its prior weight and marginal likelihood. Thus, the estimation of a model's marginal likelihood is crucial for reliable and accurate BMA prediction. The nested sampling estimator (NSE) is a newly proposed method for marginal likelihood estimation. NSE searches the parameter space gradually from regions of low likelihood to regions of high likelihood, an evolution carried out iteratively via a local sampling procedure. The efficiency of NSE is therefore dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm is often used for local sampling. However, M-H is not an efficient sampling algorithm for high-dimensional or complicated parameter spaces. To improve the efficiency of NSE, it is attractive to incorporate the robust and efficient DREAMzs sampling algorithm into the local sampling step. The comparison results demonstrated that the improved NSE raises the efficiency of marginal likelihood estimation significantly. However, both the improved and the original NSE suffer from heavy instability. In addition, the heavy computational cost of the huge number of model executions is overcome by using adaptive sparse grid surrogates.
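
    A minimal nested sampling loop for a 1-D toy problem (uniform prior, Gaussian likelihood) illustrates the estimator; the naive rejection step that replaces the worst live point is exactly where an efficient local sampler such as M-H or DREAMzs enters:

    ```python
    # Sketch: nested sampling of the marginal likelihood, toy 1-D problem with
    # a uniform prior on [-5, 5] and a N(0, 0.5) likelihood (analytic Z = 1/10).
    import numpy as np

    rng = np.random.default_rng(0)

    def loglik(x):
        return -0.5 * (x / 0.5)**2 - 0.5 * np.log(2 * np.pi * 0.25)

    N, iters = 100, 600
    live = rng.uniform(-5, 5, N)                 # live points from the prior
    ll = np.array([loglik(x) for x in live])
    logZ, logX = -np.inf, 0.0                    # evidence, log prior volume

    for i in range(iters):
        worst = np.argmin(ll)
        logX_new = -(i + 1) / N                  # expected prior-volume shrinkage
        logw = np.log(np.exp(logX) - np.exp(logX_new))
        logZ = np.logaddexp(logZ, logw + ll[worst])   # add the discarded shell
        logX = logX_new
        while True:                              # replacement above the floor
            cand = rng.uniform(-5, 5)
            if loglik(cand) > ll[worst]:
                live[worst], ll[worst] = cand, loglik(cand)
                break

    # Remaining live-point contribution, then compare with the analytic value.
    rem = logX + np.log(np.mean(np.exp(ll - ll.max()))) + ll.max()
    logZ = np.logaddexp(logZ, rem)
    print(f"log evidence: {logZ:.3f}  (analytic: {np.log(1/10):.3f})")
    ```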

  11. Dissociating response conflict and error likelihood in anterior cingulate cortex.

    PubMed

    Yeung, Nick; Nieuwenhuis, Sander

    2009-11-18

    Neuroimaging studies consistently report activity in anterior cingulate cortex (ACC) in conditions of high cognitive demand, leading to the view that ACC plays a crucial role in the control of cognitive processes. According to one prominent theory, the sensitivity of ACC to task difficulty reflects its role in monitoring for the occurrence of competition, or "conflict," between responses to signal the need for increased cognitive control. However, a contrasting theory proposes that ACC is the recipient rather than source of monitoring signals, and that ACC activity observed in relation to task demand reflects the role of this region in learning about the likelihood of errors. Response conflict and error likelihood are typically confounded, making the theories difficult to distinguish empirically. The present research therefore used detailed computational simulations to derive contrasting predictions regarding ACC activity and error rate as a function of response speed. The simulations demonstrated a clear dissociation between conflict and error likelihood: fast response trials are associated with low conflict but high error likelihood, whereas slow response trials show the opposite pattern. Using the N2 component as an index of ACC activity, an EEG study demonstrated that when conflict and error likelihood are dissociated in this way, ACC activity tracks conflict and is negatively correlated with error likelihood. These findings support the conflict-monitoring theory and suggest that, in speeded decision tasks, ACC activity reflects current task demands rather than the retrospective coding of past performance.

  12. Exclusion probabilities and likelihood ratios with applications to kinship problems.

    PubMed

    Slooten, Klaas-Jan; Egeland, Thore

    2014-05-01

    In forensic genetics, DNA profiles are compared in order to make inferences, paternity cases being a standard example. The statistical evidence can be summarized and reported in several ways. For example, in a paternity case, the likelihood ratio (LR) and the probability of not excluding a random man as father (RMNE) are two common summary statistics. There has been a long debate on the merits of the two statistics, also in the context of DNA mixture interpretation, and no general consensus has been reached. In this paper, we show that the RMNE is a certain weighted average of inverse likelihood ratios. This is true in any forensic context. We show that the likelihood ratio in favor of the correct hypothesis is, in expectation, bigger than the reciprocal of the RMNE probability. However, with the exception of pathological cases, it is also possible to obtain smaller likelihood ratios. We illustrate this result for paternity cases. Moreover, some theoretical properties of the likelihood ratio for a large class of general pairwise kinship cases, including expected value and variance, are derived. The practical implications of the findings are discussed and exemplified.

  13. Incorrect likelihood methods were used to infer scaling laws of marine predator search behaviour.

    PubMed

    Edwards, Andrew M; Freeman, Mervyn P; Breed, Greg A; Jonsen, Ian D

    2012-01-01

    Ecologists are collecting extensive data concerning movements of animals in marine ecosystems. Such data need to be analysed with valid statistical methods to yield meaningful conclusions. We demonstrate methodological issues in two recent studies that reached similar conclusions concerning movements of marine animals (Nature 451:1098; Science 332:1551). The first study analysed vertical movement data to conclude that diverse marine predators (Atlantic cod, basking sharks, bigeye tuna, leatherback turtles and Magellanic penguins) exhibited "Lévy-walk-like behaviour", close to a hypothesised optimal foraging strategy. By reproducing the original results for the bigeye tuna data, we show that the likelihood of tested models was calculated from residuals of regression fits (an incorrect method), rather than from the likelihood equations of the actual probability distributions being tested. This resulted in erroneous Akaike Information Criteria, and the testing of models that do not correspond to valid probability distributions. We demonstrate how this led to overwhelming support for a model that has no biological justification and that is statistically spurious because its probability density function goes negative. Re-analysis of the bigeye tuna data, using standard likelihood methods, overturns the original result and conclusion for that data set. The second study observed Lévy walk movement patterns by mussels. We demonstrate several issues concerning the likelihood calculations (including the aforementioned residuals issue). Re-analysis of the data rejects the original Lévy walk conclusion. We consequently question the claimed existence of scaling laws of the search behaviour of marine predators and mussels, since such conclusions were reached using incorrect methods. We discourage the suggested potential use of "Lévy-like walks" when modelling consequences of fishing and climate change, and caution that any resulting advice to managers of marine ecosystems…
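
    The corrected procedure the authors advocate, maximizing the actual log-likelihood of each candidate distribution and comparing information criteria, looks like this on simulated step lengths (analytic MLEs for an exponential and a Pareto tail above a cutoff xmin; all data synthetic):

    ```python
    # Sketch: model comparison via true log-likelihoods and AIC, not regression
    # on histograms. Step-length data are simulated from an exponential tail.
    import numpy as np

    rng = np.random.default_rng(0)
    xmin = 1.0
    x = xmin + rng.exponential(2.0, 1000)        # toy moves, truncated at xmin

    # Exponential above xmin: f(x) = lam * exp(-lam*(x - xmin)); MLE below.
    lam = 1.0 / np.mean(x - xmin)
    ll_exp = np.sum(np.log(lam) - lam * (x - xmin))

    # Pareto (power law) above xmin: f(x) = ((mu-1)/xmin) * (x/xmin)**(-mu).
    mu = 1.0 + len(x) / np.sum(np.log(x / xmin))
    ll_pow = np.sum(np.log((mu - 1) / xmin) - mu * np.log(x / xmin))

    for name, ll in [("exponential", ll_exp), ("power law", ll_pow)]:
        print(f"{name:12s} logL = {ll:9.1f}  AIC = {2 * 1 - 2 * ll:9.1f}")
    ```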

  14. From reads to genes to pathways: differential expression analysis of RNA-Seq experiments using Rsubread and the edgeR quasi-likelihood pipeline.

    PubMed

    Chen, Yunshun; Lun, Aaron T L; Smyth, Gordon K

    2016-01-01

    In recent years, RNA sequencing (RNA-seq) has become a very widely used technology for profiling gene expression. One of the most common aims of RNA-seq profiling is to identify genes or molecular pathways that are differentially expressed (DE) between two or more biological conditions. This article demonstrates a computational workflow for the detection of DE genes and pathways from RNA-seq data by providing a complete analysis of an RNA-seq experiment profiling epithelial cell subsets in the mouse mammary gland. The workflow uses R software packages from the open-source Bioconductor project and covers all steps of the analysis pipeline, including alignment of read sequences, data exploration, differential expression analysis, visualization and pathway analysis. Read alignment and count quantification is conducted using the Rsubread package and the statistical analyses are performed using the edgeR package. The differential expression analysis uses the quasi-likelihood functionality of edgeR.

  15. MEGA5: Molecular Evolutionary Genetics Analysis Using Maximum Likelihood, Evolutionary Distance, and Maximum Parsimony Methods

    PubMed Central

    Tamura, Koichiro; Peterson, Daniel; Peterson, Nicholas; Stecher, Glen; Nei, Masatoshi; Kumar, Sudhir

    2011-01-01

    Comparative analysis of molecular sequence data is essential for reconstructing the evolutionary histories of species and inferring the nature and extent of selective forces shaping the evolution of genes and species. Here, we announce the release of Molecular Evolutionary Genetics Analysis version 5 (MEGA5), which is a user-friendly software for mining online databases, building sequence alignments and phylogenetic trees, and using methods of evolutionary bioinformatics in basic biology, biomedicine, and evolution. The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models (nucleotide or amino acid), inferring ancestral states and sequences (along with probabilities), and estimating evolutionary rates site-by-site. In computer simulation analyses, ML tree inference algorithms in MEGA5 compared favorably with other software packages in terms of computational efficiency and the accuracy of the estimates of phylogenetic trees, substitution parameters, and rate variation among sites. The MEGA user interface has now been enhanced to be activity driven to make it easier for the use of both beginners and experienced scientists. This version of MEGA is intended for the Windows platform, and it has been configured for effective use on Mac OS X and Linux desktops. It is available free of charge from http://www.megasoftware.net. PMID:21546353

  16. Joint principal trend analysis for longitudinal high-dimensional data.

    PubMed

    Zhang, Yuping; Ouyang, Zhengqing

    2018-06-01

    We consider a research scenario motivated by integrating multiple sources of information for better knowledge discovery in diverse dynamic biological processes. Given two longitudinal high-dimensional datasets for a group of subjects, we want to extract shared latent trends and identify relevant features. To solve this problem, we present a new statistical method named as joint principal trend analysis (JPTA). We demonstrate the utility of JPTA through simulations and applications to gene expression data of the mammalian cell cycle and longitudinal transcriptional profiling data in response to influenza viral infections. © 2017, The International Biometric Society.

  17. Modeling Progressive Failure of Bonded Joints Using a Single Joint Finite Element

    NASA Technical Reports Server (NTRS)

    Stapleton, Scott E.; Waas, Anthony M.; Bednarcyk, Brett A.

    2010-01-01

    Enhanced finite elements are elements with an embedded analytical solution that can capture detailed local fields, enabling more efficient, mesh-independent finite element analysis. In the present study, an enhanced finite element is applied to generate a general framework capable of modeling an array of joint types. The joint field equations are derived using the principle of minimum potential energy, and the resulting solutions for the displacement fields are used to generate shape functions and a stiffness matrix for a single joint finite element. This single finite element thus captures the detailed stress and strain fields within the bonded joint, but it can function within a broader structural finite element model. The costs associated with a fine mesh of the joint can thus be avoided while still obtaining a detailed solution for the joint. Additionally, the capability to model non-linear adhesive constitutive behavior has been included within the method, and progressive failure of the adhesive can be modeled by using a strain-based failure criterion and re-sizing the joint as the adhesive fails. Results of the model compare favorably with experimental and finite element results.

  18. Influence of Joint Flexibility on Vibration Analysis of Free-Free Beams

    NASA Astrophysics Data System (ADS)

    Gunda, Jagadish Babu; Krishna, Y.

    2014-12-01

    In the present work, the joint flexibility (or looseness) of a free-free beam is investigated using a two-noded beam finite element formulation with the transverse displacement and joint rotations as the degrees of freedom per node at the joint location. The flexibility of the joint is primarily represented by means of a rotational spring analogy, where the stiffness of the rotational spring characterizes the looseness of the flexible joint for an applied bending moment. The influence of joint location as well as joint stiffness on the modal behavior of the first five modes of slender, uniform free-free beams is discussed for various values of a non-dimensional rotational spring stiffness parameter. The numerical accuracy of the results obtained from the present finite element formulation is validated against commercially available finite element software, which shows the confidence gained in the numerical results discussed in the present study.

  19. Nonlinear Analysis of Bonded Composite Tubular Lap Joints

    NASA Technical Reports Server (NTRS)

    Oterkus, E.; Madenci, E.; Smeltzer, S. S., III; Ambur, D. R.

    2005-01-01

    The present study describes a semi-analytical solution method for predicting the geometrically nonlinear response of a bonded composite tubular single-lap joint subjected to general loading conditions. The transverse shear and normal stresses in the adhesive as well as membrane stress resultants and bending moments in the adherends are determined using this method. The method utilizes the principle of virtual work in conjunction with nonlinear thin-shell theory to model the adherends and a cylindrical shear lag model to represent the kinematics of the thin adhesive layer between the adherends. The kinematic boundary conditions are imposed by employing the Lagrange multiplier method. In the solution procedure, the displacement components for the tubular joint are approximated in terms of non-periodic and periodic B-Spline functions in the longitudinal and circumferential directions, respectively. The approach presented herein represents a rapid-solution alternative to the finite element method. The solution method was validated by comparison against a previously considered tubular single-lap joint. The steep variation of both peeling and shearing stresses near the adhesive edges was successfully captured. The applicability of the present method was also demonstrated by considering tubular bonded lap-joints subjected to pure bending and torsion.

  20. Workplace air measurements and likelihood of exposure to manufactured nano-objects, agglomerates, and aggregates

    NASA Astrophysics Data System (ADS)

    Brouwer, Derk H.; van Duuren-Stuurman, Birgit; Berges, Markus; Bard, Delphine; Jankowska, Elzbieta; Moehlmann, Carsten; Pelzer, Johannes; Mark, Dave

    2013-11-01

    Manufactured nano-objects, agglomerates, and aggregates (NOAA) may have adverse effects on human health, but little is known about occupational risks since actual estimates of exposure are lacking. In a large-scale workplace air-monitoring campaign, 19 enterprises were visited and 120 potential exposure scenarios were measured. A multi-metric exposure assessment approach was followed, and a decision logic was developed to afford analysis of all results in concert. The overall evaluation was classified by categories of likelihood of exposure. At the task level, about 53% of scenarios showed increased particle number or surface area concentration compared to the "background" level, whereas 72% of the TEM samples revealed an indication that NOAA were present in the workplace. For 54 of the 120 task-based exposure scenarios, an overall evaluation could be made based on all parameters of the decision logic. For only 1 exposure scenario (approximately 2%) was the highest level of potential likelihood assigned, whereas in total, 56% of the exposure scenarios received the lowest level of likelihood in the overall evaluation. However, for the remaining 42%, exposure to NOAA could not be excluded.

  1. A joint analysis of the Drake equation and the Fermi paradox

    NASA Astrophysics Data System (ADS)

    Prantzos, Nikos

    2013-07-01

    I propose a unified framework for a joint analysis of the Drake equation and the Fermi paradox, which enables a simultaneous, quantitative study of both of them. The analysis is based on a simplified form of the Drake equation and on a fairly simple scheme for the colonization of the Milky Way. It appears that for sufficiently long-lived civilizations, colonization of the Galaxy is the only reasonable option to gain knowledge about other life forms. This argument allows one to define a region in the parameter space of the Drake equation, where the Fermi paradox definitely holds (`Strong Fermi paradox').

  2. On non-parametric maximum likelihood estimation of the bivariate survivor function.

    PubMed

    Prentice, R L

    The likelihood function for the bivariate survivor function F, under independent censorship, is maximized to obtain a non-parametric maximum likelihood estimator F̂. F̂ may or may not be unique depending on the configuration of singly- and doubly-censored pairs. The likelihood function can be maximized by placing all mass on the grid formed by the uncensored failure times, or half lines beyond the failure time grid, or in the upper right quadrant beyond the grid. By accumulating the mass along lines (or regions) where the likelihood is flat, one obtains a partially maximized likelihood as a function of parameters that can be uniquely estimated. The score equations corresponding to these point mass parameters are derived, using a Lagrange multiplier technique to ensure unit total mass, and a modified Newton procedure is used to calculate the parameter estimates in some limited simulation studies. Some considerations for the further development of non-parametric bivariate survivor function estimators are briefly described.

  3. Educational differences in likelihood of attributing breast symptoms to cancer: a vignette-based study.

    PubMed

    Marcu, Afrodita; Lyratzopoulos, Georgios; Black, Georgia; Vedsted, Peter; Whitaker, Katriina L

    2016-10-01

    Stage at diagnosis of breast cancer varies by socio-economic status (SES), with lower SES associated with poorer survival. We investigated associations between SES (indexed by education), and the likelihood of attributing breast symptoms to breast cancer. We conducted an online survey with 961 women (47-92 years) with variable educational levels. Two vignettes depicted familiar and unfamiliar breast changes (axillary lump and nipple rash). Without making breast cancer explicit, women were asked 'What do you think this […..] could be?' After the attribution question, women were asked to indicate their level of agreement with a cancer avoidance statement ('I would not want to know if I have breast cancer'). Women were more likely to mention cancer as a possible cause of an axillary lump (64%) compared with nipple rash (30%). In multivariable analysis, low and mid education were independently associated with being less likely to attribute a nipple rash to cancer (OR 0.51, 0.36-0.73 and OR 0.55, 0.40-0.77, respectively). For axillary lump, low education was associated with lower likelihood of mentioning cancer as a possible cause (OR 0.58, 0.41-0.83). Although cancer avoidance was also associated with lower education, the association between education and lower likelihood of making a cancer attribution was independent. Lower education was associated with lower likelihood of making cancer attributions for both symptoms, also after adjustment for cancer avoidance. Lower likelihood of considering cancer may delay symptomatic presentation and contribute to educational differences in stage at diagnosis. Copyright © 2016 John Wiley & Sons, Ltd.

  4. Strength evaluation of socket joints

    NASA Technical Reports Server (NTRS)

    Rash, Larry C.

    1994-01-01

    This report documents the development of a set of equations that provide a relatively simple solution for determining the strength of socket joints and, in most cases, avoid the need for lengthier analyses. The analytical approach was verified by comparing the contact load distributions to results obtained from a finite element analysis. The contacting surfaces for the specific joint in this analysis are in the shape of frustums of a cone and are representative of the tapered surfaces in the socket-type joints used to join segments of model support systems for wind tunnels. The results are in the form of equations that can be used to determine the contact loads and stresses in the joint from the given geometry and externally applied loads. Equations were determined to define the bending moments and stresses along the length of the joints based on strength-of-materials principles. The results have also been programmed for a personal computer, and a copy of the program is included.

  5. Handling Missing Data With Multilevel Structural Equation Modeling and Full Information Maximum Likelihood Techniques.

    PubMed

    Schminkey, Donna L; von Oertzen, Timo; Bullock, Linda

    2016-08-01

    With increasing access to population-based data and electronic health records for secondary analysis, missing data are common. In the social and behavioral sciences, missing data frequently are handled with multiple imputation methods or full information maximum likelihood (FIML) techniques, but healthcare researchers have not embraced these methodologies to the same extent and more often use either traditional imputation techniques or complete case analysis, which can compromise power and introduce unintended bias. This article is a review of options for handling missing data, concluding with a case study demonstrating the utility of multilevel structural equation modeling using full information maximum likelihood (MSEM with FIML) to handle large amounts of missing data. MSEM with FIML is a parsimonious and hypothesis-driven strategy to cope with large amounts of missing data without compromising power or introducing bias. This technique is relevant for nurse researchers faced with ever-increasing amounts of electronic data and decreasing research budgets. © 2016 Wiley Periodicals, Inc.
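
    The casewise idea behind FIML can be sketched in a few lines. The following is a minimal illustration, assuming a bivariate normal model and data missing at random; the parameterization, variable layout, and simulated data are ours, not the study's (which would in practice use an SEM package):

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import multivariate_normal, norm

      def fiml_negloglik(theta, data):
          # Unconstrained parameterization: means, log-SDs, atanh-correlation.
          mu = theta[:2]
          s1, s2 = np.exp(theta[2]), np.exp(theta[3])
          rho = np.tanh(theta[4])
          cov = np.array([[s1**2, rho*s1*s2], [rho*s1*s2, s2**2]])
          ll = 0.0
          for row in data:
              obs = ~np.isnan(row)
              if obs.all():                    # fully observed case
                  ll += multivariate_normal.logpdf(row, mu, cov)
              elif obs.any():                  # one variable seen: marginal normal
                  j = np.flatnonzero(obs)[0]
                  ll += norm.logpdf(row[j], mu[j], np.sqrt(cov[j, j]))
          return -ll

      rng = np.random.default_rng(0)
      data = rng.multivariate_normal([0.0, 1.0], [[1.0, 0.5], [0.5, 2.0]], size=500)
      data[rng.random(500) < 0.3, 1] = np.nan  # ~30% missing on variable 2
      fit = minimize(fiml_negloglik, np.zeros(5), args=(data,))
      mu_hat = fit.x[:2]                       # close to [0, 1] despite missingness

    The point of the sketch is that no case is discarded and no value is imputed: each record contributes the likelihood of exactly the variables it observed.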

  6. Numerical analysis and experimental research of the rubber boot of the joint drive vehicle

    NASA Astrophysics Data System (ADS)

    Ziobro, Jan

    2016-04-01

    The article presents numerical studies and experimental research on the drive rubber boot of a vehicle drive joint. Performance requirements are discussed, and the coefficients of the mathematical model required for numerical simulation are determined. The behavior of the boot was examined in the MSC.MARC environment. The analysis used a hyperelastic two-parameter Mooney-Rivlin material model, a large-displacement procedure, a safe contact condition, and friction on the sides of the boot. A 3D numerical model of the joint boot was analyzed under tensile, compressive, centrifugal, and angular forces. Numerous results of the studies are presented. An appropriate test stand was built, and the results of the numerical analysis were compared with those of the experimental studies. Numerous practical conclusions and recommendations are presented.

  7. A maximum likelihood convolutional decoder model vs experimental data comparison

    NASA Technical Reports Server (NTRS)

    Chen, R. Y.

    1979-01-01

    This article describes the comparison of a maximum likelihood convolutional decoder (MCD) prediction model and the actual performance of the MCD at the Madrid Deep Space Station. The MCD prediction model was used to develop a subroutine that has been utilized by the Telemetry Analysis Program (TAP) to compute the MCD bit error rate for a given signal-to-noise ratio. The results indicate that the TAP predictions agree well with the experimental measurements. An optimal modulation index can also be found through TAP.

  8. Shear joint capability versus bolt clearance

    NASA Technical Reports Server (NTRS)

    Lee, H. M.

    1992-01-01

    The results of a conservative analysis approach to determining shear joint strength capability for typical space-flight hardware, as a function of the bolt-hole clearance specified in the design, are presented. These joints consist of high-strength steel fasteners and abutments constructed of aluminum alloys familiar to the aerospace industry. A general analytical expression was first derived that relates bolt-hole clearance to the bolt shear load required to place all joint fasteners into a shear-transferring position. Extension of this work allowed the analytical development of joint load capability as a function of the number of fasteners, the shear strength of the bolt, the bolt-hole clearance, and the desired factor of safety. Analysis results clearly indicate that a typical space-flight hardware joint can withstand significant loading even when less-than-ideal bolt-hole clearances are used in the design.

  9. On the likelihood of single-peaked preferences.

    PubMed

    Lackner, Marie-Louise; Lackner, Martin

    2017-01-01

    This paper contains an extensive combinatorial analysis of the single-peaked domain restriction and investigates the likelihood that an election is single-peaked. We provide a very general upper bound result for domain restrictions that can be defined by certain forbidden configurations. This upper bound implies that many domain restrictions (including the single-peaked restriction) are very unlikely to appear in a random election chosen according to the Impartial Culture assumption. For single-peaked elections, this upper bound can be refined and complemented by a lower bound that is asymptotically tight. In addition, we provide exact results for elections with few voters or candidates. Moreover, we consider the Pólya urn model and the Mallows model and obtain lower bounds showing that single-peakedness is considerably more likely to appear for certain parameterizations.
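
    To make the restriction concrete, the sketch below tests whether an election is single-peaked with respect to some left-right axis, using the standard interval-growing characterization; the function names and the brute-force search over axes are ours:

      from itertools import permutations

      def single_peaked_on_axis(ballots, axis):
          # A profile is single-peaked on an axis iff, reading each ballot from
          # its top choice down, every next candidate extends a contiguous
          # interval of the axis by exactly one position.
          pos = {c: i for i, c in enumerate(axis)}
          for ballot in ballots:
              lo = hi = pos[ballot[0]]
              for c in ballot[1:]:
                  if pos[c] == lo - 1:
                      lo -= 1
                  elif pos[c] == hi + 1:
                      hi += 1
                  else:
                      return False
          return True

      def is_single_peaked(ballots, candidates):
          # Brute force over axes; adequate for the small elections treated exactly.
          return any(single_peaked_on_axis(ballots, ax)
                     for ax in permutations(candidates))

      ballots = [("a", "b", "c"), ("b", "c", "a"), ("c", "b", "a")]
      print(is_single_peaked(ballots, "abc"))   # True (the axis a-b-c works)

    Counting how often such a check succeeds for profiles drawn under the Impartial Culture, Pólya urn, or Mallows models is exactly the likelihood question the paper analyses.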

  10. Improvement in likelihood to donate blood after being offered a topical anesthetic.

    PubMed

    Watanabe, Kyle M; Jay, Jeffrey; Alicto, Christopher; Yamamoto, Loren G

    2011-02-01

    While there are many reasons people choose not to donate blood, pain sustained during the venipuncture portion of the blood donation process is likely one deterrent to volunteer donation. The purpose of this study was to survey the improvement in likelihood of donation if participants were given the option of a topical anesthetic cream prior to venipuncture. Over a three-month period, 316 adults (convenience sample) completed a one-page survey consisting of twelve questions pertaining to blood donation. Participants were asked about their likelihood of donating blood in the near future (No Possibility, Possible, Likely, Certain). They were then informed of the possibility of using a topical anesthetic cream prior to donation. Subsequently, their likelihood of donating blood was reassessed. Fifty (16%) subjects reported an increased likelihood of donating blood if offered a topical anesthetic (p < 0.0001). Of these respondents reporting an increase in donation likelihood, eleven improved by 2 or more likelihood categories. Amongst the 169 participants who had never donated blood, 34 (20%) reported an increased likelihood of donation after being told about the topical anesthetic cream, compared to 16 (10%) of the 147 subjects who had previously donated blood (p = 0.02). The findings of this study suggest that providing a topical anesthetic had a positive effect on the study participants' likelihood of donating blood. This improvement was greater amongst those who had never donated blood. Hawaii Medical Journal Copyright 2011.

  11. Effects of joints in truss structures

    NASA Technical Reports Server (NTRS)

    Ikegami, R.

    1988-01-01

    The response of truss-type structures for future space applications, such as the Large Deployable Reflector (LDR), will be directly affected by joint performance. Some of the objectives of research at BAC were to characterize structural joints, establish analytical approaches that incorporate joint characteristics, and experimentally establish the validity of those approaches. The test approach used to characterize joints for both erectable and deployable-type structures was based upon a Force State Mapping Technique. The approach pictorially shows how the nonlinear joint results can be used for equivalent linear analysis. Testing of the Space Station joints developed at LaRC (a hinged joint at 2 Hz and a clevis joint at 2 Hz) successfully revealed the nonlinear characteristics of the joints. The Space Station joints were effectively linear when loaded to plus or minus 500 pounds with a corresponding displacement of about plus or minus 0.0015 inch. The results indicate that good linear joints exist which are compatible with erected structures, but that difficulty may be encountered if nonlinear-type joints are incorporated in the structure.

  12. Comparison of standard maximum likelihood classification and polytomous logistic regression used in remote sensing

    Treesearch

    John Hogland; Nedret Billor; Nathaniel Anderson

    2013-01-01

    Discriminant analysis, referred to as maximum likelihood classification within popular remote sensing software packages, is a common supervised technique used by analysts. Polytomous logistic regression (PLR), also referred to as multinomial logistic regression, is an alternative classification approach that is less restrictive, more flexible, and easy to interpret. To...

  13. Evidence of seasonal variation in longitudinal growth of height in a sample of boys from Stuttgart Carlsschule, 1771-1793, using combined principal component analysis and maximum likelihood principle.

    PubMed

    Lehmann, A; Scheffler, Ch; Hermanussen, M

    2010-02-01

    Recent progress in modelling individual growth has been achieved by combining principal component analysis and the maximum likelihood principle. This combination models growth even in incomplete sets of data and in data obtained at irregular intervals. We re-analysed late 18th-century longitudinal growth data of German boys from the boarding school Carlsschule in Stuttgart. The boys, aged 6-23 years, were measured at irregular 3-12-monthly intervals during the period 1771-1793. At the age of 18 years, mean height was 1652 mm, but height variation was large. The shortest boy reached 1474 mm, the tallest 1826 mm. Measured height closely paralleled modelled height, with a mean difference of 4 mm (SD 7 mm). Seasonal height variation was found: low growth rates occurred in spring and high growth rates in summer and autumn. The present study demonstrates that combining principal component analysis and the maximum likelihood principle also enables growth modelling of historic height data. Copyright (c) 2009 Elsevier GmbH. All rights reserved.

  14. Program for Weibull Analysis of Fatigue Data

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2005-01-01

    A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) maximum-likelihood estimates of the Weibull-distribution parameters; (2) data for contour plots of relative likelihood for the two parameters; (3) data for contour plots of joint confidence regions; (4) data for the profile likelihood of the Weibull-distribution parameters; (5) data for the profile likelihood of any percentile of the distribution; and (6) likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this program is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.
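
    The core computation, maximum likelihood for a two-parameter Weibull under type-I censoring, fits in a few lines. The sketch below is in Python rather than the program's Fortran, with invented fatigue lives; suspended tests enter the likelihood through the survival term only:

      import numpy as np
      from scipy.optimize import minimize

      def negloglik(params, t, failed):
          # Weibull with shape k and scale lam, parameterized on the log scale.
          # Failures contribute the log-density; suspensions (type-I censoring
          # at the test cutoff) contribute only the log-survival term -z.
          k, lam = np.exp(params)
          z = (t / lam) ** k
          ll = np.sum(failed * (np.log(k / lam) + (k - 1) * np.log(t / lam))) - z.sum()
          return -ll

      rng = np.random.default_rng(0)
      life = 1.2 * rng.weibull(1.5, size=50)   # hypothetical fatigue lives
      cutoff = 2.0                             # tests suspended at this time
      failed = life < cutoff
      t = np.minimum(life, cutoff)
      fit = minimize(negloglik, x0=[0.0, 0.0], args=(t, failed))
      k_hat, lam_hat = np.exp(fit.x)

    Likelihood-ratio confidence regions of the kind the program tabulates can then be traced by profiling this same function over a grid of the two parameters.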

  15. Magellan/Galileo solder joint failure analysis and recommendations

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    1989-01-01

    On or about November 10, 1988, an open-circuit solder joint was discovered in the Magellan radar digital unit (DFU) during integration testing at Kennedy Space Center (KSC). A detailed analysis of the cause of the failure was conducted at the Jet Propulsion Laboratory, leading to the successful repair of many pieces of affected electronic hardware on both the Magellan and Galileo spacecraft. The problem was caused by the presence of high-thermal-coefficient-of-expansion heat sink and conformal coating materials located in the large (0.055 inch) gap between dual inline packages (DIPs) and the printed wiring board. The details of the observed problems are described and recommendations are made for improved design and testing activities in the future.

  16. Maximum likelihood estimation of finite mixture model for economic data

    NASA Astrophysics Data System (ADS)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

    A finite mixture model is a mixture model with finite dimension. These models provide a natural representation of heterogeneity across a finite number of latent classes, and are also known as latent class models or unsupervised learning models. Recently, maximum likelihood estimation of finite mixture models has drawn considerable attention from statisticians, mainly because maximum likelihood estimation is a powerful statistical method that provides consistent estimates as the sample size increases to infinity. Maximum likelihood estimation is therefore used in the present paper to fit a finite mixture model in order to explore the relationship between nonlinear economic data. A two-component normal mixture model is fitted by maximum likelihood estimation to investigate the relationship between stock market prices and rubber prices for the sampled countries. The results indicate a negative relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines, and Indonesia.
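
    For flavor, a two-component normal mixture can be fitted by maximum likelihood (via the EM algorithm) in a few lines; the simulated data below are stand-ins for the stock and rubber price series, which are not reproduced here:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      # Simulated stand-ins for paired (stock return, rubber return) data.
      rng = np.random.default_rng(1)
      x = np.vstack([
          rng.multivariate_normal([0.0, 0.0], [[1.0, -0.4], [-0.4, 1.0]], 200),
          rng.multivariate_normal([2.0, -1.0], [[0.5, -0.1], [-0.1, 0.5]], 100),
      ])

      gm = GaussianMixture(n_components=2, covariance_type="full",
                           random_state=0).fit(x)
      print(gm.weights_)               # mixing proportions (about 2/3 and 1/3)
      print(gm.covariances_[:, 0, 1])  # off-diagonal terms: negative dependence

    The sign of the within-component covariances is what carries the kind of negative relationship the paper reports.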

  17. Debonding of Stitched Composite Joints: Testing and Analysis

    NASA Technical Reports Server (NTRS)

    Glaessgen, E. H.; Raju, I. S.; Poe, C. C., Jr.

    1999-01-01

    The effect of stitches on the failure of a single lap joint configuration was determined in a combined experimental and analytical study. The experimental study was conducted to determine debond growth under static monotonic loading. The stitches were shown to delay the initiation of the debond and provide load transfer beyond the load necessary to completely debond the stitched lap joint. The strain energy release rates at the debond front were calculated using a finite element-based technique. Models of the unstitched configuration showed significant values of modes I and II across the width of the joint and showed that mode III is zero at the centerline but increases near the free edge. Models of the stitched configuration showed that the stitches effectively reduced mode I to zero, but had less of an effect on modes II and III.

  18. Incorporating real-time traffic and weather data to explore road accident likelihood and severity in urban arterials.

    PubMed

    Theofilatos, Athanasios

    2017-06-01

    The effective treatment of road accidents and thus the enhancement of road safety is a major concern to societies due to the losses in human lives and the economic and social costs. The investigation of road accident likelihood and severity by utilizing real-time traffic and weather data has recently received significant attention from researchers. However, the collected data mainly stem from freeways and expressways. Consequently, the aim of the present paper is to add to the current knowledge by investigating accident likelihood and severity using real-time traffic and weather data collected from urban arterials in Athens, Greece. Random Forests (RF) are first applied for preliminary analysis, to rank candidate variables by relative importance and provide a first insight into the potentially significant variables. Then, Bayesian logistic regression as well as finite mixture and mixed-effects logit models are applied to further explore factors associated with accident likelihood and severity, respectively. Regarding accident likelihood, the Bayesian logistic regression showed that variations in traffic significantly influence accident occurrence. On the other hand, the accident severity analysis revealed a generally mixed influence of traffic variations on accident severity, although the international literature states that traffic variations increase severity. Lastly, weather parameters were not found to have a direct influence on accident likelihood or severity. The study adds to current knowledge by incorporating real-time traffic and weather data from urban arterials to investigate accident occurrence and accident severity mechanisms. The identification of risk factors can lead to the development of effective traffic management strategies to reduce accident occurrence and the severity of injuries in urban arterials. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.
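
    The variable-ranking step can be reproduced with any random forest implementation, as in this sketch; the feature names and simulated data are hypothetical stand-ins for the study's real-time traffic and weather variables:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      # X: one row per segment-hour; y: 1 = accident occurred. The feature
      # names are illustrative placeholders, not the paper's variable set.
      features = ["speed_variation", "volume", "occupancy", "rainfall", "wind"]
      rng = np.random.default_rng(0)
      X = rng.normal(size=(1000, len(features)))
      y = rng.binomial(1, 1 / (1 + np.exp(-1.2 * X[:, 0])))  # toy signal

      rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
      for imp, name in sorted(zip(rf.feature_importances_, features), reverse=True):
          print(f"{name:16s} {imp:.3f}")     # speed variation should rank first

    The ranked importances then guide which variables enter the subsequent Bayesian logistic and logit models.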

  19. Localising semantic and syntactic processing in spoken and written language comprehension: an Activation Likelihood Estimation meta-analysis.

    PubMed

    Rodd, Jennifer M; Vitello, Sylvia; Woollams, Anna M; Adank, Patti

    2015-02-01

    We conducted an Activation Likelihood Estimation (ALE) meta-analysis to identify brain regions that are recruited by linguistic stimuli requiring relatively demanding semantic or syntactic processing. We included 54 functional MRI studies that explicitly varied the semantic or syntactic processing load, while holding constant demands on earlier stages of processing. We included studies that introduced a syntactic/semantic ambiguity or anomaly, used a priming manipulation that specifically reduced the load on semantic/syntactic processing, or varied the level of syntactic complexity. The results confirmed the critical role of the posterior left Inferior Frontal Gyrus (LIFG) in semantic and syntactic processing. These results challenge models of sentence comprehension highlighting the role of anterior LIFG for semantic processing. In addition, the results emphasise the posterior (but not anterior) temporal lobe for both semantic and syntactic processing. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.

  20. Fluid flow analysis of E-glass fiber reinforced pipe joints in oil and gas industry

    NASA Astrophysics Data System (ADS)

    Bobba, Sujith; Leman, Z.; Zainuddin, E. S.; Sapuan, S. M.

    2018-04-01

    Glass fiber reinforced composites have become increasingly important over the past few years and are now the first-choice materials for fabricating pipes with low weight in combination with high strength and stiffness. In the oil and gas industry, pipelines transporting heavy crude oil are subjected to variable pressure waves causing fluctuating stress levels in the pipes. Computational fluid dynamics (CFD) analysis was performed using SolidWorks Flow Simulation software to study the effects of these pressure waves on specified joints in the pipes. Depending on the type of heavy crude oil being transported, the flow behavior indicated a considerable degree of stress in certain connecting joints, causing the joints to weaken over a prolonged period of use. This research proposes a new perspective, still to be developed, involving changing the pipe material and the fiber winding angle in those specified joints, and finally applying CADWIND technology to check the resulting stress levels so that the life of the pipes can be optimized.

  1. Preloaded joint analysis methodology for space flight systems

    NASA Technical Reports Server (NTRS)

    Chambers, Jeffrey A.

    1995-01-01

    This report contains a compilation of some of the most basic equations governing simple preloaded joint systems and discusses the more common modes of failure associated with such hardware. It is intended to provide the mechanical designer with the tools necessary for designing a basic bolted joint. Although the information presented is intended to aid in the engineering of space flight structures, the fundamentals are equally applicable to other forms of mechanical design.
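
    Two of the most basic relations such a compilation covers can be sketched as follows; the short-form torque equation is standard, but the nut factor, the loads, and the conservative omission of the joint load-sharing (stiffness) factor are illustrative simplifications, not the report's full method:

      # All quantities in SI units (N, m, Pa); the values are illustrative.
      def preload_from_torque(T, d, K=0.2):
          """Short-form torque equation, F = T / (K d); K is the nut factor."""
          return T / (K * d)

      def tensile_margin(F_preload, F_external, A_t, F_ty, sf=1.4):
          """Margin of safety on bolt tension against yield, conservatively
          assuming the bolt carries the full factored external load."""
          bolt_load = F_preload + sf * F_external
          return F_ty * A_t / bolt_load - 1.0

      F0 = preload_from_torque(T=9.0, d=0.00635)   # 9 N*m on a 1/4-in bolt
      ms = tensile_margin(F0, F_external=2000.0, A_t=2.0e-5, F_ty=8.6e8)
      print(f"preload = {F0:.0f} N, margin of safety = {ms:+.2f}")

    A full preloaded-joint analysis would also track preload uncertainty, the joint stiffness factor, separation, and shear and combined-load margins.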

  2. Joint Personnel Recovery Agency Joint Center for Operational Analysis and Lessons Learned Quarterly Bulletin, Volume 7, Issue 2, March 2005

    DTIC Science & Technology

    2005-03-01

    …execute these dangerous and uncertain missions. … In my recent travels in the U.S. Central Command area of operations I had the great fortune of meeting … Joint Center for Operational Analysis and Lessons Learned (JCOA-LL) Bulletin: "That others may live…to return with honor" … The farther information has to travel to meet GCC staff requirements, the greater the difficulty in handling and maintaining situational awareness on PR events.

  3. Approximated maximum likelihood estimation in multifractal random walks

    NASA Astrophysics Data System (ADS)

    Løvsletten, O.; Rypdal, M.

    2012-04-01

    We present an approximated maximum likelihood method for the multifractal random walk processes of E. Bacry et al. [Phys. Rev. E 64, 026103 (2001)]. The likelihood is computed using a Laplace approximation and a truncation in the dependency structure for the latent volatility. The procedure is implemented as a package in the R language. Its performance is tested on synthetic data and compared to an inference approach based on the generalized method of moments. The method is applied to estimate parameters for various financial stock indices.

  4. Analysis of hourly crash likelihood using unbalanced panel data mixed logit model and real-time driving environmental big data.

    PubMed

    Chen, Feng; Chen, Suren; Ma, Xiaoxiang

    2018-06-01

    Driving environment, including road surface conditions and traffic states, often changes over time and influences crash probability considerably. Traditional crash frequency models developed at large temporal scales struggle to capture the time-varying characteristics of these factors, which may cause substantial loss of critical driving-environment information for crash prediction. Crash prediction models with refined temporal data (hourly records) are therefore developed to characterize the time-varying nature of these contributing factors. Unbalanced panel data mixed logit models are developed to analyze the hourly crash likelihood of highway segments. The refined temporal driving-environment data, including road surface and traffic conditions, obtained from the Road Weather Information System (RWIS), are incorporated into the models. Model estimation results indicate that traffic speed, traffic volume, curvature, and a chemically wet road surface indicator are better modeled as random parameters. The estimation results of the mixed logit models based on unbalanced panel data show that a number of factors are related to crash likelihood on I-25. Specifically, a weekend indicator, a November indicator, a low speed limit, and a long remaining service life of rutting indicator are found to increase crash likelihood, while a 5-am indicator and the number of merging ramps per lane per mile are found to decrease crash likelihood. The study underscores and confirms the unique and significant impacts on crashes imposed by real-time weather, road surface, and traffic conditions. With the unbalanced panel data structure, the rich information from real-time driving-environment big data can be well incorporated. Copyright © 2018 National Safety Council and Elsevier Ltd. All rights reserved.

  5. Joint Analysis of the Full AzTEC Sub-Millimeter Galaxy Data Set

    NASA Astrophysics Data System (ADS)

    Wilson, Grant; Ade, P.; Aretxaga, I.; Austermann, J.; Bock, J.; Hughes, D.; Kang, Y.; Kim, S.; Lowenthal, J.; Mauskopf, P.; Perera, T.; Scott, K.; Yun, M.

    2006-12-01

    Using the new AzTEC millimeter-wave camera on the James Clerk Maxwell Telescope (JCMT) in winter 2005/06, we conducted several surveys of the submm galaxy (SMG) population. The AzTEC 1.1 millimeter surveys include both blank-fields (no significant bias or foreground contamination) and regions of known over-densities, and are both large (100-1000 sq. arcmin.) and sensitive (~1 mJy rms). The unique power of the AzTEC data set lies not only in the size and depth of the individual fields, but in the combined surveyed area that totals over 1 square degree. Hundreds of new sub-millimeter sources have been detected. A joint analysis of all AzTEC surveys will provide important new constraints on many characteristics of the SMG population, including number counts, clustering, and variance. In particular, the large area of the full AzTEC data set provides the first significant measurement of the brightest and most rare of the SMG population. Herein we present the initial combined results and explore the future potential of a complete joint analysis of the full AzTEC SMG data set.

  6. On the existence of maximum likelihood estimates for presence-only data

    USGS Publications Warehouse

    Hefley, Trevor J.; Hooten, Mevin B.

    2015-01-01

    It is important to identify conditions for which maximum likelihood estimates are unlikely to be identifiable from presence-only data. In data sets where the maximum likelihood estimates do not exist, penalized likelihood and Bayesian methods will produce coefficient estimates, but these are sensitive to the choice of estimation procedure and prior or penalty term. When sample size is small or it is thought that habitat preferences are strong, we propose a suite of estimation procedures researchers can consider using.

  7. Development of design and analysis methodology for composite bolted joints

    NASA Astrophysics Data System (ADS)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled hole and unnotched test data.

  8. Moral Identity Predicts Doping Likelihood via Moral Disengagement and Anticipated Guilt.

    PubMed

    Kavussanu, Maria; Ring, Christopher

    2017-08-01

    In this study, we integrated elements of social cognitive theory of moral thought and action and the social cognitive model of moral identity to better understand doping likelihood in athletes. Participants (N = 398) recruited from a variety of team sports completed measures of moral identity, moral disengagement, anticipated guilt, and doping likelihood. Moral identity predicted doping likelihood indirectly via moral disengagement and anticipated guilt. Anticipated guilt about potential doping mediated the relationship between moral disengagement and doping likelihood. Our findings provide novel evidence to suggest that athletes, who feel that being a moral person is central to their self-concept, are less likely to use banned substances due to their lower tendency to morally disengage and the more intense feelings of guilt they expect to experience for using banned substances.

  9. Maximum likelihood of phylogenetic networks.

    PubMed

    Jin, Guohua; Nakhleh, Luay; Snir, Sagi; Tuller, Tamir

    2006-11-01

    Horizontal gene transfer (HGT) is believed to be ubiquitous among bacteria, and plays a major role in their genome diversification as well as their ability to develop resistance to antibiotics. In light of its evolutionary significance and implications for human health, developing accurate and efficient methods for detecting and reconstructing HGT is imperative. In this article we provide a new HGT-oriented likelihood framework for many problems that involve phylogeny-based HGT detection and reconstruction. Beside the formulation of various likelihood criteria, we show that most of these problems are NP-hard, and offer heuristics for efficient and accurate reconstruction of HGT under these criteria. We implemented our heuristics and used them to analyze biological as well as synthetic data. In both cases, our criteria and heuristics exhibited very good performance with respect to identifying the correct number of HGT events as well as inferring their correct location on the species tree. Implementation of the criteria as well as heuristics and hardness proofs are available from the authors upon request. Hardness proofs can also be downloaded at http://www.cs.tau.ac.il/~tamirtul/MLNET/Supp-ML.pdf

  10. Risk Presentation Using the Three Dimensions of Likelihood, Severity, and Level of Control

    NASA Technical Reports Server (NTRS)

    Watson, Clifford

    2010-01-01

    Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by relative likelihood and severity of the residual risk. These matrices present a quick look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes, while adding a new, third dimension, shown as the Z-axis, and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training). 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting, gives a visual confirmation of the relationship between causes and their controls.
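
    A minimal sketch of how the three axes might be combined computationally follows; the report describes the axes but not a specific numeric formula, so the multiplicative score below is an illustrative assumption:

      from dataclasses import dataclass

      LEVEL_OF_CONTROL = {   # Z-axis: hazard reduction precedence sequence
          1: "eliminate risk through design",
          2: "substitute less risky materials",
          3: "safety devices",
          4: "caution and warning devices",
          5: "administrative controls",
          6: "protective clothing and equipment",
      }

      @dataclass
      class HazardCause:
          likelihood: int   # 1 (improbable) .. 5 (frequent)
          severity: int     # 1 (negligible) .. 5 (catastrophic)
          control: int      # 1 .. 6, per LEVEL_OF_CONTROL

          def risk_score(self) -> int:
              # Weaker controls (higher Z) raise the score, producing the
              # "tall pole" for the least-well-controlled cause.
              return self.likelihood * self.severity * self.control

      causes = [HazardCause(2, 5, 1), HazardCause(2, 5, 6)]
      worst = max(causes, key=HazardCause.risk_score)
      print(worst.risk_score(), LEVEL_OF_CONTROL[worst.control])

    Two causes with identical likelihood and severity thus separate visually once the level of control is charted as the third axis.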

  11. Comparison of two weighted integration models for the cueing task: linear and likelihood

    NASA Technical Reports Server (NTRS)

    Shimozaki, Steven S.; Eckstein, Miguel P.; Abbey, Craig K.

    2003-01-01

    In a task in which the observer must detect a signal at two locations, presenting a precue that predicts the location of a signal leads to improved performance with a valid cue (signal location matches the cue) compared to an invalid cue (signal location does not match the cue). The cue validity effect has often been explained by a limited-capacity attentional mechanism improving the perceptual quality at the cued location. Alternatively, the cueing effect can also be explained by unlimited-capacity models that assume a weighted combination of noisy responses across the two locations. We compare two weighted integration models: a linear model and a sum of weighted likelihoods model based on a Bayesian observer. While qualitatively these models are similar, quantitatively they predict different cue validity effects as the signal-to-noise ratio (SNR) increases. To test these models, 3 observers performed a cued discrimination task of Gaussian targets with an 80% valid precue across a broad range of SNRs. A limited-capacity attentional switching model was also analyzed and rejected. The sum of weighted likelihoods model best described the psychophysical results, suggesting that human observers approximate a weighted combination of likelihoods, and not a weighted linear combination.
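
    The sum-of-weighted-likelihoods observer is straightforward to simulate. The sketch below reproduces a cue validity effect without any capacity limit; the SNR and cue weight values are illustrative, not the fitted ones:

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(2)
      n, d, w = 100_000, 1.5, 0.8                 # trials, SNR, cue weight
      valid = rng.random(n) < 0.8                 # signal at cued location?
      x1 = rng.normal(np.where(valid, d, 0.0), 1.0)   # cued-location response
      x2 = rng.normal(np.where(valid, 0.0, d), 1.0)   # uncued-location response

      def lr(x):   # per-location likelihood ratio, signal vs. no signal
          return norm.pdf(x, d, 1.0) / norm.pdf(x, 0.0, 1.0)

      choose_cued = w * lr(x1) > (1 - w) * lr(x2)     # weighted likelihoods
      print("valid-cue accuracy:  ", choose_cued[valid].mean())
      print("invalid-cue accuracy:", (~choose_cued[~valid]).mean())

    Valid trials come out more accurate than invalid ones purely because the prior-like weight w favors the cued location, which is the unlimited-capacity explanation the paper tests.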

  12. Cohesive Laws and Progressive Damage Analysis of Composite Bonded Joints, a Combined Numerical/Experimental Approach

    NASA Technical Reports Server (NTRS)

    Girolamo, Donato; Davila, Carlos G.; Leone, Frank A.; Lin, Shih-Yung

    2015-01-01

    The results of an experimental/numerical campaign aimed at developing progressive damage analysis (PDA) tools for predicting the strength of a composite bonded joint under tensile loads are presented. The PDA is based on continuum damage mechanics (CDM) to account for intralaminar damage, and on cohesive laws to account for interlaminar and adhesive damage. The adhesive response is characterized using standard fracture specimens and digital image correlation (DIC). The displacement fields measured by DIC are used to calculate the J-integrals, from which the associated cohesive laws of the structural adhesive can be derived. A finite element model of a sandwich conventional splice joint (CSJ) under tensile loads was developed. The simulations, in agreement with experimental tests, indicate that the model is capable of predicting the interactions of damage modes that lead to the failure of the joint.

  13. Temporal gene expression profiling of the rat knee joint capsule during immobilization-induced joint contractures.

    PubMed

    Wong, Kayleigh; Sun, Fangui; Trudel, Guy; Sebastiani, Paola; Laneuville, Odette

    2015-05-26

    Contractures of the knee joint cause disability and handicap. Recovering range of motion is recognized by arthritic patients as their preferred improvement in health outcome, secondary only to pain management. Clinical and experimental studies provide evidence that the posterior knee capsule prevents the knee from achieving full extension. This study was undertaken to investigate the dynamic changes of the joint capsule transcriptome during the progression of knee joint contractures induced by immobilization. We performed a microarray analysis of genes expressed in the posterior knee joint capsule following induction of a flexion contracture by rigidly immobilizing the rat knee joint over a time-course of 16 weeks. Fold changes of expression values were measured and co-expressed genes were identified by clustering based on time-series analysis. Genes associated with immobilization were further analyzed to reveal pathways and biological significance and validated by immunohistochemistry on sagittal sections of knee joints. Changes in expression with a minimum of 1.5-fold change were dominated by a decrease in expression for 7732 probe sets occurring at week 8, while the expression of 2251 probe sets increased. Clusters of genes with similar profiles of expression included a total of 162 genes displaying at least a 2-fold change compared to week 1. Functional analysis revealed ontology categories corresponding to triglyceride metabolism, extracellular matrix, and muscle contraction. The altered expression of selected genes involved in the triglyceride biosynthesis pathway (AGPAT-9), and of the genes P4HB and HSP47, both involved in collagen synthesis, was confirmed by immunohistochemistry. Gene expression in the knee joint capsule was sensitive to joint immobility and provided insights into molecular mechanisms relevant to the pathophysiology of knee flexion contractures. Capsule responses to immobilization were dynamic and characterized by modulation of at least three

  14. Uncued Low SNR Detection with Likelihood from Image Multi Bernoulli Filter

    NASA Astrophysics Data System (ADS)

    Murphy, T.; Holzinger, M.

    2016-09-01

    Both SSA and SDA necessitate uncued, partially informed detection and orbit determination efforts for small space objects which often produce only low-strength electro-optical signatures. General frame-to-frame detection and tracking of objects includes methods such as moving target indicator, multiple hypothesis testing, direct track-before-detect methods, and random finite set based multi-object tracking. This paper applies the multi-Bernoulli filter to low signal-to-noise ratio (SNR), uncued detection of space objects for space domain awareness applications. The primary novel contribution of this paper is a detailed analysis of the existing state-of-the-art likelihood functions and of a likelihood function, based on a binary hypothesis, previously proposed by the authors. The algorithm is tested on electro-optical imagery obtained from a variety of sensors at Georgia Tech, including the GT-SORT 0.5 m Raven-class telescope and a twenty-degree field of view high frame rate CMOS sensor. In particular, a data set of an extended pass of the Hitomi (Astro-H) satellite approximately 3 days after loss of communication and potential break-up is examined.

  15. Idealized models of the joint probability distribution of wind speeds

    NASA Astrophysics Data System (ADS)

    Monahan, Adam H.

    2018-05-01

    The joint probability distribution of wind speeds at two separate locations in space or points in time completely characterizes the statistical dependence of these two quantities, providing more information than linear measures such as correlation. In this study, we consider two models of the joint distribution of wind speeds obtained from idealized models of the dependence structure of the horizontal wind velocity components. The bivariate Rice distribution follows from assuming that the wind components have Gaussian and isotropic fluctuations. The bivariate Weibull distribution arises from power law transformations of wind speeds corresponding to vector components with Gaussian, isotropic, mean-zero variability. Maximum likelihood estimates of these distributions are compared using wind speed data from the mid-troposphere, from different altitudes at the Cabauw tower in the Netherlands, and from scatterometer observations over the sea surface. While the bivariate Rice distribution is more flexible and can represent a broader class of dependence structures, the bivariate Weibull distribution is mathematically simpler and may be more convenient in many applications. The complexity of the mathematical expressions obtained for the joint distributions suggests that the development of explicit functional forms for multivariate speed distributions from distributions of the components will not be practical for more complicated dependence structure or more than two speed variables.
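
    The constructions behind the two distributions are easy to illustrate by simulation. The sketch below draws jointly Rice-distributed speeds as moduli of correlated, isotropic Gaussian wind components; the means, variances, and cross-correlation are illustrative assumptions:

      import numpy as np

      rng = np.random.default_rng(3)
      n, sigma, r = 100_000, 2.0, 0.6           # samples, component SD, cross-corr.
      mean = np.array([3.0, 0.0, 3.0, 0.0])     # (u1, v1, u2, v2) mean winds
      cov = sigma**2 * np.array([[1, 0, r, 0],
                                 [0, 1, 0, r],
                                 [r, 0, 1, 0],
                                 [0, r, 0, 1]])
      u1, v1, u2, v2 = rng.multivariate_normal(mean, cov, n).T
      w1, w2 = np.hypot(u1, v1), np.hypot(u2, v2)   # jointly Rice-distributed speeds
      # A power-law transform of mean-zero isotropic components would instead
      # yield (approximately) Weibull marginals, the paper's second model.

    Maximum likelihood fits of either bivariate family to samples like (w1, w2) are then directly comparable, which is the comparison the study carries out on tower, scatterometer, and mid-tropospheric data.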

  16. Maximum likelihood estimation for periodic autoregressive moving average models

    USGS Publications Warehouse

    Vecchia, A.V.

    1985-01-01

    A useful class of models for seasonal time series that cannot be filtered or standardized to achieve second-order stationarity is that of periodic autoregressive moving average (PARMA) models, which are extensions of ARMA models that allow periodic (seasonal) parameters. An approximation to the exact likelihood for Gaussian PARMA processes is developed, and a straightforward algorithm for its maximization is presented. The algorithm is tested on several periodic ARMA(1, 1) models through simulation studies and is compared to moment estimation via the seasonal Yule-Walker equations. Applicability of the technique is demonstrated through an analysis of a seasonal stream-flow series from the Rio Caroni River in Venezuela.
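
    To make the model class concrete, here is a small sketch of the conditional Gaussian likelihood for a periodic AR(1) process, the simplest PARMA member; conditioning on the first observation approximates the exact likelihood, and the seasonal pattern and starting values are invented for illustration:

      import numpy as np
      from scipy.optimize import minimize

      def par1_negloglik(theta, x, P):
          # Periodic AR(1): x[t] = phi[t mod P] * x[t-1] + e[t],
          # with e[t] ~ N(0, s2[t mod P]).
          phi, s2 = theta[:P], np.exp(theta[P:])
          t = np.arange(1, len(x))
          season = t % P
          resid = x[1:] - phi[season] * x[:-1]
          return 0.5 * np.sum(np.log(2 * np.pi * s2[season])
                              + resid**2 / s2[season])

      # Simulate 40 "years" of monthly data (P = 12) and refit.
      P, n = 12, 480
      rng = np.random.default_rng(0)
      phi_true = 0.5 + 0.3 * np.sin(2 * np.pi * np.arange(P) / P)
      x = np.zeros(n)
      for t in range(1, n):
          x[t] = phi_true[t % P] * x[t - 1] + rng.normal()
      fit = minimize(par1_negloglik, np.zeros(2 * P), args=(x, P))
      phi_hat = fit.x[:P]                  # recovers the seasonal AR pattern

    Adding periodic moving-average terms turns this into the full PARMA likelihood, for which the paper develops the approximation and maximization algorithm.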

  17. Updated logistic regression equations for the calculation of post-fire debris-flow likelihood in the western United States

    USGS Publications Warehouse

    Staley, Dennis M.; Negri, Jacquelyn A.; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2016-06-30

    Wildfire can significantly alter the hydrologic response of a watershed to the extent that even modest rainstorms can generate dangerous flash floods and debris flows. To reduce public exposure to hazard, the U.S. Geological Survey produces post-fire debris-flow hazard assessments for select fires in the western United States. We use publicly available geospatial data describing basin morphology, burn severity, soil properties, and rainfall characteristics to estimate the statistical likelihood that debris flows will occur in response to a storm of a given rainfall intensity. Using an empirical database and refined geospatial analysis methods, we defined new equations for the prediction of debris-flow likelihood using logistic regression methods. We showed that the new logistic regression model outperformed previous models used to predict debris-flow likelihood.
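
    The statistical machinery is ordinary logistic regression, as in the minimal sketch below; the predictor names and simulated data are placeholders for the study's geospatial variables, and the published coefficients are not reproduced:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # One row per basin/storm. The three columns are illustrative stand-ins
      # for terms like (high-severity burn x steepness), soil erodibility,
      # and peak short-duration rainfall intensity.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(800, 3))
      y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] + 1.1 * X[:, 2]))))

      model = LogisticRegression().fit(X, y)
      p_debris_flow = model.predict_proba(X[:5])[:, 1]   # likelihood, 0 to 1
      print(np.round(p_debris_flow, 2))

    Evaluated over a grid of design-storm intensities, such fitted probabilities are exactly what the hazard assessments map for each burned basin.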

  18. Predictors of Self-Reported Likelihood of Working with Older Adults

    ERIC Educational Resources Information Center

    Eshbaugh, Elaine M.; Gross, Patricia E.; Satrom, Tatum

    2010-01-01

    This study examined the self-reported likelihood of working with older adults in a future career among 237 college undergraduates at a midsized Midwestern university. Although aging anxiety was not significantly related to likelihood of working with older adults, those students who had a greater level of death anxiety were less likely than other…

  19. Image segmentation and registration for the analysis of joint motion from 3D MRI

    NASA Astrophysics Data System (ADS)

    Hu, Yangqiu; Haynor, David R.; Fassbind, Michael; Rohr, Eric; Ledoux, William

    2006-03-01

    We report an image segmentation and registration method for studying joint morphology and kinematics from in vivo MRI scans and its application to the analysis of ankle joint motion. Using an MR-compatible loading device, a foot was scanned in a single neutral and seven dynamic positions including maximal flexion, rotation and inversion/eversion. A segmentation method combining graph cuts and level sets was developed which allows a user to interactively delineate 14 bones in the neutral position volume in less than 30 minutes total, including less than 10 minutes of user interaction. In the subsequent registration step, a separate rigid body transformation for each bone is obtained by registering the neutral position dataset to each of the dynamic ones, which produces an accurate description of the motion between them. We have processed six datasets, including 3 normal and 3 pathological feet. For validation our results were compared with those obtained from 3DViewnix, a semi-automatic segmentation program, and achieved good agreement in volume overlap ratios (mean: 91.57%, standard deviation: 3.58%) for all bones. Our tool requires only 1/50 and 1/150 of the user interaction time required by 3DViewnix and NIH Image Plus, respectively, an improvement that has the potential to make joint motion analysis from MRI practical in research and clinical applications.

  20. Biomechanical analysis of acromioclavicular joint dislocation treated with clavicle hook plates in different lengths.

    PubMed

    Shih, Cheng-Min; Huang, Kui-Chou; Pan, Chien-Chou; Lee, Cheng-Hung; Su, Kuo-Chih

    2015-11-01

    Clavicle hook plates are frequently used in clinical orthopaedics to treat acromioclavicular joint dislocation. However, patients often exhibit acromion osteolysis and peri-implant fracture after undergoing hook plate fixation. With the intent of avoiding future complications or fixation failure after clavicle hook plate fixation, we used finite element analysis (FEA) to investigate the biomechanics of clavicle hook plates of different materials and sizes when used in treating acromioclavicular joint dislocation. Using finite element analysis, this study constructed a model comprising four parts: clavicle, acromion, clavicle hook plate, and screws, and used the model to simulate implanting different types of clavicle hook plates in patients with acromioclavicular joint dislocation. Then, the biomechanics of stainless steel and titanium alloy clavicle hook plates containing either six or eight screw holes were investigated. The results indicated that using a longer clavicle hook plate decreased the stress value in the clavicle and mitigated the force that clavicle hook plates exert on the acromion. Using a clavicle hook plate material characterized by a smaller Young's modulus caused a slight increase in the stress on the clavicle. However, the external force the material imposed on the acromion was less than the force exerted on the clavicle. The findings of this study can serve as a reference to help orthopaedic surgeons select clavicle hook plates.

  1. Anticipating cognitive effort: roles of perceived error-likelihood and time demands.

    PubMed

    Dunn, Timothy L; Inzlicht, Michael; Risko, Evan F

    2017-11-13

    Why are some actions evaluated as effortful? In the present set of experiments we address this question by examining individuals' perception of effort when faced with a trade-off between two putative cognitive costs: how much time a task takes vs. how error-prone it is. Specifically, we were interested in whether individuals anticipate a small amount of hard work (i.e., low time requirement but high error-likelihood) or a large amount of easy work (i.e., high time requirement but low error-likelihood) as being more effortful. In between-subjects designs, Experiments 1 through 3 demonstrated that individuals anticipate options that are high in perceived error-likelihood (yet less time consuming) as more effortful than options that are perceived to be more time consuming (yet low in error-likelihood). Further, when asked to evaluate which of the two tasks was (a) more effortful, (b) more error-prone, and (c) more time consuming, effort-based and error-based choices closely tracked one another, but this was not the case for time-based choices. Utilizing a within-subject design, Experiment 4 demonstrated an overall pattern of judgments similar to that of Experiments 1 through 3; however, judgments of error-likelihood and time demand were similarly predictive of effort judgments. Results are discussed within the context of extant accounts of cognitive control, with considerations of how error-likelihood and time demands may independently and conjunctively factor into judgments of cognitive effort.

  2. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals, in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  3. Calibration of two complex ecosystem models with different likelihood functions

    NASA Astrophysics Data System (ADS)

    Hidy, Dóra; Haszpra, László; Pintér, Krisztina; Nagy, Zoltán; Barcza, Zoltán

    2014-05-01

    […] goodness metric on calibration. The different likelihoods are different functions of RMSE (root mean squared error) weighted by measurement uncertainty: exponential, linear, quadratic, and linear normalized by correlation. As a first calibration step, sensitivity analysis was performed in order to select the influential parameters which have a strong effect on the output data. In the second calibration step, only the sensitive parameters were calibrated (optimal values and confidence intervals were calculated). In the case of PaSim, more parameters were found responsible for 95% of the output data variance than in the case of BBGC MuSo. Analysis of the results of the optimized models revealed that the exponential likelihood estimation proved to be the most robust (best model simulation with optimized parameters, highest confidence interval increase). Cross-validation of the model simulations can help in constraining the highly uncertain greenhouse gas budget of grasslands.
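
    Since the abstract names the four metrics only loosely, the following stand-in definitions illustrate how such RMSE-based likelihoods might be coded; the exact functional forms used in the study may differ:

      import numpy as np

      def rmse(sim, obs, sigma):
          # RMSE weighted by per-point measurement uncertainty sigma.
          return np.sqrt(np.mean(((sim - obs) / sigma) ** 2))

      def likelihood(sim, obs, sigma, kind="exponential"):
          r = rmse(sim, obs, sigma)
          if kind == "exponential":
              return np.exp(-r)
          if kind == "linear":
              return max(0.0, 1.0 - r)
          if kind == "quadratic":
              return max(0.0, 1.0 - r**2)
          if kind == "linear-correlation":
              c = np.corrcoef(sim, obs)[0, 1]   # normalize by correlation
              return max(0.0, 1.0 - r) * max(0.0, c)
          raise ValueError(kind)

    Plugging each variant into the same parameter search makes the reported robustness comparison between the four metrics directly testable.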

  4. Structural analysis of three space crane articulated-truss joint concepts

    NASA Technical Reports Server (NTRS)

    Wu, K. Chauncey; Sutter, Thomas R.

    1992-01-01

    Three space crane articulated-truss joint concepts are studied to evaluate their static structural performance over a range of geometric design parameters. Emphasis is placed on maintaining the performance of the four-longeron reference truss across the joint while allowing large-angle articulation. A maximum positive articulation angle and the actuator length ratio required to reach that angle are computed for each concept as the design parameters are varied. Configurations with a maximum articulation angle less than 120 degrees, or with actuators requiring a length ratio over two, are not considered. Tip rotation and lateral deflection of a truss beam with an articulated-truss joint at midspan are used to select a point design for each concept. Deflections for one point design are up to 40 percent higher than for the other two designs. Dynamic performance of the three point designs is computed as a function of joint articulation angle. The two lowest frequencies of each point design are relatively insensitive to large variations in joint articulation angle. One point design has a higher maximum tip velocity for the emergency stop than the other designs.

  5. Estimation of joint stiffness with a compliant load.

    PubMed

    Ludvig, Daniel; Kearney, Robert E

    2009-01-01

    Joint stiffness defines the dynamic relationship between the position of the joint and the torque acting about it. It consists of two components: intrinsic and reflex stiffness. Many previous studies have investigated joint stiffness in an open-loop environment, because the current algorithm in use is an open-loop algorithm. This paper explores issues related to the estimation of joint stiffness when subjects interact with compliant loads. First, we show analytically how the bias in closed-loop estimates of joint stiffness depends on the properties of the load, the noise power, and length of the estimated impulse response functions (IRF). We then demonstrate with simulations that the open-loop analysis will fail completely for an elastic load but may succeed for an inertial load. We further show that the open-loop analysis can yield unbiased results with an inertial load and document IRF length, signal-to-noise ratio needed, and minimum inertia needed for the analysis to succeed. Thus, by using a load with a properly selected inertia, open-loop analysis can be used under closed-loop conditions.

  6. Planck 2015 results: XI. CMB power spectra, likelihoods, and robustness of parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aghanim, N.; Arnaud, M.; Ashdown, M.

    This study presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (ℓ < 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, n_s, is confirmed smaller than unity at more than 5σ from Planck alone. We further validate the

  7. Planck 2015 results. XI. CMB power spectra, likelihoods, and robustness of parameters

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Aghanim, N.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Bartolo, N.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chiang, H. C.; Christensen, P. R.; Clements, D. L.; Colombo, L. P. L.; Combet, C.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Di Valentino, E.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Gauthier, C.; Gerbino, M.; Giard, M.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hamann, J.; Hansen, F. K.; Harrison, D. L.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Holmes, W. A.; Hornstrup, A.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Le Jeune, M.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Lewis, A.; Liguori, M.; Lilje, P. B.; Lilley, M.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Macías-Pérez, J. F.; Maffei, B.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Meinhold, P. R.; Melchiorri, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Mottet, S.; Munshi, D.; Murphy, J. A.; Narimani, A.; Naselsky, P.; Nati, F.; Natoli, P.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Pratt, G. W.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rossetti, M.; Roudier, G.; Rouillé d'Orfeuil, B.; Rubiño-Martín, J. A.; Rusholme, B.; Salvati, L.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Serra, P.; Spencer, L. D.; Spinelli, M.; Stolyarov, V.; Stompor, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, F.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zonca, A.

    2016-09-01

    This paper presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (ℓ < 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, n_s, is confirmed smaller than unity at more than 5σ from Planck alone. We further validate the robustness of the

  8. Planck 2015 results: XI. CMB power spectra, likelihoods, and robustness of parameters

    DOE PAGES

    Aghanim, N.; Arnaud, M.; Ashdown, M.; ...

    2016-09-20

    This study presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (ℓ < 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, n_s, is confirmed smaller than unity at more than 5σ from Planck alone. We further validate the

  9. Theoretical Analysis of Penalized Maximum-Likelihood Patlak Parametric Image Reconstruction in Dynamic PET for Lesion Detection.

    PubMed

    Yang, Li; Wang, Guobao; Qi, Jinyi

    2016-04-01

    Detecting cancerous lesions is a major clinical application of emission tomography. In a previous work, we studied penalized maximum-likelihood (PML) image reconstruction for lesion detection in static PET. Here we extend our theoretical analysis of static PET reconstruction to dynamic PET. We study both the conventional indirect reconstruction and direct reconstruction for Patlak parametric image estimation. In indirect reconstruction, Patlak parametric images are generated by first reconstructing a sequence of dynamic PET images, and then performing Patlak analysis on the time activity curves (TACs) pixel-by-pixel. In direct reconstruction, Patlak parametric images are estimated directly from raw sinogram data by incorporating the Patlak model into the image reconstruction procedure. PML reconstruction is used in both the indirect and direct reconstruction methods. We use a channelized Hotelling observer (CHO) to assess lesion detectability in Patlak parametric images. Simplified expressions for evaluating the lesion detectability have been derived and applied to the selection of the regularization parameter value to maximize detection performance. The proposed method is validated using computer-based Monte Carlo simulations. Good agreements between the theoretical predictions and the Monte Carlo results are observed. Both theoretical predictions and Monte Carlo simulation results show the benefit of the indirect and direct methods under optimized regularization parameters in dynamic PET reconstruction for lesion detection, when compared with the conventional static PET reconstruction.
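
    For the indirect route described above, the per-pixel Patlak step is an ordinary linear fit. A minimal sketch, assuming a known plasma input function Cp(t) and that only late frames lie on the linear portion; the names and the frame-selection rule are illustrative, not the paper's implementation.

    ```python
    import numpy as np

    # Patlak graphical analysis for one voxel's time-activity curve (TAC):
    #   Ct(t)/Cp(t) = Ki * (int_0^t Cp ds) / Cp(t) + V
    # so the influx rate Ki is the slope of a straight-line fit.
    def patlak_fit(t, ct, cp):
        # cumulative integral of the input function (trapezoid rule)
        int_cp = np.concatenate(
            [[0.0], np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))])
        x = int_cp / cp          # "Patlak time" (assumes cp > 0)
        y = ct / cp
        late = slice(len(t) // 2, None)  # assumed linear portion
        ki, v = np.polyfit(x[late], y[late], 1)
        return ki, v
    ```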

  10. Functional magnetic resonance imaging during emotion recognition in social anxiety disorder: an activation likelihood meta-analysis

    PubMed Central

    Hattingh, Coenraad J.; Ipser, J.; Tromp, S. A.; Syal, S.; Lochner, C.; Brooks, S. J.; Stein, D. J.

    2012-01-01

    Background: Social anxiety disorder (SAD) is characterized by abnormal fear and anxiety in social situations. Functional magnetic resonance imaging (fMRI) is a brain imaging technique that can be used to demonstrate neural activation to emotionally salient stimuli. However, no attempt has yet been made to statistically collate fMRI studies of brain activation, using the activation likelihood-estimate (ALE) technique, in response to emotion recognition tasks in individuals with SAD. Methods: A systematic search of fMRI studies of neural responses to socially emotive cues in SAD was undertaken. ALE meta-analysis, a voxel-based meta-analytic technique, was used to estimate the most significant activations during emotional recognition. Results: Seven studies were eligible for inclusion in the meta-analysis, constituting a total of 91 subjects with SAD, and 93 healthy controls. The most significant areas of activation during emotional vs. neutral stimuli in individuals with SAD compared to controls were: bilateral amygdala, left medial temporal lobe encompassing the entorhinal cortex, left medial aspect of the inferior temporal lobe encompassing perirhinal cortex and parahippocampus, right anterior cingulate, right globus pallidus, and distal tip of right postcentral gyrus. Conclusion: The results are consistent with neuroanatomic models of the role of the amygdala in fear conditioning, and the importance of the limbic circuitry in mediating anxiety symptoms. PMID:23335892

  11. Experimental Investigation and Finite Element Analysis on Fatigue Behavior of Aluminum Alloy 7050 Single-Lap Joints

    NASA Astrophysics Data System (ADS)

    Zhou, Bing; Cui, Hao; Liu, Haibo; Li, Yang; Liu, Gaofeng; Li, Shujun; Zhang, Shangzhou

    2018-03-01

    The fatigue behavior of single-lap four-riveted aluminum alloy 7050 joints was investigated by using high-frequency fatigue test and scanning electron microscope (SEM). Stress distributions obtained by finite element (FE) analysis help explain the fatigue performance. The fatigue test results showed that the fatigue lives of the joints depend on cold expansion and applied cyclic loads. FE analysis and fractography indicated that the improved fatigue lives can be attributed to the reduction in maximum stress and evolution of fatigue damage at the critical location. The beneficial effects of strengthening techniques result in tearing ridges or lamellar structure on fracture surface, decrease in fatigue striations spacing, delay of fatigue crack initiation, crack deflection in fatigue crack propagation and plasticity-induced crack closure.

  12. Maximum likelihood phase-retrieval algorithm: applications.

    PubMed

    Nahrstedt, D A; Southwell, W H

    1984-12-01

    The maximum likelihood estimator approach is shown to be effective in determining the wave front aberration in systems involving laser and flow field diagnostics and optical testing. The robustness of the algorithm enables convergence even in cases of severe wave front error and real, nonsymmetrical, obscured amplitude distributions.

  13. Profile-likelihood Confidence Intervals in Item Response Theory Models.

    PubMed

    Chalmers, R Philip; Pek, Jolynn; Liu, Yang

    2017-01-01

    Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
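
    To make the PL CI construction concrete, here is a minimal sketch that inverts the likelihood-ratio statistic for a binomial proportion rather than an IRT model; the mechanics (profile the parameter, cut the deviance at the chi-square quantile) carry over, but everything model-specific is simplified away.

    ```python
    import numpy as np
    from scipy.optimize import brentq
    from scipy.stats import chi2

    # Profile-likelihood CI for a binomial proportion p, given k successes in
    # n trials: find the values of p where 2*(loglik(phat) - loglik(p))
    # equals the chi-square critical value with 1 degree of freedom.
    def profile_ci(k, n, level=0.95):
        crit = chi2.ppf(level, df=1)
        phat = k / n
        loglik = lambda p: k * np.log(p) + (n - k) * np.log(1 - p)
        g = lambda p: 2 * (loglik(phat) - loglik(p)) - crit
        lo = brentq(g, 1e-9, phat) if k > 0 else 0.0
        hi = brentq(g, phat, 1 - 1e-9) if k < n else 1.0
        return lo, hi

    print(profile_ci(7, 20))  # roughly (0.16, 0.57)
    ```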

  14. Imaging and Analysis of Void-defects in Solder Joints Formed in Reduced Gravity using High-Resolution Computed Tomography

    NASA Technical Reports Server (NTRS)

    Easton, John W.; Struk, Peter M.; Rotella, Anthony

    2008-01-01

    As a part of efforts to develop an electronics repair capability for long duration space missions, techniques and materials for soldering components on a circuit board in reduced gravity must be developed. This paper presents results from testing solder joint formation in low gravity on a NASA Reduced Gravity Research Aircraft. The results presented include joints formed using eutectic tin-lead solder and one of the following fluxes: (1) a no-clean flux core, (2) a rosin flux core, and (3) a solid solder wire with external liquid no-clean flux. The solder joints are analyzed with a computed tomography (CT) technique which imaged the interior of the entire solder joint. This replaced an earlier technique that required the solder joint to be destructively ground down revealing a single plane which was subsequently analyzed. The CT analysis technique is described and results presented with implications for future testing as well as implications for the overall electronics repair effort discussed.

  15. Exclusion probabilities and likelihood ratios with applications to mixtures.

    PubMed

    Slooten, Klaas-Jan; Egeland, Thore

    2016-01-01

    The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose, we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture, to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable. In this paper, we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout. In mixtures, artefacts like dropout and drop-in are commonly encountered and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model.

  16. Eigenvalue sensitivity analysis of planar frames with variable joint and support locations

    NASA Technical Reports Server (NTRS)

    Chuang, Ching H.; Hou, Gene J. W.

    1991-01-01

    Two sensitivity equations are derived in this study, based upon the continuum approach, for eigenvalue sensitivity analysis of planar frame structures with variable joint and support locations. A variational form of an eigenvalue equation is first derived in which all of the quantities are expressed in the local coordinate system attached to each member. The material derivative of this variational equation is then taken to account for changes in a member's length and orientation resulting from the perturbation of joint and support locations. Finally, eigenvalue sensitivity equations are formulated in either domain quantities (by the domain method) or boundary quantities (by the boundary method). It is concluded that the sensitivity equation derived by the boundary method is more efficient in computation but less accurate than that of the domain method. Nevertheless, both are superior in computational efficiency to the conventional direct differentiation method and the finite difference method.

  17. A comprehensive assessment of collision likelihood in Geosynchronous Earth Orbit

    NASA Astrophysics Data System (ADS)

    Oltrogge, D. L.; Alfano, S.; Law, C.; Cacioni, A.; Kelso, T. S.

    2018-06-01

    Knowing the likelihood of collision for satellites operating in Geosynchronous Earth Orbit (GEO) is of extreme importance and interest to the global community and the operators of GEO spacecraft. Yet for all of its importance, a comprehensive assessment of GEO collision likelihood is difficult and has never been done. In this paper, we employ six independent and diverse assessment methods to estimate GEO collision likelihood. Taken in aggregate, this comprehensive assessment offers new insights into GEO collision likelihood, with the six estimates agreeing to within a factor of 3.5 of each other. These results are then compared to four collision and seven encounter rate estimates previously published. Collectively, these new findings indicate that collision likelihood in GEO is as much as four orders of magnitude higher than previously published by other researchers. Results indicate that a collision is likely to occur every 4 years for one satellite out of the entire GEO active satellite population against a 1 cm RSO catalogue, and every 50 years against a 20 cm RSO catalogue. Further, previous assertions that collision relative velocities are low (i.e., <1 km/s) in GEO are disproven, with some GEO relative velocities as high as 4 km/s identified. These new findings indicate that unless operators successfully mitigate this collision risk, the GEO orbital arc is and will remain at high risk of collision, with the potential for serious follow-on collision threats from post-collision debris when a substantial GEO collision occurs.

  18. Neural network expert system for X-ray analysis of welded joints

    NASA Astrophysics Data System (ADS)

    Kozlov, V. V.; Lapik, N. V.; Popova, N. V.

    2018-03-01

    The use of intelligent technologies for the automated analysis of product quality is one of the main trends in modern machine building. At the same time, methods based on artificial neural networks, as the foundation for automated intelligent diagnostic systems, are developing rapidly across many spheres of human activity. Machine vision technologies make it possible to reliably detect characteristic patterns in the analyzed image, including defects of welded joints, from radiography data.

  19. Crystal plasticity finite element analysis of deformation behaviour in SAC305 solder joint

    NASA Astrophysics Data System (ADS)

    Darbandi, Payam

    Due to the awareness of the potential health hazards associated with the toxicity of lead (Pb), actions have been taken to eliminate or reduce the use of Pb in consumer products. Among the alternatives, tin (Sn) based solders have been used for the assembly of electronic systems. Anisotropy is of significant importance in all structural metals, but this characteristic is unusually strong in Sn, making Sn-based solder joints one of the best examples of the influence of anisotropy. The effect of anisotropy, arising from the crystal structure of tin and the large-grain microstructure, on the evolution of constitutive responses of microscale SAC305 solder joints is investigated. Insights into the effects of key microstructural features and dominant plastic deformation mechanisms influencing the measured relative activity of slip systems in SAC305 are obtained from a combination of optical microscopy, orientation imaging microscopy (OIM), slip plane trace analysis and crystal plasticity finite element (CPFE) modeling. Package-level SAC305 specimens were subjected to shear deformation in sequential steps and characterized using optical microscopy and OIM to identify the activity of slip systems. X-ray micro Laue diffraction and a high energy monochromatic X-ray beam were employed to characterize joint-scale tensile samples, providing the information needed to compare with and validate the CPFE model. A CPFE model was developed that can account for the relative ease of activating slip systems in SAC305 solder, based upon a statistical estimate of the correlation between the critical resolved shear stress and the probability of activating various slip systems. The results from simulations show that the CPFE model developed using the statistical analysis of slip system activity not only can satisfy the requirements associated with the kinematics of plastic deformation in crystal coordinate systems (activity of slip systems) and the global coordinate system (shape changes

  20. A Space Object Detection Algorithm using Fourier Domain Likelihood Ratio Test

    NASA Astrophysics Data System (ADS)

    Becker, D.; Cain, S.

    Space object detection is of great importance in the highly dependent yet competitive and congested space domain. Detection algorithms employed play a crucial role in fulfilling the detection component in the situational awareness mission to detect, track, characterize and catalog unknown space objects. Many current space detection algorithms use a matched filter or a spatial correlator to make a detection decision at a single pixel point of a spatial image based on the assumption that the data follows a Gaussian distribution. This paper explores the potential for detection performance advantages when operating in the Fourier domain of long exposure images of small and/or dim space objects from ground based telescopes. A binary hypothesis test is developed based on the joint probability distribution function of the image under the hypothesis that an object is present and under the hypothesis that the image only contains background noise. The detection algorithm tests each pixel point of the Fourier transformed images to make the determination if an object is present based on the criteria threshold found in the likelihood ratio test. Using simulated data, the performance of the Fourier domain detection algorithm is compared to the current algorithm used in space situational awareness applications to evaluate its value.
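
    A hedged sketch of the flavor of test described: for white Gaussian noise and a known point-spread function, the generalized likelihood-ratio statistic at every candidate object location can be computed with FFTs. The paper's exact joint PDF and threshold selection are not reproduced; the names are illustrative.

    ```python
    import numpy as np

    # GLRT for "point object with unknown amplitude" vs. "noise only",
    # evaluated at every pixel shift via the correlation theorem. Values
    # exceeding a threshold (set from the desired false-alarm rate) declare
    # a detection at that shift.
    def fourier_glrt(image, psf, sigma):
        I = np.fft.fft2(image)
        H = np.fft.fft2(psf, s=image.shape)
        score = np.real(np.fft.ifft2(I * np.conj(H)))  # matched-filter outputs
        energy = np.sum(psf ** 2)
        return score ** 2 / (2 * sigma ** 2 * energy)  # log-LR per shift
    ```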

  1. Pathological Knee Joint Motion Analysis By High Speed Cinephotography

    NASA Astrophysics Data System (ADS)

    Baumann, Jurg U.

    1985-02-01

    The use of cinephotography for evaluation of disturbed knee joint function was compared in three groups of patients. While a sampling rate of 50 images per second was adequate for patients with neuromuscular disorders, a higher frequency of around 300 i.p.s. is necessary in osteoarthritis and ligamentous knee joint injuries, but the task of digitizing is prohibitive unless automated.

  2. Empirical likelihood-based tests for stochastic ordering

    PubMed Central

    BARMI, HAMMOU EL; MCKEAGUE, IAN W.

    2013-01-01

    This paper develops an empirical likelihood approach to testing for the presence of stochastic ordering among univariate distributions based on independent random samples from each distribution. The proposed test statistic is formed by integrating a localized empirical likelihood statistic with respect to the empirical distribution of the pooled sample. The asymptotic null distribution of this test statistic is found to have a simple distribution-free representation in terms of standard Brownian bridge processes. The approach is used to compare the lengths of rule of Roman Emperors over various historical periods, including the “decline and fall” phase of the empire. In a simulation study, the power of the proposed test is found to improve substantially upon that of a competing test due to El Barmi and Mukerjee. PMID:23874142

  3. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach, with the aim of improving sampling efficiency for multiple-metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple-metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach over LHS: (1) It is more effective and efficient; for example, the simulation time required to generate 1000 behavioral parameter sets is nine times shorter. (2) The Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, indicating better forecasting accuracy of the ɛ-NSGAII parameter sets. (3) The parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in appropriate ranges rather than uniform, in accordance with their physical significance, and parameter uncertainties are reduced significantly. (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). Flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple-metrics uncertainty analysis under the GLUE framework, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
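
    The GLUE step itself is simple once a sampler is chosen; below is a minimal sketch using plain Monte Carlo in place of ɛ-NSGAII, with a Nash-Sutcliffe efficiency as the informal likelihood measure. The model interface, parameter bounds, and behavioral threshold are assumptions for illustration.

    ```python
    import numpy as np

    # Keep parameter sets whose simulated series matches the observations
    # well enough ("behavioral" sets); their spread quantifies predictive
    # uncertainty under GLUE.
    def glue(model, bounds, obs, n_samples=10_000, threshold=0.7, seed=None):
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds, dtype=float).T
        behavioral = []
        for _ in range(n_samples):
            theta = rng.uniform(lo, hi)
            sim = model(theta)
            # Nash-Sutcliffe efficiency as the informal likelihood measure
            nse = 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
            if nse > threshold:
                behavioral.append((theta, nse))
        return behavioral
    ```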

  4. Natural progression of blood-induced joint damage in patients with haemophilia: clinical relevance and reproducibility of three-dimensional gait analysis.

    PubMed

    Lobet, S; Detrembleur, C; Francq, B; Hermans, C

    2010-09-01

    A major complication of haemophilia is the destruction of joint cartilage because of recurrent intraarticular and intramuscular bleeds. Joint assessment is therefore critical to quantify the extent of joint damage, which has traditionally been evaluated using both radiological and clinical joint scores. Our study aimed to evaluate the natural progression of haemophilic arthropathy using three-dimensional gait analysis (3DGA) and to assess the reproducibility of this technique. We hypothesized that musculoskeletal function was relatively stable in patients with haemophilia. Eighteen adults with established haemophilic arthropathy were evaluated twice by 3DGA (mean follow-up: 18 +/- 5 weeks). Unexpectedly, our findings revealed infraclinical deterioration of the gait pattern, characterized by a 3.2% decrease in the recovery index, which is indicative of the subject's ability to save energy while walking. A tendency towards modification of segmental joint function was also observed. Gait analysis was sufficiently reproducible with regard to spatiotemporal parameters as well as kinetic, mechanical and energetic gait variables. The kinematic variables were reproducible in both the sagittal and frontal planes. In conclusion, 3DGA is a reproducible tool to assess abnormal gait patterns and monitor natural disease progression in haemophilic patients.

  5. Closed-loop carrier phase synchronization techniques motivated by likelihood functions

    NASA Technical Reports Server (NTRS)

    Tsou, H.; Hinedi, S.; Simon, M.

    1994-01-01

    This article reexamines the notion of closed-loop carrier phase synchronization motivated by the theory of maximum a posteriori phase estimation with emphasis on the development of new structures based on both maximum-likelihood and average-likelihood functions. The criterion of performance used for comparison of all the closed-loop structures discussed is the mean-squared phase error for a fixed-loop bandwidth.
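
    As a concrete illustration of a likelihood-motivated closed loop, here is a minimal BPSK sketch: differentiating the log-likelihood with respect to the carrier phase yields (at high SNR) the classic in-phase-times-quadrature error signal driving a first-order loop. The loop gain and signal model are assumptions, not the article's structures.

    ```python
    import numpy as np

    # First-order carrier phase tracking loop for complex BPSK samples rx[n].
    # The error term z.real * z.imag is proportional to the derivative of the
    # log-likelihood with respect to phase, which is what makes the loop
    # "likelihood-motivated".
    def track_phase(rx, loop_gain=0.05):
        theta, est = 0.0, np.empty(len(rx))
        for n, r in enumerate(rx):
            z = r * np.exp(-1j * theta)              # de-rotate by estimate
            theta += loop_gain * (z.real * z.imag)   # likelihood-derived error
            est[n] = theta
        return est
    ```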

  6. Strength Variation of Parachute Joints

    NASA Technical Reports Server (NTRS)

    Mollmann, Catherine

    2017-01-01

    A parachute joint is defined as a location where a component is sewn or connected to another component. During the design and developmental phase of a parachute system, the joints for each structural component are isolated and tested through a process called seam and joint testing. The objective of seam and joint testing is to determine the degradation on a single component due to interaction with other components; this data is then used when calculating the margin of safety for that component. During the engineering developmental phase of CPAS (Capsule Parachute Assembly System), the parachute system for the NASA Orion Crew Module, testing was completed for every joint of the six subsystems: the four parachutes (main, drogue, pilot, and FBCP [forward bay cover parachute]), the retention release bridle, and the retention panels. The number of joint tests for these subsystems totaled 92, which provides a plethora of data and results for further analysis. In this paper, the data and results of these seam and joint tests are examined to determine the effects, if any, of different operators and sewing machines on the strength of parachute joints. Other variables are also studied to determine their effect on joint strength, such as joint complexity, joint strength magnitude, material type, and material construction. Findings reveal that an optimally-run seam and joint test program could result in an increased understanding of the structure of the parachute; this should lead to a parachute built with optimal components, potentially saving system weight and volume.

  7. Assessing the likelihood of volcanic eruption through analysis of volcanotectonic earthquake fault plane solutions

    NASA Astrophysics Data System (ADS)

    Roman, D. C.; Neuberg, J.; Luckett, R. R.

    2006-08-01

    Episodes of volcanic unrest do not always lead to an eruption. Many of the commonly monitored signals of volcanic unrest, including surface deformation and increased degassing, can reflect perturbations to a deeper magma storage system, and may persist for years without accompanying eruptive activity. Signals of volcanic unrest can also persist following the end of an eruption. Furthermore, the most reliable eruption precursor, the occurrence of low-frequency seismicity, appears to reflect very shallow processes and typically precedes eruptions by only hours to days. Thus, the identification of measurable and unambiguous indicators that are sensitive to changes in the mid-level conduit system during an intermediate stage of magma ascent is of critical importance to the field of volcano monitoring. Here, using data from the ongoing eruption of the Soufrière Hills Volcano, Montserrat, we show that ˜90° changes in the orientation of double-couple fault-plane solutions for high-frequency 'volcanotectonic' (VT) earthquakes reflect pressurization of the mid-level conduit system prior to eruption and may precede the onset of eruptive episodes by weeks to months. Our results demonstrate that, once the characteristic stress field response to magma ascent at a given volcano is established, a relatively simple analysis of VT fault-plane solutions may be used to make intermediate-term assessments of the likelihood of future eruptive activity.

  8. Dynamic Analyses Including Joints Of Truss Structures

    NASA Technical Reports Server (NTRS)

    Belvin, W. Keith

    1991-01-01

    Method for mathematically modeling joints to assess influences of joints on dynamic response of truss structures developed in study. Only structures with low-frequency oscillations considered; only Coulomb friction and viscous damping included in analysis. Focus of effort to obtain finite-element mathematical models of joints exhibiting load-vs.-deflection behavior similar to measured load-vs.-deflection behavior of real joints. Experiments performed to determine stiffness and damping nonlinearities typical of joint hardware. Algorithm for computing coefficients of analytical joint models based on test data developed to enable study of linear and nonlinear effects of joints on global structural response. Besides intended application to large space structures, applications in nonaerospace community include ground-based antennas and earthquake-resistant steel-framed buildings.

  9. Likelihood ratio decisions in memory: three implied regularities.

    PubMed

    Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T

    2009-06-01

    We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.
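
    A one-function sketch of the decision variable in question, for the equal-variance Gaussian model (one of the four model families analyzed); the parameterization is generic, not the authors' notation.

    ```python
    from scipy.stats import norm

    # Log likelihood ratio of "old" (mean d_prime) vs. "new" (mean 0) at
    # evidence value x. Deciding "old" when the LR exceeds 1 places the
    # criterion at x = d_prime / 2, the symmetry behind the mirror effect.
    def log_lr(x, d_prime, sigma=1.0):
        return norm.logpdf(x, d_prime, sigma) - norm.logpdf(x, 0.0, sigma)
    ```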

  10. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    PubMed

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero counts. Some of these are "true zeros", indicating that the drug-adverse event pair cannot occur; they are distinguished from the remaining zero counts, which simply indicate that the drug-adverse event pair has not occurred, or has not been reported, yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters are obtained using the expectation-maximization algorithm. The test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed method is shown to asymptotically control the type I error and false discovery rate, and its finite-sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test performs similarly to the Poisson model based likelihood ratio test when the estimated percentage of true zeros in the database is small. Both methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
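
    The EM step mentioned above is short to write down. A minimal sketch for fitting the zero-inflated Poisson itself; the likelihood-ratio signal test built on top of such fits is omitted, and the initialization and iteration count are arbitrary choices.

    ```python
    import numpy as np

    # Zero-inflated Poisson: P(0) = pi + (1-pi)exp(-lam), and for k > 0,
    # P(k) = (1-pi) * Poisson(k; lam). EM treats "is this zero structural?"
    # as the latent variable.
    def zip_em(y, n_iter=200):
        y = np.asarray(y, dtype=float)
        pi, lam = 0.5, max(y.mean(), 1e-6)
        for _ in range(n_iter):
            # E-step: posterior probability each observed zero is a true zero
            w = np.where(y == 0, pi / (pi + (1 - pi) * np.exp(-lam)), 0.0)
            # M-step: update mixing weight and Poisson mean
            pi = w.mean()
            lam = y.sum() / (1 - w).sum()
        return pi, lam
    ```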

  11. Analysis and testing of a space crane articulating joint testbed

    NASA Technical Reports Server (NTRS)

    Sutter, Thomas R.; Wu, K. Chauncey

    1992-01-01

    The topics are presented in viewgraph form and include: space crane concept with mobile base; mechanical versus structural articulating joint; articulating joint test bed and reference truss; static and dynamic characterization completed for space crane reference truss configuration; improved linear actuators reduce articulating joint test bed backlash; 1-DOF space crane slew maneuver; boom 2 tip transient response finite element dynamic model; boom 2 tip transient response shear-corrected component modes torque driver profile; peak root member force vs. slew time torque driver profile; and open loop control of space crane motion.

  12. Axisymmetric shell analysis of the Space Shuttle solid rocket booster field joint

    NASA Technical Reports Server (NTRS)

    Nemeth, Michael P.; Anderson, Melvin S.

    1989-01-01

    The Space Shuttle Challenger (STS 51-L) accident led to an intense investigation of the structural behavior of the solid rocket booster (SRB) tang and clevis field joints. The presence of structural deformations between the clevis inner leg and the tang, substantial enough to prevent the O-ring seals from eliminating hot gas flow through the joints, has emerged as a likely cause of the vehicle failure. This paper presents results of axisymmetric shell analyses that parametrically assess the structural behavior of SRB field joints subjected to quasi-steady-state internal pressure loading for both the original joint flown on mission STS 51-L and the redesigned joint recently flown on the Space Shuttle Discovery. Discussion of axisymmetric shell modeling issues and details is presented and a generic method for simulating contact between adjacent shells of revolution is described. Results are presented that identify the performance trends of the joints for a wide range of joint parameters.

  13. Finite mixture model: A maximum likelihood estimation approach on time series data

    NASA Astrophysics Data System (ADS)

    Yen, Phoong Seuk; Ismail, Mohd Tahir; Hamzah, Firdaus Mohamad

    2014-09-01

    Recently, statisticians have emphasized fitting finite mixture models by maximum likelihood estimation because of its desirable asymptotic properties: the estimator is consistent and asymptotically unbiased as the sample size increases to infinity. Moreover, the parameter estimates obtained by maximum likelihood estimation have the smallest variance compared with other statistical methods as the sample size increases. Maximum likelihood estimation is therefore adopted in this paper to fit a two-component mixture model in order to explore the relationship between rubber price and exchange rate for Malaysia, Thailand, the Philippines and Indonesia. The results show a negative relationship between rubber price and exchange rate for all selected countries.
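
    A compact sketch of maximum-likelihood fitting of a two-component Gaussian mixture by EM, the generic version of the model class used here; the paper's rubber-price/exchange-rate application is not reproduced, and the initialization is a simple assumption.

    ```python
    import numpy as np
    from scipy.stats import norm

    # EM for a two-component Gaussian mixture; returns the MLE of the mixing
    # weight and the two component parameter pairs.
    def em_mixture2(x, n_iter=500):
        x = np.asarray(x, dtype=float)
        w, mu1, mu2, s1, s2 = 0.5, x.min(), x.max(), x.std(), x.std()
        for _ in range(n_iter):
            num = w * norm.pdf(x, mu1, s1)
            r = num / (num + (1 - w) * norm.pdf(x, mu2, s2))  # responsibilities
            w = r.mean()
            mu1, mu2 = np.average(x, weights=r), np.average(x, weights=1 - r)
            s1 = np.sqrt(np.average((x - mu1) ** 2, weights=r))
            s2 = np.sqrt(np.average((x - mu2) ** 2, weights=1 - r))
        return w, (mu1, s1), (mu2, s2)
    ```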

  14. Applications of non-standard maximum likelihood techniques in energy and resource economics

    NASA Astrophysics Data System (ADS)

    Moeltner, Klaus

    Two important types of non-standard maximum likelihood techniques, Simulated Maximum Likelihood (SML) and Pseudo-Maximum Likelihood (PML), have only recently found consideration in the applied economic literature. The objective of this thesis is to demonstrate how these methods can be successfully employed in the analysis of energy and resource models. Chapter I focuses on SML. It constitutes the first application of this technique in the field of energy economics. The framework is as follows: Surveys on the cost of power outages to commercial and industrial customers usually capture multiple observations on the dependent variable for a given firm. The resulting pooled data set is censored and exhibits cross-sectional heterogeneity. We propose a model that addresses these issues by allowing regression coefficients to vary randomly across respondents and by using the Geweke-Hajivassiliou-Keane simulator and Halton sequences to estimate high-order cumulative distribution terms. This adjustment requires the use of SML in the estimation process. Our framework allows for a more comprehensive analysis of outage costs than existing models, which rely on the assumptions of parameter constancy and cross-sectional homogeneity. Our results strongly reject both of these restrictions. The central topic of the second Chapter is the use of PML, a robust estimation technique, in count data analysis of visitor demand for a system of recreation sites. PML has been popular with researchers in this context, since it guards against many types of mis-specification errors. We demonstrate, however, that estimation results will generally be biased even if derived through PML if the recreation model is based on aggregate, or zonal data. To countervail this problem, we propose a zonal model of recreation that captures some of the underlying heterogeneity of individual visitors by incorporating distributional information on per-capita income into the aggregate demand function. This adjustment

  15. Joint symbolic dynamic analysis of cardiorespiratory interactions in patients on weaning trials.

    PubMed

    Caminal, P; Giraldo, B; Zabaleta, H; Vallverdu, M; Benito, S; Ballesteros, D; Lopez-Rodriguez, L; Esteban, A; Baumert, M; Voss, A

    2005-01-01

    Assessing autonomic control provides information about patho-physiological imbalances. Measures of the variability of the cardiac interbeat duration RR(n) and the variability of the breath duration TTot(n) are sensitive to those changes. The interactions between RR(n) and TTot(n) are complex and strongly non-linear. A study of joint symbolic dynamics is presented as a new short-term non-linear analysis method to investigate these interactions in patients on weaning trials. Seventy-eight patients being weaned from mechanical ventilation were studied: Group A (patients that failed to maintain spontaneous breathing and were reconnected) and Group B (patients with successful trials). Using the concept of joint symbolic dynamics, cardiac and respiratory changes were transformed into a word series, and the probability of occurrence of each word type was calculated and compared between the two groups. Significant differences were found for 13 words, the most significant being pn(Wc010, r010): 0.0041 ± 0.0036 (group A) against 0.0012 ± 0.0024 (group B), p-value = 0.00001. The number of seldom-occurring word types (forbidden words) also showed significant differences, fwcr: 6.9 ± 6.6 against 13.5 ± 5.3, p-value = 0.00004. Joint symbolic dynamics provides an efficient non-linear representation of cardiorespiratory interactions that offers simple physiological interpretations.
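
    The word-construction step of joint symbolic dynamics is easy to sketch, assuming the usual convention (an increase coded 1, a non-increase 0, words of three successive symbols); the paper's (Wc010, r010) notation maps onto such (cardiac-word, respiratory-word) pairs, though its exact coding may differ.

    ```python
    import numpy as np
    from collections import Counter

    # Transform RR(n) and TTot(n) into binary symbol series, group into words
    # of three, and estimate the joint probability of each (cardiac,
    # respiratory) word pair. Pairs that rarely or never occur are the
    # "forbidden words" counted in the paper.
    def jsd_word_probs(rr, ttot, word_len=3):
        sc = (np.diff(rr) > 0).astype(int)     # cardiac symbols
        sr = (np.diff(ttot) > 0).astype(int)   # respiratory symbols
        n = min(len(sc), len(sr)) - word_len + 1
        words = Counter(
            (tuple(sc[i:i + word_len]), tuple(sr[i:i + word_len]))
            for i in range(n))
        return {w: c / n for w, c in words.items()}
    ```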

  16. Upper limb joint kinetic analysis during tennis serve: Assessment of competitive level on efficiency and injury risks.

    PubMed

    Martin, C; Bideau, B; Ropars, M; Delamarche, P; Kulpa, R

    2014-08-01

    The aim of this work was to compare the joint kinetics and stroke production efficiency for the shoulder, elbow, and wrist during the serve between professionals and advanced tennis players and to discuss their potential relationship with given overuse injuries. Eleven professional and seven advanced tennis players were studied with an optoelectronic motion analysis system while performing serves. Normalized peak kinetic values of the shoulder, elbow, and wrist joints were calculated using inverse dynamics. To measure serve efficiency, all normalized peak kinetic values were divided by ball velocity. t-tests were used to determine significant differences between the resultant joint kinetics and efficiency values in both groups (advanced vs professional). Shoulder inferior force, shoulder anterior force, shoulder horizontal abduction torque, and elbow medial force were significantly higher in advanced players. Professional players were more efficient than advanced players, as they maximize ball velocity with lower joint kinetics. Since advanced players are subjected to higher joint kinetics, the results suggest that they appeared more susceptible to high risk of shoulder and elbow injuries than professionals, especially during the cocking and deceleration phases of the serve. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    NASA Astrophysics Data System (ADS)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for coded cooperation system with joint iterative decoding in the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles including both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation well combines cooperation gain and channel coding gain, and outperforms the coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.

  18. On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood

    ERIC Educational Resources Information Center

    Karabatsos, George

    2017-01-01

    This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon…

  19. Risk Assessment Using the Three Dimensions of Probability (Likelihood), Severity, and Level of Control

    NASA Technical Reports Server (NTRS)

    Watson, Clifford

    2010-01-01

    Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by the relative likelihood and severity of the residual risk. These matrices present a quick look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes while adding a new, third dimension, shown as the Z-axis and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training). 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a "tall pole" for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting, gives a visual confirmation of the relationship between causes and their controls.
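
    A toy rendering of the three-dimensional idea, purely illustrative: the ordinal scales and the multiplicative scoring rule below are assumptions, not the paper's method.

    ```python
    # Z-axis levels from the hazard reduction precedence sequence: a higher
    # step number corresponds to a weaker form of control.
    LEVEL_OF_CONTROL = {
        1: "eliminate risk through design",
        2: "substitute less risky materials",
        3: "safety devices",
        4: "caution and warning devices",
        5: "administrative controls",
        6: "protective clothing and equipment",
    }

    def risk_score(likelihood, severity, control_step):
        # Scale the usual 2-D likelihood x severity cell by the level of
        # control, so poorly controlled hazards stand out as "tall poles".
        return likelihood * severity * control_step

    print(risk_score(3, 4, 5), "-", LEVEL_OF_CONTROL[5])
    ```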

  20. A parametric shell analysis of the shuttle 51-L SRB AFT field joint

    NASA Technical Reports Server (NTRS)

    Davis, Randall C.; Bowman, Lynn M.; Hughes, Robert M., IV; Jackson, Brian J.

    1990-01-01

    Following the Shuttle 51-L accident, an investigation was conducted to determine the cause of the failure. Investigators at the Langley Research Center focused attention on the structural behavior of the field joints with O-ring seals in the steel solid rocket booster (SRB) cases. The shell-of-revolution computer program BOSOR4 was used to model the aft field joint of the solid rocket booster case. The shell model consisted of the SRB wall and joint geometry present during the Shuttle 51-L flight. A parametric study of the joint was performed on the geometry, including joint clearances, contact between the joint components, and on the loads, induced and applied. In addition, combinations of geometry and loads were evaluated. The analytical results from the parametric study showed that contact between the joint components was a primary contributor to allowing hot gases to blow by the O-rings. Based upon understanding the original joint behavior, various proposed joint modifications are shown and analyzed in order to provide additional insight and information. Finally, experimental results from a hydrostatic pressurization of a test rocket booster case to study joint motion are presented and verified analytically.

  1. New algorithms and methods to estimate maximum-likelihood phylogenies: assessing the performance of PhyML 3.0.

    PubMed

    Guindon, Stéphane; Dufayard, Jean-François; Lefort, Vincent; Anisimova, Maria; Hordijk, Wim; Gascuel, Olivier

    2010-05-01

    PhyML is a phylogeny software based on the maximum-likelihood principle. Early PhyML versions used a fast algorithm performing nearest neighbor interchanges to improve a reasonable starting tree topology. Since the original publication (Guindon S., Gascuel O. 2003. A simple, fast and accurate algorithm to estimate large phylogenies by maximum likelihood. Syst. Biol. 52:696-704), PhyML has been widely used (>2500 citations in ISI Web of Science) because of its simplicity and a fair compromise between accuracy and speed. In the meantime, research around PhyML has continued, and this article describes the new algorithms and methods implemented in the program. First, we introduce a new algorithm to search the tree space with user-defined intensity using subtree pruning and regrafting topological moves. The parsimony criterion is used here to filter out the least promising topology modifications with respect to the likelihood function. The analysis of a large collection of real nucleotide and amino acid data sets of various sizes demonstrates the good performance of this method. Second, we describe a new test to assess the support of the data for internal branches of a phylogeny. This approach extends the recently proposed approximate likelihood-ratio test and relies on a nonparametric, Shimodaira-Hasegawa-like procedure. A detailed analysis of real alignments sheds light on the links between this new approach and the more classical nonparametric bootstrap method. Overall, our tests show that the last version (3.0) of PhyML is fast, accurate, stable, and ready to use. A Web server and binary files are available from http://www.atgc-montpellier.fr/phyml/.

  2. Can symptomatic acromioclavicular joints be differentiated from asymptomatic acromioclavicular joints on 3-T MR imaging?

    PubMed

    Choo, Hye Jung; Lee, Sun Joo; Kim, Jung Han; Cha, Seong Sook; Park, Young Mi; Park, Ji Sung; Lee, Jun Woo; Oh, Minkyung

    2013-04-01

    To evaluate retrospectively whether symptomatic acromioclavicular joints can be differentiated from asymptomatic acromioclavicular joints on 3-T MR imaging. This study included 146 patients who underwent physical examination of the acromioclavicular joints and 3-T MR imaging of the shoulder. Among them, 67 patients with positive results on physical examination were assigned to the symptomatic group, whereas 79 with negative results were assigned to the asymptomatic group. The following MR findings were compared between the symptomatic and asymptomatic groups: presence of osteophytes, articular surface irregularity, subchondral cysts, acromioclavicular joint fluid, subacromial fluid, subacromial bony spurs, joint capsular distension, bone edema, intraarticular enhancement, periarticular enhancement, degree of superior and inferior joint capsular distension, and joint capsular thickness. The patients were subsequently divided into subgroups based on age (younger, older) and the method of MR arthrography (direct, indirect), and all the MR findings in each subgroup were reanalyzed. The meaningful cutoff value of each significant continuous variable was calculated using receiver operating characteristic analysis. The degree of superior capsular distension was the only significant MR finding for symptomatic acromioclavicular joints, with a meaningful cutoff value of 2.1 mm. After subgroup analyses, this variable remained significant in the older age group and the indirect MR arthrography group. On 3-T MR imaging, the degree of superior joint capsular distension may be a predictive MR finding in the diagnosis of symptomatic acromioclavicular joints. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  3. Predicting likelihood of seeking help through the employee assistance program among salaried and union hourly employees.

    PubMed

    Delaney, W; Grube, J W; Ames, G M

    1998-03-01

    This research investigated belief, social support and background predictors of employee likelihood to use an Employee Assistance Program (EAP) for a drinking problem. An anonymous cross-sectional survey was administered in the home. Bivariate analyses and simultaneous equations path analysis were used to explore a model of EAP use. Survey and ethnographic research were conducted in a unionized heavy machinery manufacturing plant in the central states of the United States. A random sample of 852 hourly and salaried employees was selected. In addition to background variables, measures included: likelihood of going to an EAP for a drinking problem, belief the EAP can help, social support for the EAP from co-workers/others, belief that EAP use will harm employment, and supervisor encourages the EAP for potential drinking problems. Belief in EAP efficacy directly increased the likelihood of going to an EAP. Greater perceived social support and supervisor encouragement increased the likelihood of going to an EAP both directly and indirectly through perceived EAP efficacy. Black and union hourly employees were more likely to say they would use an EAP. Males and those who reported drinking during working hours were less likely to say they would use an EAP for a drinking problem. EAP beliefs and social support have significant effects on likelihood to go to an EAP for a drinking problem. EAPs may wish to focus their efforts on creating an environment where there is social support from coworkers and encouragement from supervisors for using EAP services. Union networks and team members have an important role to play in addition to conventional supervisor intervention.

  4. MIXOR: a computer program for mixed-effects ordinal regression analysis.

    PubMed

    Hedeker, D; Gibbons, R D

    1996-03-01

    MIXOR provides maximum marginal likelihood estimates for mixed-effects ordinal probit, logistic, and complementary log-log regression models. These models can be used for analysis of dichotomous and ordinal outcomes from either a clustered or longitudinal design. For clustered data, the mixed-effects model assumes that data within clusters are dependent. The degree of dependency is jointly estimated with the usual model parameters, thus adjusting for dependence resulting from clustering of the data. Similarly, for longitudinal data, the mixed-effects approach can allow for individual-varying intercepts and slopes across time, and can estimate the degree to which these time-related effects vary in the population of individuals. MIXOR uses marginal maximum likelihood estimation, utilizing a Fisher-scoring solution. For the scoring solution, the Cholesky factor of the random-effects variance-covariance matrix is estimated, along with the effects of model covariates. Examples illustrating usage and features of MIXOR are provided.
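
    MIXOR itself implements Fisher scoring for ordinal probit/logit models; as a hedged illustration of the same marginal maximum likelihood idea, the sketch below integrates a random intercept out of a logistic model with Gauss-Hermite quadrature on synthetic clustered binary data (the simplest member of this model family, not the MIXOR code):

        # Sketch: marginal ML for a random-intercept logistic model via
        # Gauss-Hermite quadrature. Synthetic clustered binary outcomes.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        rng = np.random.default_rng(1)
        n_clusters, n_per = 50, 10
        x = rng.normal(size=(n_clusters, n_per))
        b = rng.normal(0.0, 0.8, size=(n_clusters, 1))      # cluster effects
        y = rng.binomial(1, expit(-0.5 + 1.0 * x + b))

        nodes, weights = np.polynomial.hermite.hermgauss(15)

        def neg_marginal_loglik(theta):
            b0, b1, log_sigma = theta
            sigma = np.exp(log_sigma)
            ll = 0.0
            for i in range(n_clusters):
                # evaluate the cluster likelihood at each quadrature node
                eta = b0 + b1 * x[i][:, None] + np.sqrt(2.0) * sigma * nodes[None, :]
                p = expit(eta)
                lik = np.where(y[i][:, None] == 1, p, 1.0 - p)
                ll += np.log((weights / np.sqrt(np.pi)) @ lik.prod(axis=0))
            return -ll

        res = minimize(neg_marginal_loglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
        print("fixed effects:", res.x[:2], "random-effect SD:", np.exp(res.x[2]))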

  5. Efficient Bit-to-Symbol Likelihood Mappings

    NASA Technical Reports Server (NTRS)

    Moision, Bruce E.; Nakashima, Michael A.

    2010-01-01

    This innovation is an efficient algorithm designed to perform bit-to-symbol and symbol-to-bit likelihood mappings that represent a significant portion of the complexity of an error-correction code decoder for high-order constellations. Recent implementation of the algorithm in hardware has yielded an 8-percent reduction in overall area relative to the prior design.
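
    The record does not disclose the algorithm itself; as background, here is a minimal sketch of the standard symbol-to-bit likelihood mapping for a hypothetical Gray-labelled 4-PAM constellation, showing both the exact log-sum-exp LLRs and the max-log simplification that hardware implementations typically exploit:

        # Sketch: symbol-to-bit LLR mapping under AWGN (illustrative only).
        import numpy as np

        symbols = np.array([-3.0, -1.0, 1.0, 3.0])   # 4-PAM points
        labels = np.array([0b00, 0b01, 0b11, 0b10])  # Gray labels
        sigma2, r = 0.5, 0.7                         # noise variance, rx sample

        sym_llh = -(r - symbols) ** 2 / (2 * sigma2)  # symbol log-likelihoods

        for bit in range(2):
            mask = ((labels >> bit) & 1).astype(bool)
            # exact: marginalise symbol likelihoods over each bit hypothesis
            llr = (np.logaddexp.reduce(sym_llh[~mask]) -
                   np.logaddexp.reduce(sym_llh[mask]))
            llr_maxlog = sym_llh[~mask].max() - sym_llh[mask].max()
            print(f"bit {bit}: LLR={llr:+.3f}, max-log={llr_maxlog:+.3f}")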

  6. Copula Regression Analysis of Simultaneously Recorded Frontal Eye Field and Inferotemporal Spiking Activity during Object-Based Working Memory

    PubMed Central

    Hu, Meng; Clark, Kelsey L.; Gong, Xiajing; Noudoost, Behrad; Li, Mingyao; Moore, Tirin

    2015-01-01

    Inferotemporal (IT) neurons are known to exhibit persistent, stimulus-selective activity during the delay period of object-based working memory tasks. Frontal eye field (FEF) neurons show robust, spatially selective delay period activity during memory-guided saccade tasks. We present a copula regression paradigm to examine neural interaction of these two types of signals between areas IT and FEF of the monkey during a working memory task. This paradigm is based on copula models that can account for both marginal distribution over spiking activity of individual neurons within each area and joint distribution over ensemble activity of neurons between areas. Considering the popular GLMs as marginal models, we developed a general and flexible likelihood framework that uses the copula to integrate separate GLMs into a joint regression analysis. Such joint analysis essentially leads to a multivariate analog of the marginal GLM theory and hence efficient model estimation. In addition, we show that Granger causality between spike trains can be readily assessed via the likelihood ratio statistic. The performance of this method is validated by extensive simulations, and compared favorably to the widely used GLMs. When applied to spiking activity of simultaneously recorded FEF and IT neurons during working memory task, we observed significant Granger causality influence from FEF to IT, but not in the opposite direction, suggesting the role of the FEF in the selection and retention of visual information during working memory. The copula model has the potential to provide unique neurophysiological insights about network properties of the brain. PMID:26063909
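
    A hedged sketch of the likelihood-ratio assessment of Granger causality on binned spike counts, using plain Poisson GLMs rather than the paper's copula-coupled marginals (synthetic data; statsmodels assumed):

        # Sketch: LR test of Granger causality between two binned spike trains.
        import numpy as np
        import statsmodels.api as sm
        from scipy.stats import chi2

        rng = np.random.default_rng(2)
        T, lag = 2000, 1
        fef = rng.poisson(2.0, T).astype(float)
        it = rng.poisson(np.exp(0.3 + 0.2 * np.r_[0, fef[:-1]]))  # FEF drives IT

        y = it[lag:]
        own_hist, other_hist = it[:-lag], fef[:-lag]

        X_full = sm.add_constant(np.column_stack([own_hist, other_hist]))
        X_red = sm.add_constant(own_hist)

        llf_full = sm.GLM(y, X_full, family=sm.families.Poisson()).fit().llf
        llf_red = sm.GLM(y, X_red, family=sm.families.Poisson()).fit().llf

        lr = 2 * (llf_full - llf_red)     # ~ chi2(1) under "no causality"
        print("LR =", lr, "p =", chi2.sf(lr, df=1))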

  7. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods.

    PubMed

    Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al. published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternative hypothesis. We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature but is critical to the reliability of the conclusions.
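
    A minimal sketch of the binned-versus-unbinned contrast on synthetic exponential waiting times (all numbers invented, covering records 7 and 8): the unbinned MLE of the rate uses every observation, while the binned fit depends on an arbitrary bin choice and discards within-bin information:

        # Sketch: unbinned MLE vs binned least-squares fit of an exponential rate.
        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(3)
        rate_true = 0.05
        waits = rng.exponential(1.0 / rate_true, size=200)

        # unbinned MLE: for Exp(rate), the MLE is simply 1/mean
        rate_unbinned = 1.0 / waits.mean()

        # binned fit: histogram, then least squares on the bin counts
        counts, edges = np.histogram(waits, bins=10)
        centers = 0.5 * (edges[:-1] + edges[1:])
        width = edges[1] - edges[0]
        model = lambda t, rate: len(waits) * width * rate * np.exp(-rate * t)
        (rate_binned,), _ = curve_fit(model, centers, counts, p0=[0.1])

        print(f"true={rate_true}, unbinned={rate_unbinned:.4f}, binned={rate_binned:.4f}")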

  8. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods

    PubMed Central

    Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    Background When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. Methods In 2015, Towers et al. published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternative hypothesis. We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature.

  9. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear as to how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as being less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information.

  10. Comparative biomechanical analysis of current microprocessor-controlled prosthetic knee joints.

    PubMed

    Bellmann, Malte; Schmalz, Thomas; Blumentritt, Siegmar

    2010-04-01

    To investigate and identify functional differences of 4 microprocessor-controlled prosthetic knee joints (C-Leg, Hybrid Knee [also called Energy Knee], Rheo Knee, Adaptive 2). Tested situations were walking on level ground, on stairs and ramps; additionally, the fall prevention potentials for each design were examined. The measuring technology used included an optoelectronic camera system combined with 2 forceplates as well as a mobile spiroergometric system. The study was conducted in a gait laboratory. Subjects with unilateral transfemoral amputations (N=9; mobility grade, 3-4; age, 22-49y) were tested. Participants were fitted and tested with 4 different microprocessor-controlled knee joints. Static prosthetic alignment, time-distance parameters, kinematic and kinetic data and metabolic energy consumption were measured. Compared with the Hybrid Knee and the Adaptive 2, the C-Leg offers clear advantages in the provision of adequate swing phase flexion resistances and terminal extension damping during level walking at various speeds, especially at higher walking speeds. The Rheo Knee provides sufficient terminal extension; however, swing phase flexion resistances seem to be too low. The values for metabolic energy consumption show only slight differences during level walking. The joint resistances generated for descending stairs and ramps relieve the contralateral side to varying degrees. When walking on stairs, safety-relevant technical differences between the investigated joint types can be observed. Designs with adequate internal resistances offer stability advantages when the foot is positioned on the step. Stumble recovery tests reveal that the different knee joint designs vary in their effectiveness in preventing the patient from falling. The patient benefits provided by the investigated electronic prosthetic knee joints differ considerably. The C-Leg appears to offer the amputee greater functional and safety-related advantages than the other tested knee joints.

  11. Concept for a fast analysis method of the energy dissipation at mechanical joints

    NASA Astrophysics Data System (ADS)

    Wolf, Alexander; Brosius, Alexander

    2017-10-01

    When designing hybrid parts and structures one major challenge is the design, production and quality assessment of the joining points. While the polymeric composites themselves have excellent material properties, the necessary joints are often the weak link in assembled structures. This paper presents a method of measuring and analysing the energy dissipation at mechanical joining points of hybrid parts. A simplified model is applied based on the characteristic response to different excitation frequencies and amplitudes. The dissipation from damage is the result of relative movements between joining partners and damaged fibres within the composite, whereas the visco-elastic material behaviour causes the intrinsic dissipation. The ambition is to transfer these research findings to the characterisation of mechanical joints in order to quickly assess the general quality of the joint with this non-destructive testing method. The inherent challenge for realising this method is the correct interpretation of the measured energy dissipation and its attribution to either a bad joining point or intrinsic material properties. In this paper the authors present the concept for energy dissipation measurements at different joining points. By inverse analysis, a simplified fast semi-analytical model will be developed that allows for a quick basic quality assessment of a given joining point.
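
    As an illustration of the measurement principle (not the authors' semi-analytical model), the energy dissipated per excitation cycle can be read off as the area of the force-displacement hysteresis loop; a sketch with an invented phase-lagged response:

        # Sketch: dissipated energy per cycle as the closed-loop integral of F dx.
        import numpy as np

        f_exc, cycles, fs = 50.0, 5, 20000            # Hz, cycles, samples/s
        t = np.arange(0, cycles / f_exc, 1.0 / fs)
        x = 1e-4 * np.sin(2 * np.pi * f_exc * t)          # displacement [m]
        F = 120.0 * np.sin(2 * np.pi * f_exc * t + 0.15)  # force [N], lagged

        n = int(fs / f_exc)                           # samples in one cycle
        w_diss = abs(np.trapz(F[:n + 1], x[:n + 1]))  # loop area [J/cycle]
        print(f"dissipated energy: {w_diss * 1e3:.3f} mJ/cycle")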

  12. The recursive maximum likelihood proportion estimator: User's guide and test results

    NASA Technical Reports Server (NTRS)

    Vanrooy, D. L.

    1976-01-01

    Implementation of the recursive maximum likelihood proportion estimator is described. A user's guide to programs as they currently exist on the IBM 360/67 at LARS, Purdue is included, and test results on LANDSAT data are described. On Hill County data, the algorithm yields results comparable to the standard maximum likelihood proportion estimator.
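
    The LARS implementation is not reproduced here; a generic sketch of maximum-likelihood proportion estimation, iterating the EM fixed point for mixture proportions with known class-conditional densities (synthetic one-dimensional "pixel" values):

        # Sketch: ML class-proportion estimation given fixed class densities.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(4)
        # two known spectral classes, true proportions 0.3 / 0.7
        x = np.r_[rng.normal(0, 1, 300), rng.normal(2.5, 1, 700)]
        dens = np.column_stack([norm.pdf(x, 0, 1), norm.pdf(x, 2.5, 1)])

        pi = np.array([0.5, 0.5])
        for _ in range(100):
            post = pi * dens
            post /= post.sum(axis=1, keepdims=True)   # class posteriors per pixel
            pi = post.mean(axis=0)                    # updated proportions
        print("estimated proportions:", pi)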

  13. Clinical Paresthesia Atlas Illustrates Likelihood of Coverage Based on Spinal Cord Stimulator Electrode Location.

    PubMed

    Taghva, Alexander; Karst, Edward; Underwood, Paul

    2017-08-01

    Concordant paresthesia coverage is an independent predictor of pain relief following spinal cord stimulation (SCS). Using aggregate data, our objective is to produce a map of paresthesia coverage as a function of electrode location in SCS. This retrospective analysis used x-rays, SCS programming data, and paresthesia coverage maps from the EMPOWER registry of SCS implants for chronic neuropathic pain. Spinal level of dorsal column stimulation was determined by x-ray adjudication and active cathodes in patient programs. Likelihood of paresthesia coverage was determined as a function of stimulating electrode location. Segments of paresthesia coverage were grouped anatomically. Fisher's exact test was used to identify significant differences in likelihood of paresthesia coverage as a function of spinal stimulation level. In the 178 patients analyzed, the most prevalent areas of paresthesia coverage were buttocks, anterior and posterior thigh (each 98%), and low back (94%). Unwanted paresthesia at the ribs occurred in 8% of patients. There were significant differences in the likelihood of achieving paresthesia, with higher thoracic levels (T5, T6, and T7) more likely to achieve low back coverage but also more likely to introduce paresthesia felt at the ribs. Higher levels in the thoracic spine were associated with greater coverage of the buttocks, back, and thigh, and with lesser coverage of the leg and foot. This paresthesia atlas uses real-world, aggregate data to determine likelihood of paresthesia coverage as a function of stimulating electrode location. It represents an application of "big data" techniques, and a step toward achieving personalized SCS therapy tailored to the individual's chronic pain. © 2017 International Neuromodulation Society.
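
    A minimal sketch of the significance test used above, applying SciPy's Fisher's exact test to a hypothetical contingency table of low-back coverage by stimulation level (all counts invented):

        # Sketch: Fisher's exact test on coverage counts by electrode level.
        from scipy.stats import fisher_exact

        #                covered  not covered   (hypothetical counts)
        table = [[40, 8],    # cathode at T5-T7
                 [25, 20]]   # cathode below T7
        odds_ratio, p_value = fisher_exact(table)
        print("odds ratio:", odds_ratio, "p:", p_value)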

  14. Knee Joint Vibration Signal Analysis with Matching Pursuit Decomposition and Dynamic Weighted Classifier Fusion

    PubMed Central

    Cai, Suxian; Yang, Shanshan; Zheng, Fang; Lu, Meng; Wu, Yunfeng; Krishnan, Sridhar

    2013-01-01

    Analysis of knee joint vibration (VAG) signals can provide quantitative indices for detection of knee joint pathology at an early stage. In addition to the statistical features developed in the related previous studies, we extracted two separable features, that is, the number of atoms derived from the wavelet matching pursuit decomposition and the number of significant signal turns detected with the fixed threshold in the time domain. To perform a better classification over the data set of 89 VAG signals, we applied a novel classifier fusion system based on the dynamic weighted fusion (DWF) method to ameliorate the classification performance. For comparison, a single least-squares support vector machine (LS-SVM) and the Bagging ensemble were used for the classification task as well. The results in terms of overall accuracy in percentage and area under the receiver operating characteristic curve obtained with the DWF-based classifier fusion method reached 88.76% and 0.9515, respectively, which demonstrated the effectiveness and superiority of the DWF method with two distinct features for the VAG signal analysis. PMID:23573175
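
    One plausible reading of the weighted fusion step, with fusion weights proportional to each base classifier's accuracy and an ordinary SVC standing in for the LS-SVM (the published DWF rule may differ in detail; data are synthetic):

        # Sketch: weighted fusion of two classifiers' posterior probabilities.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC
        from sklearn.ensemble import BaggingClassifier

        X, y = make_classification(n_samples=300, n_features=6, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        clfs = [SVC(probability=True).fit(X_tr, y_tr),
                BaggingClassifier(random_state=0).fit(X_tr, y_tr)]
        # weights from training accuracy (a simplification; a validation split
        # would be preferable in practice)
        acc = np.array([c.score(X_tr, y_tr) for c in clfs])
        w = acc / acc.sum()

        proba = sum(wi * c.predict_proba(X_te) for wi, c in zip(w, clfs))
        print("fused accuracy:", (proba.argmax(axis=1) == y_te).mean())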

  15. Loading Analysis of Composite Wind Turbine Blade for Fatigue Life Prediction of Adhesively Bonded Root Joint

    NASA Astrophysics Data System (ADS)

    Salimi-Majd, Davood; Azimzadeh, Vahid; Mohammadi, Bijan

    2015-06-01

    Nowadays wind energy is widely used as a non-polluting, cost-effective renewable energy resource. During the lifetime of a composite wind turbine, which is about 20 years, the rotor blades are subjected to different cyclic loads such as aerodynamic, centrifugal and gravitational forces. These loading conditions cause fatigue failure of the blade at the adhesively bonded root joint, where the highest bending moments occur and which is consequently the most critical zone of the blade. It is therefore important to estimate the fatigue life of the root joint. The cohesive zone model is one of the best methods for prediction of initiation and propagation of debonding at the root joint. The advantage of this method is the possibility of modeling the debonding without any need for remeshing. However, in order to use this approach, it is necessary to analyze the cyclic loading condition at the root joint. For this purpose, after implementing a cohesive interface element in the Ansys finite element software, one blade of a horizontal axis wind turbine with a 46 m rotor diameter was modelled in full scale. Then, after applying loads for different positions of the blade over a full rotation, the critical condition of the blade was obtained based on the delamination index, and the load ratio at the root joint over fatigue cycles was calculated. These data are the inputs for fatigue damage growth analysis of the root joint using the CZM approach, which will be investigated in future work.

  16. Joint Clustering and Component Analysis of Correspondenceless Point Sets: Application to Cardiac Statistical Modeling.

    PubMed

    Gooya, Ali; Lekadir, Karim; Alba, Xenia; Swift, Andrew J; Wild, Jim M; Frangi, Alejandro F

    2015-01-01

    Construction of Statistical Shape Models (SSMs) from arbitrary point sets is a challenging problem due to significant shape variation and lack of explicit point correspondence across the training data set. In medical imaging, point sets can generally represent different shape classes that span healthy and pathological exemplars. In such cases, the constructed SSM may not generalize well, largely because the probability density function (pdf) of the point sets deviates from the underlying assumption of Gaussian statistics. To this end, we propose a generative model for unsupervised learning of the pdf of point sets as a mixture of distinctive classes. A Variational Bayesian (VB) method is proposed for making joint inferences on the labels of point sets, and the principal modes of variations in each cluster. The method provides a flexible framework to handle point sets with no explicit point-to-point correspondences. We also show that by maximizing the marginalized likelihood of the model, the optimal number of clusters of point sets can be determined. We illustrate this work in the context of understanding the anatomical phenotype of the left and right ventricles in the heart. To this end, we use a database containing hearts of healthy subjects, patients with Pulmonary Hypertension (PH), and patients with Hypertrophic Cardiomyopathy (HCM). We demonstrate that our method can outperform traditional PCA in both generalization and specificity measures.
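
    The paper's Variational Bayesian model selection is not reproduced here; as a stand-in for choosing the number of clusters by (approximately) maximising the marginalised likelihood, a sketch using BIC over Gaussian mixtures on toy two-dimensional shape features:

        # Sketch: selecting the number of clusters via BIC on a Gaussian mixture.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(5)
        # toy "shape features": three clusters in a 2-D embedding
        X = np.vstack([rng.normal(m, 0.4, (60, 2))
                       for m in ([0, 0], [3, 0], [0, 3])])

        bics = {k: GaussianMixture(k, random_state=0).fit(X).bic(X)
                for k in range(1, 7)}
        best_k = min(bics, key=bics.get)
        print("selected number of clusters:", best_k)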

  17. Brief communication: Cineradiographic analysis of the chimpanzee (Pan troglodytes) talonavicular and calcaneocuboid joints.

    PubMed

    Thompson, Nathan E; Holowka, Nicholas B; O'Neill, Matthew C; Larson, Susan G

    2014-08-01

    During terrestrial locomotion, chimpanzees exhibit dorsiflexion of the midfoot between midstance and toe-off of stance phase, a phenomenon that has been called the "midtarsal break." This motion is generally absent during human bipedalism, and in chimpanzees is associated with more mobile foot joints than in humans. However, the contribution of individual foot joints to overall foot mobility in chimpanzees is poorly understood, particularly on the medial side of the foot. The talonavicular (TN) and calcaneocuboid (CC) joints have both been suggested to contribute significantly to midfoot mobility and to the midtarsal break in chimpanzees. To evaluate the relative magnitude of motion that can occur at these joints, we tracked skeletal motion of the hindfoot and midfoot during passive plantarflexion and dorsiflexion manipulations using cineradiography. The sagittal plane range of motion was 38 ± 10° at the TN joint and 14 ± 8° at the CC joint. This finding indicates that the TN joint is more mobile than the CC joint during ankle plantarflexion-dorsiflexion. We suggest that the larger range of motion at the TN joint during dorsiflexion is associated with a rotation (inversion-eversion) across the transverse tarsal joint, which may occur in addition to sagittal plane motion. © 2014 Wiley Periodicals, Inc.

  18. Cost analysis of debridement and retention for management of prosthetic joint infection.

    PubMed

    Peel, T N; Dowsey, M M; Buising, K L; Liew, D; Choong, P F M

    2013-02-01

    Prosthetic joint infection remains one of the most devastating complications of arthroplasty. Debridement and retention of the prosthesis is an attractive management option in carefully selected patients. Despite this, there are no data investigating the cost of this management modality for prosthetic joint infections. The aim of this case-control study was to calculate the cost associated with debridement and retention for management of prosthetic joint infection compared with primary joint replacement surgery without prosthetic joint infection. From 1 January 2008 to 30 June 2010, there were 21 prosthetic joint infections matched to 42 control patients. Controls were matched to cases according to the arthroplasty site, age and sex. Cases had a greater number of unplanned readmissions (100% vs. 7.1%; p <0.001), more additional surgery (3.3 vs. 0.07; p <0.001) and longer total bed days (31.6 vs. 7.9 days; p <0.001). In addition they had more inpatient, outpatient and emergency department visits (p <0.001, respectively). For patients with prosthetic joint infection the total cost, including index operation and costs of management of the prosthetic joint infection, was 3.1 times the cost of primary arthroplasty; the mean cost for cases was Australian dollars (AUD) $69,414 (±29,869) compared with $22,085 (±8147) (p <0.001). The demand for arthroplasty continues to grow and with that, the number of prosthetic joint infections will also increase, placing significant burden on the health system. Our study adds significantly to the growing body of evidence highlighting the substantial costs associated with prosthetic joint infection. © 2011 The Authors. Clinical Microbiology and Infection © 2011 European Society of Clinical Microbiology and Infectious Diseases.

  19. Research on adaptive optics image restoration algorithm based on improved joint maximum a posteriori method

    NASA Astrophysics Data System (ADS)

    Zhang, Lijuan; Li, Yang; Wang, Junnan; Liu, Ying

    2018-03-01

    In this paper, we propose a point spread function (PSF) reconstruction method and joint maximum a posteriori (JMAP) estimation method for the adaptive optics image restoration. Using the JMAP method as the basic principle, we establish the joint log likelihood function of multi-frame adaptive optics (AO) images based on the image Gaussian noise models. To begin with, combining the observed conditions and AO system characteristics, a predicted PSF model for the wavefront phase effect is developed; then, we build up iterative solution formulas of the AO image based on our proposed algorithm, addressing the implementation process of the multi-frame AO image joint deconvolution method. We conduct a series of experiments on simulated and real degraded AO images to evaluate our proposed algorithm. Compared with the Wiener iterative blind deconvolution (Wiener-IBD) algorithm and Richardson-Lucy IBD algorithm, our algorithm has better restoration effects including higher peak signal-to-noise ratio (PSNR) and Laplacian sum (LS) value than the others. The research results have certain application value for actual AO image restoration.
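
    For orientation, a hand-rolled sketch of the Richardson-Lucy iteration used as a baseline in the comparison above (the proposed JMAP method additionally reconstructs the PSF and combines multiple frames):

        # Sketch: basic Richardson-Lucy deconvolution with a known PSF.
        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(observed, psf, n_iter=30):
            est = np.full_like(observed, observed.mean())
            psf_mirror = psf[::-1, ::-1]
            for _ in range(n_iter):
                blurred = fftconvolve(est, psf, mode="same")
                ratio = observed / np.maximum(blurred, 1e-12)
                est *= fftconvolve(ratio, psf_mirror, mode="same")
            return est

        # toy example: blur a point source with a Gaussian PSF, then restore
        yy, xx = np.mgrid[-7:8, -7:8]
        psf = np.exp(-(xx**2 + yy**2) / 4.0); psf /= psf.sum()
        img = np.zeros((64, 64)); img[32, 32] = 1.0
        restored = richardson_lucy(fftconvolve(img, psf, mode="same"), psf)
        print("peak of restored image:", restored.max())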

  20. Analysis of crackling noise using the maximum-likelihood method: Power-law mixing and exponential damping.

    PubMed

    Salje, Ekhard K H; Planes, Antoni; Vives, Eduard

    2017-10-01

    Crackling noise can be initiated by competing or coexisting mechanisms. These mechanisms can combine to generate an approximate scale invariant distribution that contains two or more contributions. The overall distribution function can be analyzed, to a good approximation, using maximum-likelihood methods and assuming that it follows a power law although with nonuniversal exponents depending on a varying lower cutoff. We propose that such distributions are rather common and originate from a simple superposition of crackling noise distributions or exponential damping.
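
    A sketch of the maximum-likelihood power-law fit with a varying lower cutoff (the standard continuous-case estimator), applied to a synthetic superposition of two power laws; the fitted exponent drifts with the cutoff, as the abstract describes:

        # Sketch: power-law MLE, alpha = 1 + n / sum(ln(x/xmin)), scanned in xmin.
        import numpy as np

        rng = np.random.default_rng(6)
        # superposition of two Pareto tails mimicking mixed mechanisms
        x = np.r_[1 + rng.pareto(1.5, 5000), 1 + rng.pareto(2.5, 5000)]

        for xmin in (1.0, 2.0, 5.0, 10.0):
            tail = x[x >= xmin]
            alpha = 1.0 + len(tail) / np.log(tail / xmin).sum()
            print(f"xmin={xmin:5.1f}  n={len(tail):5d}  alpha={alpha:.3f}")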

  1. Susceptibility, likelihood to be diagnosed, worry and fear for contracting Lyme disease.

    PubMed

    Fogel, Joshua; Chawla, Gurasees S

    Risk perception and psychological concerns are relevant for understanding how people view Lyme disease. This study investigates the four separate outcomes of susceptibility, likelihood to be diagnosed, worry, and fear for contracting Lyme disease. University students (n=713) were surveyed about demographics, perceived health, Lyme disease knowledge, Lyme disease preventive behaviors, Lyme disease history, and Lyme disease miscellaneous variables. We found that women were associated with increased susceptibility and fear. Asian/Asian-American race/ethnicity was associated with increased worry and fear. Perceived good health was associated with increased likelihood to be diagnosed, worry, and fear. Correct knowledge was associated with increased susceptibility and likelihood to be diagnosed. Those who typically spend a lot of time outdoors were associated with increased susceptibility, likelihood to be diagnosed, worry, and fear. In conclusion, healthcare providers and public health campaigns should address susceptibility, likelihood to be diagnosed, worry, and fear about Lyme disease, and should particularly target women and Asians/Asian-Americans to address any possible misconceptions and/or offer effective coping strategies. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  2. Global-Local Finite Element Analysis of Bonded Single-Lap Joints

    NASA Technical Reports Server (NTRS)

    Kilic, Bahattin; Madenci, Erdogan; Ambur, Damodar R.

    2004-01-01

    Adhesively bonded lap joints involve dissimilar material junctions and sharp changes in geometry, possibly leading to premature failure. Although the finite element method is well suited to model the bonded lap joints, traditional finite elements are incapable of correctly resolving the stress state at junctions of dissimilar materials because of the unbounded nature of the stresses. In order to facilitate the use of bonded lap joints in future structures, this study presents a finite element technique utilizing a global (special) element coupled with traditional elements. The global element includes the singular behavior at the junction of dissimilar materials with or without traction-free surfaces.

  3. Analysis of the Lenticular Jointed MARSIS Antenna Deployment

    NASA Technical Reports Server (NTRS)

    Mobrem, Mehran; Adams, Douglas S.

    2006-01-01

    This paper summarizes important milestones in a yearlong comprehensive effort which culminated in successful deployments of the MARSIS antenna booms in May and June of 2005. Experimentally measured straight section and hinge properties are incorporated into specialized modeling techniques that are used to simulate the boom lenticular joints. System level models are exercised to understand the boom deployment dynamics and spacecraft level implications. Discussion includes a comparison of ADAMS simulation results to measured flight data taken during the three boom deployments. Important parameters that govern lenticular joint behavior are outlined and a short summary of lessons learned and recommendations is included to better understand future applications of this technology.

  4. New applications of maximum likelihood and Bayesian statistics in macromolecular crystallography.

    PubMed

    McCoy, Airlie J

    2002-10-01

    Maximum likelihood methods are well known to macromolecular crystallographers as the methods of choice for isomorphous phasing and structure refinement. Recently, the use of maximum likelihood and Bayesian statistics has extended to the areas of molecular replacement and density modification, placing these methods on a stronger statistical foundation and making them more accurate and effective.

  5. Principal Component Analysis in Construction of 3D Human Knee Joint Models Using a Statistical Shape Model Method

    PubMed Central

    Tsai, Tsung-Yuan; Li, Jing-Sheng; Wang, Shaobai; Li, Pingyue; Kwon, Young-Min; Li, Guoan

    2013-01-01

    The statistical shape model (SSM) method that uses 2D images of the knee joint to predict the 3D joint surface model has been reported in literature. In this study, we constructed a SSM database using 152 human CT knee joint models, including the femur, tibia and patella and analyzed the characteristics of each principal component of the SSM. The surface models of two in vivo knees were predicted using the SSM and their 2D bi-plane fluoroscopic images. The predicted models were compared to their CT joint models. The differences between the predicted 3D knee joint surfaces and the CT image-based surfaces were 0.30 ± 0.81 mm, 0.34 ± 0.79 mm and 0.36 ± 0.59 mm for the femur, tibia and patella, respectively (average ± standard deviation). The computational time for each bone of the knee joint was within 30 seconds using a personal computer. The analysis of this study indicated that the SSM method could be a useful tool to construct 3D surface models of the knee with sub-millimeter accuracy in real time. Thus it may have a broad application in computer assisted knee surgeries that require 3D surface models of the knee. PMID:24156375
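
    A minimal sketch of the PCA machinery behind such a statistical shape model, on synthetic corresponded point sets (real SSMs first require alignment and correspondence, which are omitted here):

        # Sketch: PCA shape model via SVD, then reconstruction from k modes.
        import numpy as np

        rng = np.random.default_rng(7)
        n_shapes, n_points = 152, 500
        template = rng.normal(size=3 * n_points)
        X = template + rng.normal(scale=0.05, size=(n_shapes, 3 * n_points))

        mean = X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
        modes = Vt                      # principal components (shape modes)
        var = s**2 / (n_shapes - 1)     # variance explained per mode

        k = 10                          # reconstruct a shape from k mode weights
        b = (X[0] - mean) @ modes[:k].T
        recon = mean + b @ modes[:k]
        print("reconstruction RMS error:", np.sqrt(np.mean((recon - X[0])**2)))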

  6. Parallel implementation of D-Phylo algorithm for maximum likelihood clusters.

    PubMed

    Malik, Shamita; Sharma, Dolly; Khatri, Sunil Kumar

    2017-03-01

    This study explains a newly developed parallel algorithm for phylogenetic analysis of DNA sequences. The newly designed D-Phylo is a more advanced algorithm for phylogenetic analysis using the maximum likelihood approach. D-Phylo exploits the search capacity of k-means while avoiding its main limitation of becoming trapped at locally conserved motifs. The authors tested the behaviour of D-Phylo on an Amazon Linux Amazon Machine Image (Hardware Virtual Machine) i2.4xlarge instance (six central processing units, 122 GiB memory, 8 × 800 solid-state drive Elastic Block Store volumes, high network performance) with up to 15 processors for several real-life datasets. Distributing the clusters evenly across all processors achieves a near-linear speed-up when a large number of processors is available.

  7. Seal Joint Analysis and Design for the Ares-I Upper Stage LOX Tank

    NASA Technical Reports Server (NTRS)

    Phillips, Dawn R.; Wingate, Robert J.

    2011-01-01

    The sealing capability of the Ares-I Upper Stage liquid oxygen tank-to-sump joint is assessed by analyzing the deflections of the joint components. Analyses are performed using three-dimensional symmetric wedge finite element models and the ABAQUS commercial finite element software. For the pressure loads and feedline interface loads, the analyses employ a mixed factor of safety approach to comply with the Constellation Program factor of safety requirements. Naflex pressure-assisted seals are considered first because they have been used successfully in similar seal joints in the Space Shuttle External Tank. For the baseline sump seal joint configuration with a Naflex seal, the predicted joint opening greatly exceeds the seal design specification. Three redesign options of the joint that maintain the use of a Naflex seal are studied. The joint openings for the redesigned seal joints show improvement over the baseline configuration; however, these joint openings still exceed the seal design specification. RACO pressure-assisted seals are considered next because they are known to also be used on the Space Shuttle External Tank, and the joint opening allowable is much larger than the specification for the Naflex seals. The finite element models for the RACO seal analyses are created by modifying the models that were used for the Naflex seal analyses. The analyses show that the RACO seal may provide sufficient sealing capability for the sump seal joint. The results provide reasonable data to recommend the design change and plan a testing program to determine the capability of RACO seals in the Ares-I Upper Stage liquid oxygen tank sump seal joint.

  8. Hybrid approach combining chemometrics and likelihood ratio framework for reporting the evidential value of spectra.

    PubMed

    Martyna, Agnieszka; Zadora, Grzegorz; Neocleous, Tereza; Michalska, Aleksandra; Dean, Nema

    2016-08-10

    Many chemometric tools are invaluable and have proven effective in data mining and substantial dimensionality reduction of highly multivariate data. This becomes vital for interpreting various physicochemical data due to rapid development of advanced analytical techniques, delivering much information in a single measurement run. This concerns especially spectra, which are frequently used as the subject of comparative analysis in e.g. forensic sciences. In the presented study the microtraces collected from the scenarios of hit-and-run accidents were analysed. Plastic containers and automotive plastics (e.g. bumpers, headlamp lenses) were subjected to Fourier transform infrared spectrometry and car paints were analysed using Raman spectroscopy. In the forensic context analytical results must be interpreted and reported according to the standards of the interpretation schemes acknowledged in forensic sciences using the likelihood ratio approach. However, for proper construction of LR models for highly multivariate data, such as spectra, chemometric tools must be employed for substantial data compression. Conversion from classical feature representation to distance representation was proposed for revealing hidden data peculiarities and linear discriminant analysis was further applied for minimising the within-sample variability while maximising the between-sample variability. Both techniques enabled substantial reduction of data dimensionality. Univariate and multivariate likelihood ratio models were proposed for such data. It was shown that the combination of chemometric tools and the likelihood ratio approach is capable of solving the comparison problem of highly multivariate and correlated data after proper extraction of the most relevant features and variance information hidden in the data structure. Copyright © 2016 Elsevier B.V. All rights reserved.
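
    A hedged sketch of the two-stage pattern described above: chemometric compression (here LDA to one dimension) followed by a univariate Gaussian likelihood ratio between two competing source hypotheses; toy spectra, not the paper's models:

        # Sketch: LDA compression + univariate Gaussian likelihood ratio.
        import numpy as np
        from scipy.stats import norm
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(8)
        # toy "spectra": two sources, 40 channels each
        X = np.vstack([rng.normal(0.0, 1.0, (50, 40)),
                       rng.normal(0.4, 1.0, (50, 40))])
        labels = np.r_[np.zeros(50), np.ones(50)]

        lda = LinearDiscriminantAnalysis(n_components=1).fit(X, labels)
        z = lda.transform(X).ravel()
        z_a, z_b = z[labels == 0], z[labels == 1]

        # LR for a questioned spectrum: density under source A vs source B
        questioned = lda.transform(rng.normal(0.0, 1.0, (1, 40))).ravel()[0]
        lr = norm.pdf(questioned, z_a.mean(), z_a.std()) / \
             norm.pdf(questioned, z_b.mean(), z_b.std())
        print("likelihood ratio:", lr)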

  9. Dark matter constraints from a joint analysis of dwarf Spheroidal galaxy observations with VERITAS

    DOE PAGES

    Archambault, S.; Archer, A.; Benbow, W.; ...

    2017-04-05

    We present constraints on the annihilation cross section of weakly interacting massive particle dark matter based on the joint statistical analysis of four dwarf galaxies with VERITAS. These results are derived from an optimized photon weighting statistical technique that improves on standard imaging atmospheric Cherenkov telescope (IACT) analyses by utilizing the spectral and spatial properties of individual photon events.
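
    The statistical core of such a joint analysis is a summed (stacked) likelihood across targets; a minimal sketch with invented counts, backgrounds and relative J-factors, scanning a common signal-strength parameter:

        # Sketch: joint Poisson likelihood across dwarfs, one shared signal.
        import numpy as np

        n_on = np.array([12, 8, 15, 10])         # observed counts per dwarf
        bkg = np.array([11.0, 7.5, 14.0, 9.8])   # expected background
        j_eff = np.array([1.0, 0.6, 1.4, 0.8])   # relative J-factor x exposure

        def joint_loglik(sig):
            mu = bkg + sig * j_eff
            return np.sum(n_on * np.log(mu) - mu)  # Poisson terms, constants dropped

        grid = np.linspace(0.0, 10.0, 1001)
        ll = np.array([joint_loglik(s) for s in grid])
        # 95% one-sided upper limit: delta lnL = 1.35 below the maximum
        limit = grid[ll >= ll.max() - 1.35].max()
        print("upper limit on signal strength:", limit)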

  10. Unloading joints to treat osteoarthritis, including joint distraction.

    PubMed

    Lafeber, Floris P J G; Intema, Femke; Van Roermund, Peter M; Marijnissen, Anne C A

    2006-09-01

    Patients are increasingly becoming interested in nonpharmacologic approaches to manage their osteoarthritis. This review examines the recent literature on the potential beneficial effects of unloading joints in the treatment of osteoarthritis, with a focus on joint distraction. Mechanical factors are involved in the development and progression of osteoarthritis. If "loading" is a major cause in development and progression of osteoarthritis, then "unloading" may be able to prevent progression. There is evidence that unloading may be effective in reducing pain and slowing down structural damage. This review describes unloading by footwear and bracing (nonsurgical), unloading by osteotomy (surgical), and has a focus on unloading by joint distraction. Excellent reviews in all these three fields have been published over the past few years. Recent studies argue for the usefulness of a biomechanical approach to improve function and possibly reduce disease progression in osteoarthritis. To improve patient function and possibly reduce disease progression, a biomechanical approach should be considered in treating patients with osteoarthritis. Further research (appropriate high-quality clinical trials) and analysis (clinical as well as preclinical and fundamental) are still necessary, however, to understand, validate, and refine the different approaches of unloading to treat osteoarthritis.

  11. Buckling of a Longitudinally Jointed Curved Composite Panel Arc Segment for Next Generation of Composite Heavy Lift Launch Vehicles: Verification Testing Analysis

    NASA Technical Reports Server (NTRS)

    Farrokh, Babak; Segal, Kenneth N.; Akkerman, Michael; Glenn, Ronald L.; Rodini, Benjamin T.; Fan, Wei-Ming; Kellas, Sortiris; Pineda, Evan J.

    2014-01-01

    In this work, an all-bonded out-of-autoclave (OoA) curved longitudinal composite joint concept, intended for use in the next generation of composite heavy lift launch vehicles, was evaluated and verified through finite element (FE) analysis, fabrication, testing, and post-test inspection. The joint was used to connect two curved, segmented, honeycomb sandwich panels representative of a Space Launch System (SLS) fairing design. The overall size of the resultant panel was 1.37 m by 0.74 m (54 in by 29 in), of which the joint comprised a 10.2 cm (4 in) wide longitudinal strip at the center. NASTRAN and ABAQUS were used to perform linear and non-linear analyses of the buckling and strength performance of the jointed panel. Geometric non-uniformities (i.e., surface contour imperfections) were measured and incorporated into the FE model and analysis. In addition, a sensitivity study of the specimen's end conditions showed that bonding face-sheet doublers to the panel's end, coupled with some stress relief features at corner-edges, can significantly reduce the stress concentrations near the load application points. Ultimately, the jointed panel was subjected to a compressive load. Load application was interrupted at the onset of buckling (at 356 kN (80 kips)). A post-test non-destructive evaluation (NDE) showed that, as designed, buckling occurred without introducing any damage into the panel or the joint. The jointed panel was further capable of tolerating impact damage, sustaining the same buckling load with no evidence of damage propagation. The OoA cured all-composite joint shows promise as a low-mass factory joint for segmented barrels.

  12. Adaptive control of center of mass (global) motion and its joint (local) origin in gait.

    PubMed

    Yang, Feng; Pai, Yi-Chung

    2014-08-22

    Dynamic gait stability can be quantified by the relationship of the motion state (i.e. the position and velocity) between the body center of mass (COM) and its base of support (BOS). Humans learn how to adaptively control stability by regulating the absolute COM motion state (i.e. its position and velocity) and/or by controlling the BOS (through stepping) in a predictable manner, or by doing both simultaneously following an external perturbation that disrupts their regular relationship. After repeated-slip perturbation training, for instance, older adults learned to shift their COM position forward while walking with a reduced step length, hence reducing their likelihood of slip-induced falls. How and to what extent each individual joint influences such adaptive alterations is mostly unknown. A three-dimensional individualized human kinematic model was established. Based on the human model, sensitivity analysis was used to systematically quantify the influence of each lower limb joint on the COM position relative to the BOS and the step length during gait. It was found that the leading foot had the greatest effect on regulating the COM position relative to the BOS, and the hips bore the most influence on the step length. These findings could guide cost-effective yet efficient fall-reduction training paradigms for the older population. Copyright © 2014 Elsevier Ltd. All rights reserved.
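
    The sensitivity analysis can be pictured as finite-difference perturbation of each joint angle in a kinematic chain; a toy planar two-link sketch (segment parameters invented, far simpler than the study's three-dimensional individualized model):

        # Sketch: dCOM/dq by finite differences on a planar thigh-shank chain.
        import numpy as np

        L = np.array([0.45, 0.43])      # segment lengths [m]
        m = np.array([7.0, 3.5])        # segment masses [kg]

        def com_x(q):
            """Horizontal COM of a planar 2-link chain; q = joint angles [rad]."""
            hip = L[0] * np.sin(q[0])
            seg1 = 0.5 * L[0] * np.sin(q[0])               # thigh segment COM
            seg2 = hip + 0.5 * L[1] * np.sin(q[0] + q[1])  # shank segment COM
            return (m[0] * seg1 + m[1] * seg2) / m.sum()

        q0, eps = np.array([0.2, -0.1]), 1e-6
        sens = [(com_x(q0 + eps * np.eye(2)[i]) - com_x(q0)) / eps
                for i in range(2)]
        print("dCOMx/dq (hip, knee):", sens)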

  13. Secretome analysis of human articular chondrocytes unravels catabolic effects of nicotine on the joint.

    PubMed

    Lourido, Lucía; Calamia, Valentina; Fernández-Puente, Patricia; Mateos, Jesús; Oreiro, Natividad; Blanco, Francisco J; Ruiz-Romero, Cristina

    2016-06-01

    Osteoarthritis (OA) is a degenerative joint pathology characterized by articular cartilage degradation for which no efficient therapy exists. Since previous epidemiological data on the role of smoking in OA are highly controversial, we aimed to evaluate the effects of nicotine (the most physiologically active compound of tobacco) on the joint. Secretome analyses, based on metabolic labeling followed by LC-MALDI-TOF/TOF analysis, were carried out using an in vitro model of articular inflammation (primary human articular chondrocytes treated with interleukin-1β), and also on osteoarthritic cells. ELISA and Western blot assays were performed to verify some of the results. Nineteen proteins were altered by nicotine in the model of articular inflammation, including several cytokines and proteases. We confirmed the increased secretion by nicotine of matrix metalloproteinase 1 and two proposed markers of OA, fibronectin, and chitinase 3-like protein 1. Finally, four components of the extracellular matrix of cartilage were decreased by nicotine in OA chondrocytes. Our data contribute to a better understanding of the molecular mechanisms that are modulated by nicotine in cartilage cells, suggesting a negative effect of this drug on the joint. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Static analysis of large-scale multibody system using joint coordinates and spatial algebra operator.

    PubMed

    Omar, Mohamed A

    2014-01-01

    Initial transient oscillations exhibited in the dynamic simulation responses of multibody systems can lead to inaccurate results, unrealistic load prediction, or simulation failure. These transients could result from incompatible initial conditions, initial constraints violation, and inadequate kinematic assembly. Performing static equilibrium analysis before the dynamic simulation can eliminate these transients and lead to stable simulation. Most existing multibody formulations determine the static equilibrium position by minimizing the system potential energy. This paper presents a new general purpose approach for solving the static equilibrium in large-scale articulated multibody systems. The proposed approach introduces an energy drainage mechanism based on the Baumgarte constraint stabilization approach to determine the static equilibrium position. The spatial algebra operator is used to express the kinematic and dynamic equations of the closed-loop multibody system. The proposed multibody system formulation utilizes the joint coordinates and modal elastic coordinates as the system generalized coordinates. The recursive nonlinear equations of motion are formulated using the Cartesian coordinates and the joint coordinates to form an augmented set of differential algebraic equations. Then the system connectivity matrix is derived from the system topological relations and used to project the Cartesian quantities into the joint subspace, leading to a minimum set of differential equations.
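
    The energy-drainage idea can be illustrated on a toy system: integrate heavily damped dynamics until motion dies out and read off the equilibrium configuration (the proposed formulation, in full, works in joint coordinates with spatial algebra and Baumgarte-stabilised constraints):

        # Sketch: static equilibrium of a pendulum by draining kinetic energy.
        import numpy as np
        from scipy.integrate import solve_ivp

        g, L_pend, c = 9.81, 1.0, 4.0          # gravity, length, damping

        def dynamics(t, s):
            theta, omega = s
            return [omega, -(g / L_pend) * np.sin(theta) - c * omega]

        sol = solve_ivp(dynamics, (0, 30), [2.5, 0.0], rtol=1e-8)
        print("equilibrium angle (rad):", sol.y[0, -1])   # -> ~0, hanging position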

  15. Bolted joints in graphite-epoxy composites

    NASA Technical Reports Server (NTRS)

    Hart-Smith, L. J.

    1976-01-01

    All-graphite/epoxy laminates and hybrid graphite-glass/epoxy laminates were tested. The tests encompassed a range of geometries for each laminate pattern to cover the three basic failure modes - net section tension failure through the bolt hole, bearing and shearout. Static tensile and compressive loads were applied. A constant bolt diameter of 6.35 mm (0.25 in.) was used in the tests. The interaction of stress concentrations associated with multi-row bolted joints was investigated by testing single- and double-row bolted joints and open-hole specimens in tension. For tension loading, linear interaction was found to exist between the bearing stress reacted at a given bolt hole and the remaining tension stress running by that hole to be reacted elsewhere. The interaction under compressive loading was found to be non-linear. Comparative tests were run using single-lap bolted joints and double-lap joints with pin connection. Both of these joint types exhibited lower strengths than were demonstrated by the corresponding double-lap joints. The analysis methods developed here for single bolt joints are shown to be capable of predicting the behavior of multi-row joints.

  16. Maximum-likelihood block detection of noncoherent continuous phase modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K.; Divsalar, Dariush

    1993-01-01

    This paper examines maximum-likelihood block detection of uncoded full response CPM over an additive white Gaussian noise (AWGN) channel. Both the maximum-likelihood metrics and the bit error probability performances of the associated detection algorithms are considered. The special and popular case of minimum-shift-keying (MSK) corresponding to h = 0.5 and constant amplitude frequency pulse is treated separately. The many new receiver structures that result from this investigation can be compared to the traditional ones that have been used in the past both from the standpoint of simplicity of implementation and optimality of performance.
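
    A generic sketch of noncoherent block detection in AWGN with an unknown carrier phase, not the paper's CPM-specific metrics: with a pilot symbol resolving the phase ambiguity, the detector picks the candidate block maximising the envelope of the block correlation:

        # Sketch: noncoherent ML block detection via |sum_k r_k conj(s_k)|.
        import numpy as np
        from itertools import product

        rng = np.random.default_rng(9)
        bits = rng.integers(0, 2, 4)
        s = np.r_[1.0, 2.0 * bits - 1.0]            # pilot + BPSK data block
        theta = rng.uniform(0, 2 * np.pi)           # unknown carrier phase
        r = s * np.exp(1j * theta) + 0.4 * (rng.normal(size=5) +
                                            1j * rng.normal(size=5))

        candidates = [np.r_[1.0, 2.0 * np.array(b) - 1.0]
                      for b in product([0, 1], repeat=4)]
        best = max(candidates, key=lambda c: abs(np.vdot(c, r)))
        print("sent symbols:", s[1:], "detected:", best[1:])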

  17. Source and Message Factors in Persuasion: A Reply to Stiff's Critique of the Elaboration Likelihood Model.

    ERIC Educational Resources Information Center

    Petty, Richard E.; And Others

    1987-01-01

    Answers James Stiff's criticism of the Elaboration Likelihood Model (ELM) of persuasion. Corrects certain misperceptions of the ELM and criticizes Stiff's meta-analysis that compares ELM predictions with those derived from Kahneman's elastic capacity model. Argues that Stiff's presentation of the ELM and the conclusions he draws based on the data…

  18. A multi-valued neutrosophic qualitative flexible approach based on likelihood for multi-criteria decision-making problems

    NASA Astrophysics Data System (ADS)

    Peng, Juan-juan; Wang, Jian-qiang; Yang, Wu-E.

    2017-01-01

    In this paper, multi-criteria decision-making (MCDM) problems based on the qualitative flexible multiple criteria method (QUALIFLEX), in which the criteria values are expressed by multi-valued neutrosophic information, are investigated. First, multi-valued neutrosophic sets (MVNSs), which allow the truth-membership function, indeterminacy-membership function and falsity-membership function to have a set of crisp values between zero and one, are introduced. Then the likelihood of multi-valued neutrosophic number (MVNN) preference relations is defined and the corresponding properties are also discussed. Finally, an extended QUALIFLEX approach based on likelihood is explored to solve MCDM problems where the assessments of alternatives are in the form of MVNNs; furthermore an example is provided to illustrate the application of the proposed method, together with a comparison analysis.

  19. Mechanical Influences on Morphogenesis of the Knee Joint Revealed through Morphological, Molecular and Computational Analysis of Immobilised Embryos

    PubMed Central

    Roddy, Karen A.; Prendergast, Patrick J.; Murphy, Paula

    2011-01-01

    Very little is known about the regulation of morphogenesis in synovial joints. Mechanical forces generated from muscle contractions are required for normal development of several aspects of normal skeletogenesis. Here we show that biophysical stimuli generated by muscle contractions impact multiple events during chick knee joint morphogenesis influencing differential growth of the skeletal rudiment epiphyses and patterning of the emerging tissues in the joint interzone. Immobilisation of chick embryos was achieved through treatment with the neuromuscular blocking agent Decamethonium Bromide. The effects on development of the knee joint were examined using a combination of computational modelling to predict alterations in biophysical stimuli, detailed morphometric analysis of 3D digital representations, cell proliferation assays and in situ hybridisation to examine the expression of a selected panel of genes known to regulate joint development. This work revealed the precise changes to shape, particularly in the distal femur, that occur in an altered mechanical environment, corresponding to predicted changes in the spatial and dynamic patterns of mechanical stimuli and region specific changes in cell proliferation rates. In addition, we show altered patterning of the emerging tissues of the joint interzone with the loss of clearly defined and organised cell territories revealed by loss of characteristic interzone gene expression and abnormal expression of cartilage markers. This work shows that local dynamic patterns of biophysical stimuli generated from muscle contractions in the embryo act as a source of positional information guiding patterning and morphogenesis of the developing knee joint. PMID:21386908

  20. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    The joint propagation of probability and possibility distributions (e.g., Baudrit et al., 2007) is considered for geo-hazard assessments. A graphical tool is then developed to explore: 1. the contribution of both types of uncertainty, aleatory and epistemic; 2. the regions of the imprecise or random parameters which contribute the most to the imprecision on the failure probability P. The method is applied to two case studies (a mine pillar and a steep slope stability analysis, Rohmer and Verdel, 2014) to investigate the necessity for extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities due to the scarcity of the available information (respectively the extraction ratio and the cliff geometry). References Baudrit, C., Couso, I., & Dubois, D. (2007). Joint propagation of probability and possibility in risk analysis: Towards a formal framework. International Journal of Approximate Reasoning, 45(1), 82-105. Rohmer, J., & Verdel, T. (2014). Joint exploration of regional importance of possibilistic and probabilistic uncertainty in stability analysis. Computers and Geotechnics, 61, 308-315.
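
    A minimal sketch of joint propagation on a toy stability margin: the aleatory variable is sampled by Monte Carlo while the epistemic parameter is kept as an interval, giving bounds on the failure probability rather than a single value (a monotone load model is assumed, so the interval endpoints bound the result; all numbers invented):

        # Sketch: hybrid aleatory/epistemic propagation with probability bounds.
        import numpy as np

        rng = np.random.default_rng(10)
        strength = rng.lognormal(mean=1.0, sigma=0.3, size=20000)  # aleatory

        def load(extraction_ratio):        # toy load model, monotone in ratio
            return 2.0 * extraction_ratio

        ratio_lo, ratio_hi = 0.6, 0.9      # epistemic: poorly known ratio
        p_lower = np.mean(load(ratio_lo) > strength)  # most favourable endpoint
        p_upper = np.mean(load(ratio_hi) > strength)  # least favourable endpoint
        print(f"failure probability in [{p_lower:.4f}, {p_upper:.4f}]")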

  1. Maximum likelihood estimates, from censored data, for mixed-Weibull distributions

    NASA Astrophysics Data System (ADS)

    Jiang, Siyuan; Kececioglu, Dimitri

    1992-06-01

    A new algorithm for estimating the parameters of mixed-Weibull distributions from censored data is presented. The algorithm follows the principle of maximum likelihood estimation (MLE) through the expectation-maximization (EM) algorithm, and it is derived for both postmortem and nonpostmortem time-to-failure data. It is concluded that the concept of the EM algorithm is easy to understand and apply (only elementary statistics and calculus are required). The log-likelihood function cannot decrease after an EM sequence; this important feature was observed in all of the numerical calculations. The MLEs of the nonpostmortem data were obtained successfully for mixed-Weibull distributions with up to 14 parameters (a 5-subpopulation mixed Weibull). Numerical examples indicate that some of the log-likelihood functions of the mixed-Weibull distributions have multiple local maxima; therefore, the algorithm should start at several initial guesses of the parameter set.
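
    A hedged sketch of EM for a two-subpopulation mixed Weibull on complete (uncensored) data; the cited algorithm additionally handles censoring and postmortem data, and the weighted M-step here is solved numerically:

        # Sketch: EM for a two-component mixed-Weibull fit to failure times.
        import numpy as np
        from scipy.stats import weibull_min
        from scipy.optimize import minimize

        t = np.r_[weibull_min.rvs(0.8, scale=100, size=300, random_state=1),
                  weibull_min.rvs(3.0, scale=500, size=200, random_state=2)]

        params = [(1.0, 50.0), (2.0, 400.0)]   # initial (shape, scale) guesses
        w = np.array([0.5, 0.5])               # mixing weights

        def weighted_nll(theta, resp):
            shape, scale = np.exp(theta)       # optimise in log space
            return -(resp * weibull_min.logpdf(t, shape, scale=scale)).sum()

        for _ in range(50):
            pdfs = np.column_stack([w[k] * weibull_min.pdf(t, c, scale=s)
                                    for k, (c, s) in enumerate(params)])
            resp = pdfs / pdfs.sum(axis=1, keepdims=True)    # E-step
            w = resp.mean(axis=0)                            # M-step: weights
            params = [tuple(np.exp(minimize(weighted_nll, np.log(params[k]),
                                            args=(resp[:, k],)).x))
                      for k in range(2)]
        print("weights:", w, "params:", params)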

  2. A meta-analysis of neuroimaging studies on divergent thinking using activation likelihood estimation.

    PubMed

    Wu, Xin; Yang, Wenjing; Tong, Dandan; Sun, Jiangzhou; Chen, Qunlin; Wei, Dongtao; Zhang, Qinglin; Zhang, Meng; Qiu, Jiang

    2015-07-01

    In this study, an activation likelihood estimation (ALE) meta-analysis was used to conduct a quantitative investigation of neuroimaging studies on divergent thinking. Based on the ALE results, the functional magnetic resonance imaging (fMRI) studies showed that distributed brain regions were more active under divergent thinking tasks (DTTs) than those under control tasks, but a large portion of the brain regions were deactivated. The ALE results indicated that the brain networks of the creative idea generation in DTTs may be composed of the lateral prefrontal cortex, posterior parietal cortex [such as the inferior parietal lobule (BA 40) and precuneus (BA 7)], anterior cingulate cortex (ACC) (BA 32), and several regions in the temporal cortex [such as the left middle temporal gyrus (BA 39), and left fusiform gyrus (BA 37)]. The left dorsolateral prefrontal cortex (BA 46) was related to selecting the loosely and remotely associated concepts and organizing them into creative ideas, whereas the ACC (BA 32) was related to observing and forming distant semantic associations in performing DTTs. The posterior parietal cortex may be involved in the semantic information related to the retrieval and buffering of the formed creative ideas, and several regions in the temporal cortex may be related to the stored long-term memory. In addition, the ALE results of the structural studies showed that divergent thinking was related to the dopaminergic system (e.g., left caudate and claustrum). Based on the ALE results, both fMRI and structural MRI studies could uncover the neural basis of divergent thinking from different aspects (e.g., specific cognitive processing and stable individual difference of cognitive capability). © 2015 Wiley Periodicals, Inc.

  3. Joint programmes in paediatric cardiothoracic surgery: a survey and descriptive analysis.

    PubMed

    DeCampli, William M

    2011-12-01

    Joint programmes, as opposed to regionalisation of paediatric cardiac care, may improve outcomes while preserving accessibility. We determined the prevalence and nature of joint programmes. We sent an online survey to 125 paediatric cardiac surgeons in the United States in November, 2009 querying the past or present existence of a joint programme, its mission, structure, function, and perceived success. A total of 65 surgeon responses from 65 institutions met the criteria for inclusion. Of the 65 institutions, 22 currently or previously conducted a joint programme. Compared with primary institutions, partner institutions were less often children's hospitals (p = 0.0004), had fewer paediatric beds (p = 0.005), and performed fewer cardiac cases (p = 0.03). Approximately 47% of partner hospitals performed fewer than 50 cases per year. The median distance between hospitals was in the 41-60 mile range, with distances ranging from 5 to 1000 miles. Approximately 54% of partner hospitals had no surgeon working primarily on-site, and 31% of the programmes conducted joint conferences. Approximately 67% of the programmes limited the complexity of cases at the partner hospital, and 83% of the programmes had formal contracts between hospitals. Of the six programmes whose main mission was to increase referrals to the primary hospital, three were felt to have failed. Of the nine programmes whose mission was to increase regional quality, eight were felt to be successful. Joint programmes in paediatric cardiac surgery are common but are heterogeneous in structure and function. Programmes whose mission is to improve the quality of regional care seem more likely to succeed. Joint programmes may be a practical alternative to regionalisation to achieve better outcomes.

  4. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given finite element model of an adhesively bonded joint, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
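
    The core post-processing step described above amounts to dividing each adhesive spring's force components by a tributary bond area and ranking the results. A hedged sketch of that arithmetic follows; the function name, array layout, and units are hypothetical, and the real tools also parse the finite element input/output files and write plot files.

    ```python
    import numpy as np

    def adhesive_stresses(spring_forces, tributary_area_mm2):
        """spring_forces: (n, 3) array; rows = [shear_x, shear_y, peel] in N."""
        f = np.asarray(spring_forces, dtype=float)
        shear = np.hypot(f[:, 0], f[:, 1]) / tributary_area_mm2  # resultant shear, MPa
        peel = f[:, 2] / tributary_area_mm2                      # normal (peel) stress, MPa
        order = np.argsort(-np.abs(peel))                        # descending, as the tools do
        return shear[order], peel[order], order

    shear, peel, ids = adhesive_stresses([[120.0, 40.0, 15.0], [60.0, 10.0, 80.0]], 25.0)
    print("peel stresses (MPa), worst first:", peel)
    ```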

  5. Design and fabrication of realistic adhesively bonded joints

    NASA Technical Reports Server (NTRS)

    Shyprykevich, P.

    1983-01-01

    Eighteen bonded joint test specimens representing three different designs of a composite wing chordwise bonded splice were designed and fabricated using current aircraft industry practices. Three types of joints (full wing laminate penetration, two side stepped; midthickness penetration, one side stepped; and partial penetration, scarfed) were analyzed using state-of-the-art elastic joint analysis modified for plastic behavior of the adhesive. The static tensile failure load at room temperature was predicted to be: (1) 1026 kN/m (5860 lb/in) for the two side stepped joint; (2) 925 kN/m (5287 lb/in) for the one side stepped joint; and (3) 1330 kN/m (7600 lb/in) for the scarfed joint. All joints were designed to fail in the adhesive.

  6. An inelastic analysis of a welded aluminum joint

    NASA Astrophysics Data System (ADS)

    Vaughan, R. E.

    1994-09-01

    Butt-weld joints are most commonly designed into pressure vessels, which then become as reliable as the weakest increment in the weld chain. In practice, weld material properties are determined from tensile test specimens and provided to the stress analyst in the form of a stress versus strain diagram. Variations in properties through the thickness of the weld and along the width of the weld have been suspected but not explored because of inaccessibility and cost. The purpose of this study is to investigate analytical and computational methods used for the analysis of welds. The weld specimens are analyzed using classical elastic and plastic theory to provide a basis for modeling the inelastic properties in a finite-element solution. The results of the analysis are compared to experimental data to determine the weld behavior and the accuracy of prediction methods. The weld considered in this study is a multiple-pass aluminum 2219-T87 butt weld with a thickness of 1.40 in. The weld specimen is modeled using the finite-element code ABAQUS. The finite-element model is used to produce the stress-strain behavior in the elastic and plastic regimes and to determine Poisson's ratio in the plastic region. The value of Poisson's ratio in the plastic regime is then compared to experimental data. The results of the comparisons are used to explain multipass weld behavior and to make recommendations concerning the analysis and testing of welds.

  7. An inelastic analysis of a welded aluminum joint

    NASA Technical Reports Server (NTRS)

    Vaughan, R. E.

    1994-01-01

    Butt-weld joints are most commonly designed into pressure vessels, which then become as reliable as the weakest increment in the weld chain. In practice, weld material properties are determined from tensile test specimens and provided to the stress analyst in the form of a stress versus strain diagram. Variations in properties through the thickness of the weld and along the width of the weld have been suspected but not explored because of inaccessibility and cost. The purpose of this study is to investigate analytical and computational methods used for the analysis of welds. The weld specimens are analyzed using classical elastic and plastic theory to provide a basis for modeling the inelastic properties in a finite-element solution. The results of the analysis are compared to experimental data to determine the weld behavior and the accuracy of prediction methods. The weld considered in this study is a multiple-pass aluminum 2219-T87 butt weld with a thickness of 1.40 in. The weld specimen is modeled using the finite-element code ABAQUS. The finite-element model is used to produce the stress-strain behavior in the elastic and plastic regimes and to determine Poisson's ratio in the plastic region. The value of Poisson's ratio in the plastic regime is then compared to experimental data. The results of the comparisons are used to explain multipass weld behavior and to make recommendations concerning the analysis and testing of welds.

  8. Synthesizing Regression Results: A Factored Likelihood Method

    ERIC Educational Resources Information Center

    Wu, Meng-Jia; Becker, Betsy Jane

    2013-01-01

    Regression methods are widely used by researchers in many fields, yet methods for synthesizing regression results are scarce. This study proposes using a factored likelihood method, originally developed to handle missing data, to appropriately synthesize regression models involving different predictors. This method uses the correlations reported…

  9. Preliminary results on the fracture analysis of multi-site cracking of lap joints in aircraft skins

    NASA Astrophysics Data System (ADS)

    Beuth, J. L., Jr.; Hutchinson, John W.

    1992-07-01

    Results of a fracture mechanics analysis relevant to fatigue crack growth at rivets in lap joints of aircraft skins are presented. Multi-site damage (MSD) is receiving increased attention within the context of problems of aging aircraft. Fracture analyses previously carried out include small-scale modeling of rivet/skin interactions, larger-scale two-dimensional models of lap joints similar to that developed here, and full scale three-dimensional models of large portions of the aircraft fuselage. Fatigue testing efforts have included flat coupon specimens, two-dimensional lap joint tests, and full scale tests on specimens designed to closely duplicate aircraft sections. Most of this work is documented in the proceedings of previous symposia on the aging aircraft problem. The effect MSD has on the ability of skin stiffeners to arrest the growth of long skin cracks is a particularly important topic that remains to be addressed. One of the most striking features of MSD observed in joints of some test sections and in the joints of some of the older aircraft fuselages is the relative uniformity of the fatigue cracks from rivet to rivet along an extended row of rivets. This regularity suggests that nucleation of the cracks must not be overly difficult. Moreover, it indicates that there is some mechanism which keeps longer cracks from running away from shorter ones, or, equivalently, a mechanism for shorter cracks to catch up with longer cracks. This basic mechanism has not been identified, and one of the objectives of the work is to see to what extent the mechanism is revealed by a fracture analysis of the MSD cracks. Another related aim is to present accurate stress intensity factor variations with crack length which can be used to estimate fatigue crack growth lifetimes once cracks have been initiated. Results are presented which illustrate the influence of load shedding from rivets with long cracks to neighboring rivets with shorter cracks. Results are also included
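
    To illustrate how stress intensity factor variations K(a) of the kind presented here feed into fatigue-life estimates, the following sketch integrates a generic Paris crack-growth law; the Paris constants, stress level, and center-crack form of delta-K are assumptions for illustration, not values from this analysis.

    ```python
    import numpy as np

    C, m = 1e-11, 3.0                      # Paris-law constants (illustrative, SI units)
    sigma = 100e6                          # remote stress amplitude, Pa

    def delta_K(a):
        # generic center-crack form; the MSD-specific K(a) curves would replace this
        return sigma * np.sqrt(np.pi * a)

    a = np.linspace(1e-3, 10e-3, 2000)     # crack length, m
    dN_da = 1.0 / (C * delta_K(a) ** m)    # invert da/dN = C * (delta K)^m
    life_cycles = np.trapz(dN_da, a)       # N = integral of dN/da over crack growth
    print(f"estimated life: {life_cycles:.3g} cycles")
    ```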

  10. Preliminary results on the fracture analysis of multi-site cracking of lap joints in aircraft skins

    NASA Technical Reports Server (NTRS)

    Beuth, J. L., Jr.; Hutchinson, John W.

    1992-01-01

    Results of a fracture mechanics analysis relevant to fatigue crack growth at rivets in lap joints of aircraft skins are presented. Multi-site damage (MSD) is receiving increased attention within the context of problems of aging aircraft. Fracture analyses previously carried out include small-scale modeling of rivet/skin interactions, larger-scale two-dimensional models of lap joints similar to that developed here, and full scale three-dimensional models of large portions of the aircraft fuselage. Fatigue testing efforts have included flat coupon specimens, two-dimensional lap joint tests, and full scale tests on specimens designed to closely duplicate aircraft sections. Most of this work is documented in the proceedings of previous symposia on the aging aircraft problem. The effect MSD has on the ability of skin stiffeners to arrest the growth of long skin cracks is a particularly important topic that remains to be addressed. One of the most striking features of MSD observed in joints of some test sections and in the joints of some of the older aircraft fuselages is the relative uniformity of the fatigue cracks from rivet to rivet along an extended row of rivets. This regularity suggests that nucleation of the cracks must not be overly difficult. Moreover, it indicates that there is some mechanism which keeps longer cracks from running away from shorter ones, or, equivalently, a mechanism for shorter cracks to catch up with longer cracks. This basic mechanism has not been identified, and one of the objectives of the work is to see to what extent the mechanism is revealed by a fracture analysis of the MSD cracks. Another related aim is to present accurate stress intensity factor variations with crack length which can be used to estimate fatigue crack growth lifetimes once cracks have been initiated. Results are presented which illustrate the influence of load shedding from rivets with long cracks to neighboring rivets with shorter cracks. Results are also included

  11. Macro-level vulnerable road users crash analysis: A Bayesian joint modeling approach of frequency and proportion.

    PubMed

    Cai, Qing; Abdel-Aty, Mohamed; Lee, Jaeyoung

    2017-10-01

    This study aims at contributing to the literature on pedestrian and bicyclist safety by building on conventional count regression models to explore exogenous factors affecting pedestrian and bicyclist crashes at the macroscopic level. In traditional count models, the effects of exogenous factors on non-motorist crashes are investigated directly. However, vulnerable road users' crashes are collisions between vehicles and non-motorists, so the exogenous factors can affect non-motorist crashes through both the non-motorists and the vehicle drivers. To accommodate the potentially different impacts of exogenous factors, we express the non-motorist crash count as the product of the total crash count and the proportion of non-motorist crashes, and formulate a joint model of the negative binomial (NB) model and the logit model to deal with the two parts, respectively. The formulated joint model is estimated using non-motorist crash data based on the Traffic Analysis Districts (TADs) in Florida. Meanwhile, the traditional NB model is also estimated and compared with the joint model. The result indicates that the joint model provides a better data fit and can identify more significant variables. Subsequently, a novel joint screening method is suggested based on the proposed model to identify hot zones for non-motorist crashes. The hot zones of non-motorist crashes are identified and divided into three types: hot zones with a more dangerous driving environment only, hot zones with more hazardous walking and cycling conditions only, and hot zones with both. It is expected that the joint model and screening method can help decision makers, transportation officials, and community planners make more efficient treatments to proactively improve pedestrian and bicyclist safety. Published by Elsevier Ltd.
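
    A simplified, non-Bayesian analogue of the two-part idea can be sketched with statsmodels: a negative binomial model for the total crash count and a binomial logit for the non-motorist share. The paper estimates both parts jointly in a Bayesian framework, so fitting them separately on synthetic data, as below, is only an illustration; all variable names and coefficients are hypothetical.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500
    X = sm.add_constant(rng.normal(size=(n, 2)))                # hypothetical TAD covariates
    lam = np.exp(0.3 + 0.6 * X[:, 1]) * rng.gamma(2.0, 0.5, n)  # overdispersed mean
    total = rng.poisson(lam)                                    # total crashes per district
    share = 1 / (1 + np.exp(-(-2.0 + 0.8 * X[:, 2])))           # latent non-motorist share
    nonmotor = rng.binomial(total, share)

    count_part = sm.NegativeBinomial(total, X).fit(disp=0)      # frequency (NB) part
    mask = total > 0                                            # proportion defined only here
    prop_part = sm.GLM(np.column_stack([nonmotor[mask], (total - nonmotor)[mask]]),
                       X[mask], family=sm.families.Binomial()).fit()  # proportion (logit) part
    print(count_part.params.round(2))
    print(prop_part.params.round(2))
    ```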

  12. Risk Assessment Using the Three Dimensions of Probability (Likelihood), Severity, and Level of Control

    NASA Technical Reports Server (NTRS)

    Watson, Clifford C.

    2011-01-01

    Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by the relative likelihood and severity of the residual risk. These matrices present a quick look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes while adding a new third dimension, shown as the Z-axis and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training). 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting, gives a visual confirmation of the relationship between causes and their controls.

  13. [Correlation analysis on the disorders of patella-femoral joint and torsional deformity of tibia].

    PubMed

    Sun, Zhen-Jie; Yuan, Yi; Liu, Rui-Bo

    2015-03-01

    Torsional deformity of the tibia is always accompanied by instability of the patella-femoral joint, especially under dynamic conditions, thus causing PFDA. The analysis can not only provide alignment information and the degenerative condition of the patella-femoral joint, but also offer guidance, through analysis of this relationship, for better clinical prevention and early treatment of degenerative bone and joint disease.

  14. Direct measurement of group delay with joint time-frequency analysis of a white-light spectral interferogram.

    PubMed

    Deng, Yuqiang; Yang, Weijian; Zhou, Chun; Wang, Xi; Tao, Jun; Kong, Weipeng; Zhang, Zhigang

    2008-12-01

    We propose and demonstrate an analysis method to directly extract the group delay, rather than the phase, from a white-light spectral interferogram. With the joint time-frequency analysis technique, the group delay is read directly from the ridge of the wavelet transform, and the group-delay dispersion is easily obtained by an additional differentiation. The technique shows reasonable potential for the characterization of ultra-broadband chirped mirrors.

  15. A joint probability approach for coincidental flood frequency analysis at ungauged basin confluences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Cheng

    2016-03-12

    A reliable and accurate flood frequency analysis at the confluence of streams is of importance. Given that long-term peak flow observations are often unavailable at tributary confluences, at a practical level, this paper presents a joint probability approach (JPA) to address the coincidental flood frequency analysis at the ungauged confluence of two streams based on the flow rate data from the upstream tributaries. One case study is performed for comparison against several traditional approaches, including the plotting-position formula, the univariate flood frequency analysis, and the National Flood Frequency Program developed by the US Geological Survey. It shows that the results generated by the JPA agree well with the floods estimated by the plotting-position and univariate flood frequency analyses based on the observation data.
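
    The joint-probability idea can be illustrated with a Monte Carlo sketch: sample coincident peak flows for the two tributaries from fitted marginal distributions, combine them, and read off quantiles at the confluence. The Gumbel marginals, parameters, independence assumption, and simple summation below are illustrative stand-ins; the paper's JPA treats the joint behavior of the coincident flows more carefully.

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(42)
    # annual peak flows sampled from fitted marginals for each tributary (m^3/s)
    q1 = gumbel_r.rvs(loc=300, scale=80, size=100_000, random_state=rng)
    q2 = gumbel_r.rvs(loc=150, scale=50, size=100_000, random_state=rng)
    q_conf = q1 + q2                       # independence assumed here for simplicity

    for T in (10, 50, 100):                # return periods in years
        print(T, "yr flood ~", np.quantile(q_conf, 1 - 1 / T).round(0), "m^3/s")
    ```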

  16. The Neural Bases of Difficult Speech Comprehension and Speech Production: Two Activation Likelihood Estimation (ALE) Meta-Analyses

    ERIC Educational Resources Information Center

    Adank, Patti

    2012-01-01

    The role of speech production mechanisms in difficult speech comprehension is the subject of on-going debate in speech science. Two Activation Likelihood Estimation (ALE) analyses were conducted on neuroimaging studies investigating difficult speech comprehension or speech production. Meta-analysis 1 included 10 studies contrasting comprehension…

  17. Estimation Methods for Non-Homogeneous Regression - Minimum CRPS vs Maximum Likelihood

    NASA Astrophysics Data System (ADS)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2017-04-01

    Non-homogeneous regression models are widely used to statistically post-process numerical weather prediction models. Such regression models correct for errors in mean and variance and are capable of forecasting a full probability distribution. To estimate the corresponding regression coefficients, CRPS minimization has been performed in many meteorological post-processing studies over the last decade. In contrast to maximum likelihood estimation, CRPS minimization is claimed to yield more calibrated forecasts. Theoretically, both scoring rules, used as optimization criteria, should be able to locate the same (unknown) optimum. Discrepancies might result from a wrong distributional assumption about the observed quantity. To address this theoretical concept, this study compares maximum likelihood and minimum CRPS estimation for different distributional assumptions. First, a synthetic case study shows that, for an appropriate distributional assumption, both estimation methods yield similar regression coefficients. The log-likelihood estimator is slightly more efficient. A real-world case study for surface temperature forecasts at different sites in Europe confirms these results but shows that surface temperature does not always follow the classical assumption of a Gaussian distribution. KEYWORDS: ensemble post-processing, maximum likelihood estimation, CRPS minimization, probabilistic temperature forecasting, distributional regression models
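
    For a Gaussian predictive distribution, both estimators in this comparison have simple forms: the negative log-likelihood, and the closed-form Gaussian CRPS of Gneiting and Raftery. The sketch below, on synthetic data with an assumed linear model for the mean and a log-link for the spread, is an illustration of the comparison rather than the study's actual setup.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(7)
    x = rng.normal(size=2000)
    y = 1.0 + 2.0 * x + rng.normal(scale=1.5, size=2000)

    def unpack(p):
        mu = p[0] + p[1] * x
        sigma = np.exp(p[2])              # log-link keeps the spread positive
        return mu, sigma

    def neg_loglik(p):
        mu, sigma = unpack(p)
        return -norm.logpdf(y, mu, sigma).sum()

    def mean_crps(p):                      # closed-form CRPS of N(mu, sigma^2) at y
        mu, sigma = unpack(p)
        z = (y - mu) / sigma
        return (sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z)
                         - 1 / np.sqrt(np.pi))).mean()

    p0 = np.zeros(3)
    print("ML  :", minimize(neg_loglik, p0).x)
    print("CRPS:", minimize(mean_crps, p0).x)   # near-identical coefficients when the
                                                # Gaussian assumption is appropriate
    ```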

  18. Experimental analysis of mechanical joints strength by means of energy dissipation

    NASA Astrophysics Data System (ADS)

    Wolf, Alexander; Lafarge, Remi; Kühn, Tino; Brosius, Alexander

    2018-05-01

    Designing complex structures with a demand for weight reduction leads directly to multi-material concepts. Such material mixes have to be joined securely; welding, mechanical joining and adhesives are commonly used for that purpose, and sometimes a combination of at least two methods is useful to combine their individual advantages. The challenge is the non-destructive testing of these connections, because destructive testing requires extensive preparation and expensive testing equipment. Known non-destructive methods are radiography, thermography and ultrasonic testing; unfortunately, these methods are difficult to apply and often not usable with fibre-reinforced plastics. The authors present a testing method based on measuring and analysing the energy dissipation in mechanical joints. The approach measures the propagation of the elastic strain wave through the joint: a defined impact strain is detected by strain gauges, with the transmitter located on one side of the joint and the receiver on the other. Energy dissipates as the wave passes through the joint area, mainly through friction damping and material-specific damping, and poorly executed joints show an effect especially in the friction damping. Measuring the strains and the resulting energy loss therefore gives a statement about the connection quality, and possible defects introduced during joining can be identified from the energy loss and the strain versus time curve. After describing the method, the authors present the results of energy dissipation measurements on a bolted assembly with different locking torques; the adjustable tightening torque of the screw connections makes it easy to apply and analyse a variation of the contact pressure. The outlook discusses the usability of the method for other mechanical joints and for fibre-reinforced plastics.

  19. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Zeng, X.

    2015-12-01

    A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated through a model's marginal likelihood and prior probability. The heavy computational burden hinders the implementation of BMA prediction, especially for elaborate marginal likelihood estimators. To overcome this burden, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models in a numerical experiment with a synthetic groundwater model. BMA predictions depend on the model posterior weights (or marginal likelihoods), and this study therefore also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), the harmonic mean estimator (HME), the stabilized harmonic mean estimator (SHME), and the thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating conceptual models' marginal likelihoods, and BMA-TIE has better predictive performance than the other BMA predictions. TIE is also highly stable: repeated estimates of a conceptual model's marginal likelihood by TIE show significantly less variability than those from the other estimators. In addition, the SG surrogates are efficient in facilitating BMA predictions, especially for BMA-TIE. The number of model executions needed to build the surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
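
    The estimators compared in this study can be demonstrated on a toy conjugate Gaussian model, where the exact evidence is available by quadrature for reference. The sketch below implements AME (prior draws), HME (posterior draws), and TIE (power posteriors sampled directly, which is possible only because the toy model is Gaussian); it is illustrative only and unrelated to the paper's groundwater experiment. SHME is omitted for brevity.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)
    y = rng.normal(0.5, 1.0, size=20)          # data: y_i ~ N(theta, 1), prior theta ~ N(0, 1)

    def loglik(theta):                          # log L(theta) for a vector of theta values
        return norm.logpdf(y[:, None], theta, 1.0).sum(axis=0)

    # reference: evidence by brute-force quadrature over the prior
    grid = np.linspace(-5, 5, 20001)
    exact = np.log(np.trapz(np.exp(loglik(grid)) * norm.pdf(grid), grid))

    M = 20000
    ame = np.log(np.mean(np.exp(loglik(rng.normal(0, 1, M)))))         # arithmetic mean

    n, ybar = y.size, y.mean()                  # conjugate posterior: N(n*ybar/(n+1), 1/(n+1))
    post = rng.normal(n * ybar / (n + 1), np.sqrt(1 / (n + 1)), M)
    hme = -np.log(np.mean(np.exp(-loglik(post))))                      # harmonic mean

    # thermodynamic integration: log Z = integral over beta of E_beta[log L]
    betas = np.linspace(0, 1, 21)
    e_logl = []
    for b in betas:
        var_b = 1 / (b * n + 1)                 # power posterior is Gaussian here
        draws = rng.normal(b * n * ybar * var_b, np.sqrt(var_b), M)
        e_logl.append(loglik(draws).mean())
    tie = np.trapz(e_logl, betas)

    print({"exact": exact, "AME": ame, "HME": hme, "TIE": tie})
    ```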

  20. Principal component analysis in construction of 3D human knee joint models using a statistical shape model method.

    PubMed

    Tsai, Tsung-Yuan; Li, Jing-Sheng; Wang, Shaobai; Li, Pingyue; Kwon, Young-Min; Li, Guoan

    2015-01-01

    The statistical shape model (SSM) method that uses 2D images of the knee joint to predict the three-dimensional (3D) joint surface model has been reported in the literature. In this study, we constructed a SSM database using 152 human computed tomography (CT) knee joint models, including the femur, tibia and patella and analysed the characteristics of each principal component of the SSM. The surface models of two in vivo knees were predicted using the SSM and their 2D bi-plane fluoroscopic images. The predicted models were compared to their CT joint models. The differences between the predicted 3D knee joint surfaces and the CT image-based surfaces were 0.30 ± 0.81 mm, 0.34 ± 0.79 mm and 0.36 ± 0.59 mm for the femur, tibia and patella, respectively (average ± standard deviation). The computational time for each bone of the knee joint was within 30 s using a personal computer. The analysis of this study indicated that the SSM method could be a useful tool to construct 3D surface models of the knee with sub-millimeter accuracy in real time. Thus, it may have a broad application in computer-assisted knee surgeries that require 3D surface models of the knee.
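
    The PCA machinery behind an SSM can be sketched in a few lines: stack corresponded surface points of the training shapes, extract the principal modes, and reconstruct a shape from a few mode coefficients. The random stand-in data below (assuming scikit-learn) replaces the corresponded CT meshes that real use requires, and the paper's 2D-fluoroscopy fitting step is not shown.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    n_shapes, n_points = 152, 5000                     # 152 training knees, as in the study
    shapes = np.random.default_rng(0).normal(size=(n_shapes, n_points * 3))  # stand-in data

    pca = PCA(n_components=20).fit(shapes)             # principal modes of shape variation
    coeffs = pca.transform(shapes[:1])                 # project one shape onto the modes
    recon = pca.inverse_transform(coeffs)              # reconstruct it from 20 modes
    print(pca.explained_variance_ratio_[:5], np.abs(recon - shapes[0]).mean())
    ```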

  1. A general model for likelihood computations of genetic marker data accounting for linkage, linkage disequilibrium, and mutations.

    PubMed

    Kling, Daniel; Tillmar, Andreas; Egeland, Thore; Mostad, Petter

    2015-09-01

    Several applications necessitate an unbiased determination of relatedness, be it in linkage or association studies or in a forensic setting. An appropriate model to compute the joint probability of some genetic data for a set of persons given some hypothesis about the pedigree structure is then required. The increasing number of markers available through high-density SNP microarray typing and NGS technologies intensifies the demand, where using a large number of markers may lead to biased results due to strong dependencies between closely located loci, both within pedigrees (linkage) and in the population (allelic association or linkage disequilibrium (LD)). We present a new general model, based on a Markov chain for inheritance patterns and another Markov chain for founder allele patterns, the latter allowing us to account for LD. We also demonstrate a specific implementation for X-chromosomal markers that allows for computation of likelihoods based on hypotheses of alleged relationships and genetic marker data. The algorithm can simultaneously account for linkage, LD, and mutations. We demonstrate its feasibility using simulated examples. The algorithm is implemented in the software FamLinkX, providing a user-friendly GUI for Windows systems (FamLinkX, as well as further usage instructions, is freely available at www.famlink.se ). Our software provides the necessary means to solve cases where no previous implementation exists. In addition, the software can perform simulations in order to further study the impact of linkage and LD on computed likelihoods for an arbitrary set of markers.

  2. Insecticide resistance, control failure likelihood and the First Law of Geography.

    PubMed

    Guedes, Raul Narciso C

    2017-03-01

    Insecticide resistance is a broadly recognized ecological backlash resulting from insecticide use and is widely reported among arthropod pest species, with well-recognized underlying mechanisms and consequences. Nonetheless, insecticide resistance is the subject of evolving conceptual views that introduce a different concept that is useful if recognized in its own right - the risk or likelihood of control failure. Here we suggest an experimental approach to assess the likelihood of control failure of an insecticide, allowing for consistent decision-making regarding the management of insecticide resistance. We also challenge the current emphasis on limited spatial sampling of arthropod populations for resistance diagnosis in favor of comprehensive spatial sampling. This necessarily requires larger population sampling - aiming to use spatial analysis in area-wide surveys - to recognize focal points of insecticide resistance and/or control failure that will better direct management efforts. The geographical scale of such surveys will depend on the arthropod pest species, the pattern of insecticide use and many other potential factors. Regardless, distance dependence among sampling sites should still hold, following the maxim that the closer two things are, the more they resemble each other, which is the basis of Tobler's First Law of Geography. © 2016 Society of Chemical Industry.

  3. Silence That Can Be Dangerous: A Vignette Study to Assess Healthcare Professionals’ Likelihood of Speaking up about Safety Concerns

    PubMed Central

    Schwappach, David L. B.; Gehring, Katrin

    2014-01-01

    Purpose To investigate the likelihood of speaking up about patient safety in oncology and to clarify the effect of clinical and situational context factors on the likelihood of voicing concerns. Patients and Methods 1013 nurses and doctors in oncology rated four clinical vignettes describing coworkers' errors and rule violations in a self-administered factorial survey (65% response rate). Multiple regression analysis was used to model the likelihood of speaking up as an outcome of vignette attributes, responders' evaluations of the situation and personal characteristics. Results Respondents reported a high likelihood of speaking up about patient safety, but the variation between and within types of errors and rule violations was substantial. Staff without managerial function reported significantly higher levels of decision difficulty and discomfort in speaking up. Based on the information presented in the vignettes, 74%-96% would speak up to a supervisor failing to check a prescription, 45%-81% would point a coworker to a missed hand disinfection, 82%-94% would speak up to nurses who violate a safety rule in medication preparation, and 59%-92% would question a doctor violating a safety rule in lumbar puncture. Several vignette attributes predicted the likelihood of speaking up. Perceived potential harm, anticipated discomfort, and decision difficulty were significant predictors of the likelihood of speaking up. Conclusions Clinicians' willingness to speak up about patient safety is considerably affected by contextual factors. Physicians and nurses without managerial function report substantial discomfort with speaking up. Oncology departments should provide staff with clear guidance and training on when and how to voice safety concerns. PMID:25116338

  4. Bayesian experimental design for models with intractable likelihoods.

    PubMed

    Drovandi, Christopher C; Pettitt, Anthony N

    2013-12-01

    In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables. © 2013, The International Biometric Society.
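
    The ABC rejection step on which the design utility is built can be sketched very compactly: draw parameters from the prior, simulate, and keep draws whose simulated summary lands within a tolerance of the observed one. The toy Poisson model, prior, and tolerance below are assumptions for illustration; the paper applies this inside a Markov chain Monte Carlo search over designs.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    obs = 37                                    # observed summary statistic

    draws = rng.uniform(0, 100, size=200_000)   # prior on the rate parameter lambda
    sims = rng.poisson(draws)                   # simulate once per prior draw (toy model)
    posterior = draws[np.abs(sims - obs) <= 2]  # keep draws within tolerance eps = 2

    # the spread of the accepted draws is the ABC posterior precision the
    # design utility would be built on
    print(posterior.mean(), posterior.std(), posterior.size)
    ```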

  5. Likelihood Ratios for Glaucoma Diagnosis Using Spectral Domain Optical Coherence Tomography

    PubMed Central

    Lisboa, Renato; Mansouri, Kaweh; Zangwill, Linda M.; Weinreb, Robert N.; Medeiros, Felipe A.

    2014-01-01

    Purpose To present a methodology for calculating likelihood ratios for glaucoma diagnosis for continuous retinal nerve fiber layer (RNFL) thickness measurements from spectral-domain optical coherence tomography (spectral-domain OCT). Design Observational cohort study. Methods 262 eyes of 187 patients with glaucoma and 190 eyes of 100 control subjects were included in the study. Subjects were recruited from the Diagnostic Innovations Glaucoma Study. Eyes with preperimetric and perimetric glaucomatous damage were included in the glaucoma group. The control group was composed of healthy eyes with normal visual fields from subjects recruited from the general population. All eyes underwent RNFL imaging with Spectralis spectral-domain OCT. Likelihood ratios for glaucoma diagnosis were estimated for specific global RNFL thickness measurements using a methodology based on estimating the tangents to the Receiver Operating Characteristic (ROC) curve. Results Likelihood ratios could be determined for continuous values of average RNFL thickness. Average RNFL thickness values lower than 86 μm were associated with positive LRs, i.e., LRs greater than 1, whereas RNFL thickness values higher than 86 μm were associated with negative LRs, i.e., LRs smaller than 1. A modified Fagan nomogram was provided to assist calculation of the post-test probability of disease from the calculated likelihood ratios and the pretest probability of disease. Conclusion The methodology allowed calculation of likelihood ratios for specific RNFL thickness values. By avoiding arbitrary categorization of test results, it potentially allows for an improved integration of test results into diagnostic clinical decision-making. PMID:23972303
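
    The arithmetic connecting a likelihood ratio to a post-test probability (the Fagan-nomogram step mentioned above) is simple to sketch. The Gaussian group densities below are a stand-in for the paper's ROC-tangent method, and their means and SDs are illustrative assumptions, so the LR = 1 crossover will not land exactly at the 86 μm value reported in the abstract.

    ```python
    from scipy.stats import norm

    def likelihood_ratio(rnfl_um, mu_g=70.0, sd_g=12.0, mu_h=98.0, sd_h=10.0):
        # ratio of the glaucoma-group density to the healthy-group density
        return norm.pdf(rnfl_um, mu_g, sd_g) / norm.pdf(rnfl_um, mu_h, sd_h)

    def post_test_probability(pretest_p, lr):
        odds = pretest_p / (1 - pretest_p) * lr   # Fagan-nomogram arithmetic on odds
        return odds / (1 + odds)

    lr = likelihood_ratio(75.0)                   # a thin RNFL measurement
    print(lr, post_test_probability(0.30, lr))    # pretest 30% -> post-test probability
    ```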

  6. Engineering studies on joint bar integrity, part II : finite element analysis

    DOT National Transportation Integrated Search

    2014-04-02

    This paper is the second in a two-part series describing : research sponsored by the Federal Railroad Administration : (FRA) to study the structural integrity of joint bars. In Part I, : observations from field surveys of joint bar inspections : cond...

  7. Adaptive Postural Control for Joint Immobilization during Multitask Performance

    PubMed Central

    Hsu, Wei-Li

    2014-01-01

    Motor abundance is an essential feature of adaptive control. The range of joint combinations enabled by motor abundance provides the body with the necessary freedom to adopt different positions, configurations, and movements that allow for exploratory postural behavior. This study investigated the adaptation of postural control to joint immobilization during multi-task performance. Twelve healthy volunteers (6 males and 6 females; 21–29 yr) without any known neurological deficits, musculoskeletal conditions, or balance disorders participated in this study. The participants executed a targeting task, alone or combined with a ball-balancing task, while standing with free or restricted joint motions. The effects of joint configuration variability on center of mass (COM) stability were examined using uncontrolled manifold (UCM) analysis. The UCM method separates joint variability into two components: the first is consistent with the use of motor abundance, which does not affect COM position (VUCM); the second leads to COM position variability (VORT). The analysis showed that joints were coordinated such that their variability had a minimal effect on COM position. However, the component of joint variability that reflects the use of motor abundance to stabilize COM (VUCM) was significantly decreased when the participants performed the combined task with immobilized joints. The component of joint variability that leads to COM variability (VORT) tended to increase with a reduction in joint degrees of freedom. The results suggested that joint immobilization increases the difficulty of stabilizing COM when multiple tasks are performed simultaneously. These findings are important for developing rehabilitation approaches for patients with limited joint movements. PMID:25329477
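
    The UCM decomposition itself is a short linear-algebra computation once a Jacobian linking joint angles to COM position is available: project joint-angle deviations onto the null space of the Jacobian (VUCM) and onto its complement (VORT). The Jacobian and angle data below are random stand-ins for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    J = rng.normal(size=(2, 3))               # d(COM_xy)/d(joint angles): stand-in Jacobian

    angles = rng.normal(size=(100, 3))        # joint angles across trials (stand-in)
    dev = angles - angles.mean(axis=0)        # deviations from the mean configuration

    _, _, vt = np.linalg.svd(J)
    null_basis = vt[J.shape[0]:].T            # (3, 1) basis of the uncontrolled manifold

    dev_ucm = dev @ null_basis @ null_basis.T # component leaving COM unchanged
    dev_ort = dev - dev_ucm                   # component that moves the COM

    n, d_ucm, d_ort = dev.shape[0], null_basis.shape[1], J.shape[0]
    v_ucm = (dev_ucm ** 2).sum() / (d_ucm * n)   # VUCM, per dof per trial
    v_ort = (dev_ort ** 2).sum() / (d_ort * n)   # VORT, per dof per trial
    print(f"VUCM={v_ucm:.3f}  VORT={v_ort:.3f}") # VUCM > VORT indicates a synergy
    ```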

  8. A Comparison of a Bayesian and a Maximum Likelihood Tailored Testing Procedure.

    ERIC Educational Resources Information Center

    McKinley, Robert L.; Reckase, Mark D.

    A study was conducted to compare tailored testing procedures based on a Bayesian ability estimation technique and on a maximum likelihood ability estimation technique. The Bayesian tailored testing procedure selected items so as to minimize the posterior variance of the ability estimate distribution, while the maximum likelihood tailored testing…

  9. The contribution of quasi-joint stiffness of the ankle joint to gait in patients with hemiparesis.

    PubMed

    Sekiguchi, Yusuke; Muraki, Takayuki; Kuramatsu, Yuko; Furusawa, Yoshihito; Izumi, Shin-Ichi

    2012-06-01

    The role of ankle joint stiffness during gait in patients with hemiparesis has not been clarified. The purpose of this study was to determine the contribution of quasi-joint stiffness of the ankle joint to spatiotemporal and kinetic gait parameters in patients with hemiparesis due to brain tumor or stroke and in healthy individuals. Spatiotemporal and kinetic gait parameters for twelve patients with hemiparesis due to brain tumor or stroke and nine healthy individuals were measured with a 3-dimensional motion analysis system. Quasi-joint stiffness was calculated from the slope of the linear regression of the moment-angle curve of the ankle joint during the second rocker. There was no significant difference in quasi-joint stiffness among both sides of patients and the right side of controls. Quasi-joint stiffness on the paretic side of patients with hemiparesis correlated positively with maximal ankle power (r=0.73, P<0.01) and gait speed (r=0.66, P<0.05). In contrast, quasi-joint stiffness in controls correlated negatively with maximal ankle power (r=-0.73, P<0.05) and gait speed (r=-0.76, P<0.05). Our findings suggested that ankle power during gait might be generated by increasing quasi-joint stiffness in patients with hemiparesis. In contrast, healthy individuals might decrease quasi-joint stiffness to avoid deceleration of the forward tilt of the tibia. Our findings might be useful for selecting treatment for increased ankle stiffness due to contracture and spasticity in patients with hemiparesis. Copyright © 2011 Elsevier Ltd. All rights reserved.
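
    The stiffness measure used here is simply the slope of a linear fit to the ankle moment-angle curve over the second rocker, which is a one-line computation; the gait data below are made up for illustration, standing in for motion-capture output.

    ```python
    import numpy as np

    # stand-in second-rocker data: dorsiflexion angle (rad) and ankle moment (N*m)
    angle_rad = np.linspace(0.02, 0.20, 50)
    moment_nm = 5 + 420 * angle_rad + np.random.default_rng(2).normal(0, 3, 50)

    stiffness, intercept = np.polyfit(angle_rad, moment_nm, 1)  # slope = quasi-joint stiffness
    print(f"quasi-joint stiffness ~ {stiffness:.0f} N*m/rad")
    ```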

  10. Likelihood-Ratio DIF Testing: Effects of Nonnormality

    ERIC Educational Resources Information Center

    Woods, Carol M.

    2008-01-01

    Differential item functioning (DIF) occurs when an item has different measurement properties for members of one group versus another. Likelihood-ratio (LR) tests for DIF based on item response theory (IRT) involve statistically comparing IRT models that vary with respect to their constraints. A simulation study evaluated how violation of the…

  11. Application of maximum likelihood methods to laser Thomson scattering measurements of low density plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washeleski, Robert L.; Meyer, Edmond J. IV; King, Lyon B.

    2013-10-15

    Laser Thomson scattering (LTS) is an established plasma diagnostic technique that has seen recent application to low density plasmas. It is difficult to perform LTS measurements when the scattered signal is weak as a result of low electron number density, poor optical access to the plasma, or both. Photon counting methods are often implemented in order to perform measurements in these low signal conditions. However, photon counting measurements performed with photo-multiplier tubes are time consuming and multi-photon arrivals are incorrectly recorded. In order to overcome these shortcomings a new data analysis method based on maximum likelihood estimation was developed. The key feature of this new data processing method is the inclusion of non-arrival events in determining the scattered Thomson signal. Maximum likelihood estimation and its application to Thomson scattering at low signal levels is presented, and the application of the new processing method to LTS measurements performed in the plume of a 2-kW Hall-effect thruster is discussed.

  12. Application of maximum likelihood methods to laser Thomson scattering measurements of low density plasmas.

    PubMed

    Washeleski, Robert L; Meyer, Edmond J; King, Lyon B

    2013-10-01

    Laser Thomson scattering (LTS) is an established plasma diagnostic technique that has seen recent application to low density plasmas. It is difficult to perform LTS measurements when the scattered signal is weak as a result of low electron number density, poor optical access to the plasma, or both. Photon counting methods are often implemented in order to perform measurements in these low signal conditions. However, photon counting measurements performed with photo-multiplier tubes are time consuming and multi-photon arrivals are incorrectly recorded. In order to overcome these shortcomings a new data analysis method based on maximum likelihood estimation was developed. The key feature of this new data processing method is the inclusion of non-arrival events in determining the scattered Thomson signal. Maximum likelihood estimation and its application to Thomson scattering at low signal levels is presented and application of the new processing method to LTS measurements performed in the plume of a 2-kW Hall-effect thruster is discussed.
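
    The effect of including non-arrival events can be shown with a toy photon-counting model: if a detector only registers "arrival vs no arrival" per shot, the Poisson rate is recovered from the fraction of empty shots as lambda = -ln(N_empty/N), whereas averaging the recorded arrivals undercounts multi-photon events. This is a sketch of the general principle with synthetic numbers, not the authors' full spectral analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    true_lambda = 0.8                          # mean photons per shot in one bin
    photons = rng.poisson(true_lambda, size=50_000)

    detected = photons > 0                     # PMT registers at most one event per shot
    naive = detected.mean()                    # biased: counts each detection as one photon
    mle = -np.log(1 - detected.mean())         # ML estimate using the non-arrival fraction

    print(f"naive {naive:.3f}  vs  MLE {mle:.3f}  (true {true_lambda})")
    ```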

  13. An assessment of finite-element modeling techniques for thick-solid/thin-shell joints analysis

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Androlake, S. G.

    1993-01-01

    The subject of finite-element modeling has long been of critical importance to the practicing designer/analyst, who is often faced with obtaining an accurate yet cost-effective structural analysis of a particular design; typically, these two goals are in conflict. The purpose here is to discuss finite-element modeling of solid/shell connections (joints), which are significant for the practicing modeler. Several approaches are currently in use, but various assumptions frequently restrict their applicability. Techniques currently used in practical applications were tested, especially to see which technique is best suited to the computer-aided design (CAD) environment. Some basic thoughts regarding each technique are also discussed. Based on the results, suggestions are given for obtaining reliable results in geometrically complex joints where the deformation and stress behavior are complicated.

  14. Femur rotation and patellofemoral joint kinematics: a weight-bearing magnetic resonance imaging analysis.

    PubMed

    Souza, Richard B; Draper, Christie E; Fredericson, Michael; Powers, Christopher M

    2010-05-01

    Controlled laboratory study using a cross-sectional design. To compare patellofemoral joint kinematics, femoral rotation, and patella rotation between females with patellofemoral pain (PFP) and pain-free controls using weight-bearing kinematic magnetic resonance imaging. Recently, it has been recognized that patellofemoral malalignment may be the result of femoral motion as opposed to patella motion. Fifteen females with PFP and 15 pain-free females between the ages of 18 and 45 years participated in this study. Kinematic imaging of the patellofemoral joint was performed using a vertically open magnetic resonance imaging system. Axial-oblique images were obtained using a fast gradient-echo pulse sequence. Images were acquired at a rate of 1 image per second while subjects performed a single-limb squat. Measures of femur and patella rotation (relative to the image field of view), lateral patella tilt, and lateral patella displacement were made from images obtained at 45, 30, 15, and 0 degrees of knee flexion. Group differences were assessed using a mixed-model analysis of variance with repeated measures. When compared to the control group, females with PFP demonstrated significantly greater lateral patella displacement at all angles evaluated and significantly greater lateral patella tilt at 30, 15, and 0 degrees of knee flexion. Similarly, greater medial femoral rotation was observed in the PFP group at 45, 15, and 0 degrees of knee flexion when compared to the control group. No group differences in patella rotation were found. Altered patellofemoral joint kinematics in females with PFP appears to be related to excessive medial femoral rotation, as opposed to lateral patella rotation. Our results suggest that the control of femur rotation may be important in restoring normal patellofemoral joint kinematics. J Orthop Sports Phys Ther 2010;40(5):277-285, Epub 12 March 2010. doi:10.2519/jospt.2010.3215.

  15. Modeling short duration extreme precipitation patterns using copula and generalized maximum pseudo-likelihood estimation with censoring

    NASA Astrophysics Data System (ADS)

    Bargaoui, Zoubeida Kebaili; Bardossy, Andràs

    2015-10-01

    The paper aims to develop research on the spatial variability of heavy rainfall estimation using spatial copula analysis. To demonstrate the methodology, short-time-resolution rainfall time series from the Stuttgart region are analyzed. They consist of rainfall observations on a continuous 30 min time scale recorded by a network of 17 rain gauges for the period July 1989-July 2004. The analysis is performed by aggregating the observations from 30 min up to 24 h. Two parametric bivariate extreme copula models, the Husler-Reiss model and the Gumbel model, are investigated; both involve a single parameter to be estimated, so model fitting is performed for every pair of stations at a given time resolution. A rainfall threshold value representing a fixed rainfall quantile is adopted for model inference. Generalized maximum pseudo-likelihood estimation is adopted with censoring, by analogy with methods of univariate estimation combining historical and paleoflood information with systematic data; only pairs of observations greater than the threshold are treated as systematic data. Using the estimated copula parameter, a synthetic copula field is randomly generated and helps evaluate model adequacy, which is assessed using the Kolmogorov-Smirnov distance test. In order to assess dependence or independence in the upper tail, the extremal coefficient, which characterises the tail of the joint bivariate distribution, is adopted; it is reported as a function of the interdistance between stations, and if it is less than 1.7, stations are interpreted as dependent in the extremes. The analysis of the fitted extremal coefficients with respect to station interdistance highlights two regimes with different dependence structures: a short spatial extent regime linked to short duration intervals (from 30 min to 6 h) with an extent of about 8 km, and a large spatial extent regime related to longer rainfall intervals (from 12 h to 24 h) with an
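
    A much-simplified stand-in for the censored pseudo-likelihood fit is to estimate the Gumbel copula parameter by inverting Kendall's tau (theta = 1/(1 - tau)) and then report the extremal coefficient 2^(1/theta), read against the 1.7 dependence threshold used above. The synthetic series and the tau-inversion shortcut below are assumptions for illustration only.

    ```python
    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(4)
    x = rng.gamma(2.0, 5.0, 2000)                  # stand-in 30 min rainfall, station A
    y = 0.6 * x + 0.4 * rng.gamma(2.0, 5.0, 2000)  # correlated stand-in, station B

    tau, _ = kendalltau(x, y)
    theta = 1.0 / (1.0 - tau)                      # Gumbel copula parameter (tau inversion)
    ext = 2.0 ** (1.0 / theta)                     # extremal coefficient, in [1, 2]
    print(f"tau={tau:.2f}  theta={theta:.2f}  extremal coefficient={ext:.2f}")
    # values below ~1.7 would be read as dependence in the extremes
    ```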

  16. The Atacama Cosmology Telescope: Likelihood for Small-Scale CMB Data

    NASA Technical Reports Server (NTRS)

    Dunkley, J.; Calabrese, E.; Sievers, J.; Addison, G. E.; Battaglia, N.; Battistelli, E. S.; Bond, J. R.; Das, S.; Devlin, M. J.; Dunner, R.; hide

    2013-01-01

    The Atacama Cosmology Telescope has measured the angular power spectra of microwave fluctuations to arcminute scales at frequencies of 148 and 218 GHz, from three seasons of data. At small scales the fluctuations in the primordial Cosmic Microwave Background (CMB) become increasingly obscured by extragalactic foregrounds and secondary CMB signals. We present results from a nine-parameter model describing these secondary effects, including the thermal and kinematic Sunyaev-Zel'dovich (tSZ and kSZ) power; the clustered and Poisson-like power from Cosmic Infrared Background (CIB) sources, and their frequency scaling; the tSZ-CIB correlation coefficient; the extragalactic radio source power; and thermal dust emission from Galactic cirrus in two different regions of the sky. In order to extract cosmological parameters, we describe a likelihood function for the ACT data, fitting this model to the multi-frequency spectra in the multipole range 500 < l < 10000. We extend the likelihood to include spectra from the South Pole Telescope at frequencies of 95, 150, and 220 GHz. Accounting for different radio source levels and Galactic cirrus emission, the same model provides an excellent fit to both datasets simultaneously, with χ2/dof = 675/697 for ACT, and 96/107 for SPT. We then use the multi-frequency likelihood to estimate the CMB power spectrum from ACT in bandpowers, marginalizing over the secondary parameters. This provides a simplified 'CMB-only' likelihood in the range 500 < l < 3500 for use in cosmological parameter estimation.
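
    The backbone of such a bandpower likelihood is a Gaussian chi-square in the binned spectra, -2 ln L = (d - m)^T C^{-1} (d - m) plus a constant. The sketch below uses stand-in arrays; the actual ACT likelihood evaluates the CMB-plus-secondary model at each parameter point and uses the full bandpower covariance.

    ```python
    import numpy as np

    def minus_two_lnlike(data_bp, model_bp, cov):
        # Gaussian bandpower chi-square: (d - m)^T C^-1 (d - m)
        r = data_bp - model_bp
        return r @ np.linalg.solve(cov, r)

    nb = 50
    rng = np.random.default_rng(8)
    cov = np.diag(rng.uniform(0.5, 2.0, nb))      # bandpower covariance (stand-in)
    model = rng.uniform(100, 1000, nb)            # binned model spectrum (stand-in)
    data = model + rng.multivariate_normal(np.zeros(nb), cov)
    print(minus_two_lnlike(data, model, cov))     # ~ nb for a well-fitting model
    ```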

  17. Static Analysis of Large-Scale Multibody System Using Joint Coordinates and Spatial Algebra Operator

    PubMed Central

    Omar, Mohamed A.

    2014-01-01

    Initial transient oscillations exhibited in the dynamic simulation responses of multibody systems can lead to inaccurate results, unrealistic load prediction, or simulation failure. These transients can result from incompatible initial conditions, initial constraint violations, and inadequate kinematic assembly. Performing a static equilibrium analysis before the dynamic simulation can eliminate these transients and lead to a stable simulation. Most existing multibody formulations determine the static equilibrium position by minimizing the system potential energy. This paper presents a new general-purpose approach for solving the static equilibrium of large-scale articulated multibody systems. The proposed approach introduces an energy drainage mechanism based on the Baumgarte constraint stabilization approach to determine the static equilibrium position. The spatial algebra operator is used to express the kinematic and dynamic equations of the closed-loop multibody system. The proposed multibody system formulation utilizes the joint coordinates and modal elastic coordinates as the system generalized coordinates. The recursive nonlinear equations of motion are formulated using the Cartesian coordinates and the joint coordinates to form an augmented set of differential-algebraic equations. The system connectivity matrix is then derived from the system topological relations and used to project the Cartesian quantities into the joint subspace, leading to a minimum set of differential equations. PMID:25045732

  18. Taking two to tango: fMRI analysis of improvised joint action with physical contact

    PubMed Central

    Belyk, Michel; Brown, Steven

    2018-01-01

    Many forms of joint action involve physical coupling between the participants, such as when moving a sofa together or dancing a tango. We report the results of a novel two-person functional MRI study in which trained couple dancers engaged in bimanual contact with an experimenter standing next to the bore of the magnet, and in which the two alternated between being the leader and the follower of joint improvised movements. Leading showed a general pattern of self-orientation, being associated with brain areas involved in motor planning, navigation, sequencing, action monitoring, and error correction. In contrast, following showed a far more sensory, externally-oriented pattern, revealing areas involved in somatosensation, proprioception, motion tracking, social cognition, and outcome monitoring. We also had participants perform a “mutual” condition in which the movement patterns were pre-learned and the roles were symmetric, thereby minimizing any tendency toward either leading or following. The mutual condition showed greater activity in brain areas involved in mentalizing and social reward than did leading or following. Finally, the analysis of improvisation revealed the dual importance of motor-planning and working-memory areas. We discuss these results in terms of theories of both joint action and improvisation. PMID:29324862

  19. Cosmological constraints from a joint analysis of cosmic growth and expansion

    NASA Astrophysics Data System (ADS)

    Moresco, M.; Marulli, F.

    2017-10-01

    Combining measurements of the expansion history of the Universe and of the growth rate of cosmic structures is key to discriminating between alternative cosmological frameworks and to testing gravity. Recently, Linder proposed a new diagram to investigate the joint evolutionary track of these two quantities. In this letter, we collect the most recent cosmic growth and expansion rate data sets to provide the state-of-the-art observational constraints on this diagram. By performing a joint statistical analysis of both probes, we test the standard Λ cold dark matter (ΛCDM) model, confirming a mild tension between cosmic microwave background predictions from the Planck mission and cosmic growth measurements at low redshift (z < 2). We then test alternative models allowing the variation of one cosmological parameter at a time. In particular, we find a larger growth index than the one predicted by general relativity, γ = 0.65^{+0.05}_{-0.04}. However, a standard model with a total neutrino mass of 0.26 ± 0.10 eV provides a similarly accurate description of the current data. By simulating an additional data set consistent with next-generation dark-energy mission forecasts, we show that growth rate constraints at z > 1 will be crucial to discriminate between alternative models.
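
    The growth index enters through the parameterization f(z) = Omega_m(z)^gamma. A quick numerical comparison of the general-relativity value gamma ~ 0.55 with the fitted 0.65, for an assumed flat ΛCDM background with Omega_m0 = 0.3 (both background values are illustrative assumptions), looks as follows.

    ```python
    import numpy as np

    def omega_m(z, om0=0.3):
        e2 = om0 * (1 + z) ** 3 + (1 - om0)   # flat LCDM: E^2(z) = H^2(z)/H0^2
        return om0 * (1 + z) ** 3 / e2

    z = np.linspace(0, 2, 5)
    for gamma in (0.55, 0.65):                # GR prediction vs the fitted value
        print(gamma, np.round(omega_m(z) ** gamma, 3))   # growth rate f(z)
    ```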

  20. Effect of Reduced Stiffness Dance Flooring on Lower Extremity Joint Angular Trajectories During a Ballet Jump.

    PubMed

    Hackney, James; Brummel, Sara; Newman, Mary; Scott, Shannon; Reinagel, Matthew; Smith, Jennifer

    2015-09-01

    We carried out a study to investigate how low stiffness flooring may help prevent overuse injuries of the lower extremity in dancers. It was hypothesized that performing a ballet jump (sauté) on a reduced stiffness dance floor would decrease maximum joint flexion angles and negative angular velocities at the hips, knees, or ankles compared to performing the same jump on a harder floor. The participants were 15 young adult female dancers (age range 18 to 28, mean = 20.89 ± 2.93 years) with at least 5 years of continuous ballet experience and without history of serious lower body injury, surgery, or recent pain. They performed sautés on a (low stiffness) Harlequin® WoodSpring Floor and on a vinyl-covered hardwood on concrete floor. Maximum joint flexion angles and negative velocities at bilateral hips, knees, and ankles were measured with the "Ariel Performance Analysis System" (APAS). Paired one-tailed t-tests yielded significant decreases in maximum knee angle (average decrease = 3.4° ± 4.2°, p = 0.026) and negative angular velocity of the ankles (average decrease = 18.7°/sec ± 27.9°/sec, p = 0.009) with low stiffness flooring. If the knee angle is less acute, then the length of the external knee flexion moment arm will also be shorter and result in a smaller external knee flexion moment, given an equal landing force. Also, high velocities of eccentric muscle contraction, which are necessary to control negative angular velocity of the ankle joint, are associated with higher risk of musculotendinous injury. Hence, our findings indicate that reduced floor stiffness may indeed help decrease the likelihood of lower extremity injuries.

  1. No evidence for the use of DIR, D–D fusions, chromosome 15 open reading frames or VH replacement in the peripheral repertoire was found on application of an improved algorithm, JointML, to 6329 human immunoglobulin H rearrangements

    PubMed Central

    Ohm-Laursen, Line; Nielsen, Morten; Larsen, Stine R; Barington, Torben

    2006-01-01

    Antibody diversity is created by imprecise joining of the variability (V), diversity (D) and joining (J) gene segments of the heavy and light chain loci. Analysis of rearrangements is complicated by somatic hypermutations and uncertainty concerning the sources of gene segments and the precise way in which they recombine. It has been suggested that D genes with irregular recombination signal sequences (DIR) and chromosome 15 open reading frames (OR15) can replace conventional D genes, that two D genes or inverted D genes may be used and that the repertoire can be further diversified by heavy chain V gene (VH) replacement. Safe conclusions require large, well-defined sequence samples and algorithms minimizing stochastic assignment of segments. Two computer programs were developed for analysis of heavy chain joints. JointHMM is a profile hidden Markov model, while JointML is a maximum-likelihood-based method taking the length of the joint and the mutational status of the VH gene into account. The programs were applied to a set of 6329 clonally unrelated rearrangements. A conventional D gene was found in 80% of unmutated sequences and 64% of mutated sequences, while D-gene assignment was kept below 5% in artificial (randomly permuted) rearrangements. No evidence for the use of DIR, OR15, multiple D genes or VH replacements was found, while inverted D genes were used in less than 1‰ of the sequences. JointML was shown to have a higher predictive performance for D-gene assignment in mutated and unmutated sequences than four other publicly available programs. An online version 1.0 of JointML is available at http://www.cbs.dtu.dk/services/VDJsolver. PMID:17005006

  2. A novel hybrid joining methodology for composite to steel joints

    NASA Astrophysics Data System (ADS)

    Sarh, Bastian

    This research has established a novel approach for designing, analyzing, and fabricating load-bearing structural connections between resin-infused composite materials and components made of steel or other metals or alloys. A design philosophy is proposed wherein overlapping joint sections comprised of fiber-reinforced plastics (FRPs) and steel members are connected via a combination of adhesive bonding and integrally placed composite pins. A film adhesive is placed into the dry stack prior to resin infusion and is cured after infusion either through local heating elements or by placing the structure into an oven. The novel manner in which the composite pins are introduced consists of perforating the steel member with holes and placing pre-formed composite pins through them, also prior to resin infusion of the composite section. In this manner the joints are co-molded structures and secondary processing is eliminated. It is shown that such joints blend the structural benefits of adhesively and mechanically connected joints, and that the fabrication process is feasible for low-cost, large-scale production as applicable to the shipbuilding industry. Analysis procedures used for designing such joints are presented, consisting of an adhesive joint design theory and a pin placement theory. These analysis tools are used in the design of specimens; specific designs are fabricated and then evaluated through structural tests, including quasi-static loading and low-cycle fatigue evaluation. This research has thereby introduced a novel philosophy on joints, created the manufacturing technique for fabricating them, established simple-to-apply analysis procedures for their design (consisting of both an adhesive and a pin placement analysis), and validated the methodology through specimen fabrication and testing.

  3. Digital tomosynthesis rendering of joint margins for arthritis assessment

    NASA Astrophysics Data System (ADS)

    Duryea, Jeffrey W.; Neumann, Gesa; Yoshioka, Hiroshi; Dobbins, James T., III

    2004-05-01

    PURPOSE: Rheumatoid arthritis (RA) of the hand is a significant healthcare problem. Techniques to accurately quantify the structural changes from RA are crucial for the development and prescription of therapies. Analysis of radiographic joint space width (JSW) is widely used and has demonstrated promise. However, radiography presents only a 2D view of the joint. In this study we performed tomosynthesis reconstructions of proximal interphalangeal (PIP) and metacarpophalangeal (MCP) joints to measure the 3D joint structure. METHODS: We performed a reader study using simulated radiographs of 12 MCP and 12 PIP joints from skeletal specimens imaged with micro-CT. The tomosynthesis technique provided images of reconstructed planes with 0.75 mm spacing, which were presented to 2 readers with a computer tool. The readers were instructed to delineate the joint surfaces on tomosynthetic slices where they could visualize the margins. We performed a quantitative analysis of 5 slices surrounding the central portion of each joint. Reader-determined JSW was compared to a gold standard. As a figure of merit we calculated the average root-mean-square deviation (RMSD). RESULTS: Overall RMSD was 0.22 mm. For the individual joint types, RMSD was 0.18 mm (MCP) and 0.26 mm (PIP). The reduced performance for the smaller PIP joints suggests that a slice spacing of less than 0.75 mm may be more appropriate. CONCLUSIONS: We have demonstrated the capability of limited 3D rendering of joint surfaces using digital tomosynthesis. This technique promises to provide an improved method to visualize the structural changes of RA.
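
    The figure of merit is simple to compute; a minimal sketch with hypothetical JSW measurements (in mm) for the five central slices of one joint:

      import numpy as np

      # Hypothetical reader-measured vs gold-standard JSW (mm), 5 central slices.
      reader_jsw = np.array([1.82, 1.55, 1.90, 1.47, 1.66])
      gold_jsw   = np.array([1.70, 1.62, 1.75, 1.58, 1.71])

      rmsd = np.sqrt(np.mean((reader_jsw - gold_jsw) ** 2))
      print(f"RMSD = {rmsd:.2f} mm")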

  4. Selected aspects of prior and likelihood information for a Bayesian classifier in a road safety analysis.

    PubMed

    Nowakowska, Marzena

    2017-04-01

    The development of a Bayesian logistic regression model classifying road accident severity is discussed. Informative priors already exploited in the literature (method of moments, maximum likelihood estimation, and two-stage Bayesian updating), along with an original Boot prior proposal, are investigated for the case where no expert opinion is available. In addition, two possible approaches to updating the priors, in the form of unbalanced and balanced training data sets, are presented. The obtained Bayesian logistic models are assessed on the basis of the deviance information criterion (DIC), highest probability density (HPD) intervals, and coefficients of variation estimated for the model parameters. The verification of model accuracy is based on sensitivity, specificity and the harmonic mean of sensitivity and specificity, all calculated from a test data set. The models obtained from the balanced training data set have better classification quality than those obtained from the unbalanced training data set. The two-stage Bayesian updating prior model and the Boot prior model, both identified with the use of the balanced training data set, outperform the non-informative, method-of-moments, and maximum likelihood estimation prior models. It is important to note that one should be careful when interpreting the parameters, since different priors can lead to different models. Copyright © 2017 Elsevier Ltd. All rights reserved.
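
    The verification measures named above are simple functions of the test-set confusion matrix; a minimal sketch with hypothetical labels and predictions (1 = severe accident):

      import numpy as np

      # Hypothetical test-set labels and classifier predictions.
      y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 1])
      y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0, 0, 1])

      tp = np.sum((y_true == 1) & (y_pred == 1))
      tn = np.sum((y_true == 0) & (y_pred == 0))
      fp = np.sum((y_true == 0) & (y_pred == 1))
      fn = np.sum((y_true == 1) & (y_pred == 0))

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      harmonic = 2 * sensitivity * specificity / (sensitivity + specificity)
      print(f"sens = {sensitivity:.2f}, spec = {specificity:.2f}, H = {harmonic:.2f}")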

  5. Likelihood testing of seismicity-based rate forecasts of induced earthquakes in Oklahoma and Kansas

    USGS Publications Warehouse

    Moschetti, Morgan P.; Hoover, Susan M.; Mueller, Charles

    2016-01-01

    Likelihood testing of induced earthquakes in Oklahoma and Kansas has identified the parameters that optimize the forecasting ability of smoothed seismicity models and quantified the recent temporal stability of the spatial seismicity patterns. Use of the most recent 1-year period of earthquake data and use of 10–20-km smoothing distances produced the greatest likelihood. The likelihood that the locations of January–June 2015 earthquakes were consistent with optimized forecasts decayed with increasing elapsed time between the catalogs used for model development and testing. Likelihood tests with two additional sets of earthquakes from 2014 exhibit a strong sensitivity of the rate of decay to the smoothing distance. Marked reductions in likelihood are caused by the nonstationarity of the induced earthquake locations. Our results indicate a multiple-fold benefit from smoothed seismicity models in developing short-term earthquake rate forecasts for induced earthquakes in Oklahoma and Kansas, relative to the use of seismic source zones.
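
    A toy version of this testing workflow, with synthetic epicenters standing in for the catalogs: past events are smoothed onto a grid with a Gaussian kernel, normalized into a rate forecast, and later events are scored with a joint Poisson log-likelihood over grid cells.

      import numpy as np
      from scipy.special import gammaln

      rng = np.random.default_rng(0)
      past = rng.uniform(0, 100, size=(200, 2))   # training epicenters (km), synthetic
      test = rng.uniform(0, 100, size=(50, 2))    # later events to score, synthetic

      cell = 5.0                                  # grid spacing (km)
      centers = np.arange(0, 100, cell) + cell / 2
      gx, gy = np.meshgrid(centers, centers)

      def forecast(sigma):
          """Expected counts per cell from Gaussian-kernel smoothing of past events."""
          d2 = (gx[..., None] - past[:, 0])**2 + (gy[..., None] - past[:, 1])**2
          rate = np.exp(-d2 / (2 * sigma**2)).sum(axis=-1)
          return rate / rate.sum() * len(test)    # scale to the test-period total

      def poisson_loglik(rate):
          """Joint Poisson log-likelihood of observed per-cell counts under rate."""
          ix = (test[:, 0] // cell).astype(int)
          iy = (test[:, 1] // cell).astype(int)
          counts = np.zeros_like(rate)
          np.add.at(counts, (iy, ix), 1)
          return np.sum(counts * np.log(rate + 1e-300) - rate - gammaln(counts + 1))

      for sigma in (5.0, 10.0, 20.0):             # candidate smoothing distances (km)
          print(f"sigma = {sigma:4.1f} km, logL = {poisson_loglik(forecast(sigma)):.1f}")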

  6. A maximum pseudo-profile likelihood estimator for the Cox model under length-biased sampling

    PubMed Central

    Huang, Chiung-Yu; Qin, Jing; Follmann, Dean A.

    2012-01-01

    This paper considers semiparametric estimation of the Cox proportional hazards model for right-censored and length-biased data arising from prevalent sampling. To exploit the special structure of length-biased sampling, we propose a maximum pseudo-profile likelihood estimator, which can handle time-dependent covariates and is consistent under covariate-dependent censoring. Simulation studies show that the proposed estimator is more efficient than its competitors. A data analysis illustrates the methods and theory. PMID:23843659

  7. Joint Intelligence Analysis Complex: DOD Needs to Fully Incorporate Best Practices into Future Cost Estimates

    DTIC Science & Technology

    2016-11-01

    Report to Congressional Requesters, November 2016, GAO-17-29, United States Government Accountability Office. JOINT INTELLIGENCE ANALYSIS COMPLEX: DOD ...of scope, according to DOD and Air Force officials. However, without fully accounting for life-cycle costs, management may have difficulty

  8. Proper joint analysis of summary association statistics requires the adjustment of heterogeneity in SNP coverage pattern.

    PubMed

    Zhang, Han; Wheeler, William; Song, Lei; Yu, Kai

    2017-07-07

    As meta-analysis results published by consortia of genome-wide association studies (GWASs) become increasingly available, many association summary statistics-based multi-locus tests have been developed to jointly evaluate multiple single-nucleotide polymorphisms (SNPs) to reveal novel genetic architectures of various complex traits. The validity of these approaches relies on the accurate estimate of z-score correlations at considered SNPs, which in turn requires knowledge on the set of SNPs assessed by each study participating in the meta-analysis. However, this exact SNP coverage information is usually unavailable from the meta-analysis results published by GWAS consortia. In the absence of the coverage information, researchers typically estimate the z-score correlations by making oversimplified coverage assumptions. We show through real studies that such a practice can generate highly inflated type I errors, and we demonstrate the proper way to incorporate correct coverage information into multi-locus analyses. We advocate that consortia should make SNP coverage information available when posting their meta-analysis results, and that investigators who develop analytic tools for joint analyses based on summary data should pay attention to the variation in SNP coverage and adjust for it appropriately. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.

  9. Profile-Likelihood Approach for Estimating Generalized Linear Mixed Models with Factor Structures

    ERIC Educational Resources Information Center

    Jeon, Minjeong; Rabe-Hesketh, Sophia

    2012-01-01

    In this article, the authors suggest a profile-likelihood approach for estimating complex models by maximum likelihood (ML) using standard software and minimal programming. The method works whenever setting some of the parameters of the model to known constants turns the model into a standard model. An important class of models that can be…
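
    The mechanics of profiling are easy to show on a toy example; the sketch below profiles the mean of a normal sample (not the article's GLMM setting) by fixing the mean and maximizing the likelihood over the remaining parameter at each grid point.

      import numpy as np
      from scipy.optimize import minimize_scalar
      from scipy.stats import norm

      rng = np.random.default_rng(1)
      x = rng.normal(loc=2.0, scale=1.5, size=100)   # toy data

      def profile_loglik(mu):
          """Maximize the log-likelihood over sigma with mu held fixed."""
          neg = lambda log_s: -np.sum(norm.logpdf(x, mu, np.exp(log_s)))
          return -minimize_scalar(neg).fun

      grid = np.linspace(1.5, 2.5, 41)
      prof = np.array([profile_loglik(m) for m in grid])
      # Approximate 95% CI: keep mu whose profile log-likelihood lies within
      # chi2_1(0.95)/2 = 1.92 of the maximum.
      ci = grid[prof > prof.max() - 1.92]
      print(f"profile MLE ~ {grid[np.argmax(prof)]:.2f}, "
            f"95% CI ~ [{ci.min():.2f}, {ci.max():.2f}]")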

  10. Cross-Population Joint Analysis of eQTLs: Fine Mapping and Functional Annotation

    PubMed Central

    Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Mapping expression quantitative trait loci (eQTLs) has been shown to be a powerful tool to uncover the genetic underpinnings of many complex traits at the molecular level. In this paper, we present an integrative analysis approach that leverages eQTL data collected from multiple population groups. In particular, our approach effectively identifies multiple independent cis-eQTL signals that are consistent across populations, accounting for population heterogeneity in allele frequencies and linkage disequilibrium patterns. Furthermore, by integrating genomic annotations, our analysis framework enables high-resolution functional analysis of eQTLs. We applied our statistical approach to analyze the GEUVADIS data consisting of samples from five population groups. From this analysis, we concluded that (i) joint analysis across population groups greatly improves the power of eQTL discovery and the resolution of fine mapping of causal eQTLs; (ii) many genes harbor multiple independent eQTLs in their cis regions; and (iii) genetic variants that disrupt transcription factor binding are significantly enriched in eQTLs (p-value = 4.93 × 10⁻²²). PMID:25906321

  11. A sensitivity analysis method for the body segment inertial parameters based on ground reaction and joint moment regressor matrices.

    PubMed

    Futamure, Sumire; Bonnet, Vincent; Dumas, Raphael; Venture, Gentiane

    2017-11-07

    This paper presents a method allowing a simple and efficient sensitivity analysis of the dynamic parameters of a complex whole-body human model. The proposed method is based on the ground reaction and joint moment regressor matrices, developed initially in robotics system identification theory, which appear in the equations of motion of the human body. The regressor matrices are linear with respect to the segment inertial parameters, allowing simple sensitivity analysis methods to be used. The sensitivity analysis method was applied to gait dynamics and kinematics data of nine subjects with a 15-segment 3D model of the locomotor apparatus. According to the proposed sensitivity indices, 76 of the 150 segment inertial parameters of the mechanical model were considered not influential for gait. The main findings were that the segment masses were influential and that, with the exception of the trunk, the moments of inertia were not influential for the computation of the ground reaction forces and moments and the joint moments. The same method also shows numerically that at least 90% of the lower-limb joint moments during the stance phase can be estimated from force-plate and kinematics data alone, without knowing any of the segment inertial parameters. Copyright © 2017 Elsevier Ltd. All rights reserved.
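
    The property the method rests on is that the equations of motion are linear in the inertial parameters, tau = Y(q, dq, ddq) * phi, so per-parameter influence can be read off the regressor columns. A sketch with a random stand-in regressor (in practice Y comes from the whole-body model and the recorded gait data) and a made-up relative-contribution index:

      import numpy as np

      rng = np.random.default_rng(2)
      n_samples, n_params = 500, 150              # e.g. 10 parameters x 15 segments
      Y = rng.normal(size=(n_samples, n_params))  # stand-in stacked regressor matrix
      phi = rng.uniform(0.1, 5.0, size=n_params)  # nominal inertial parameters

      tau = Y @ phi                               # predicted moments/forces over gait
      contrib = np.abs(Y) * phi                   # per-sample share of parameter j
      index = contrib.sum(axis=0) / np.abs(tau).sum()

      # Toy cutoff: parameters whose relative contribution stays below 1% are
      # treated as non-influential for the computed moments.
      print(f"{np.sum(index <= 0.01)} of {n_params} parameters non-influential")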

  12. Numerical Experimentation with Maximum Likelihood Identification in Static Distributed Systems

    NASA Technical Reports Server (NTRS)

    Scheid, R. E., Jr.; Rodriguez, G.

    1985-01-01

    Many important issues in the control of large space structures are intimately related to the fundamental problem of parameter identification. Since no sensor system is perfect, one must also ask how well the identification process can be carried out in the presence of noisy data. With these considerations in mind, the algorithms herein are designed to treat both uncertainties in the modeling and uncertainties in the data. The analytical aspects of maximum likelihood identification are considered in some detail in another paper. Here the questions relevant to implementing these schemes are addressed, particularly as they apply to models of large space structures. The emphasis is on the influence of the infinite-dimensional character of the problem on finite-dimensional implementations of the algorithms. Areas of current and future analysis are highlighted that indicate the interplay between error analysis and possible truncations of the state and parameter spaces.

  13. A Maximum-Likelihood Approach to Force-Field Calibration.

    PubMed

    Zaborowski, Bartłomiej; Jagieła, Dawid; Czaplewski, Cezary; Hałabis, Anna; Lewandowska, Agnieszka; Żmudzińska, Wioletta; Ołdziej, Stanisław; Karczyńska, Agnieszka; Omieczynski, Christian; Wirecki, Tomasz; Liwo, Adam

    2015-09-28

    A new approach to the calibration of force fields is proposed, in which the force-field parameters are obtained by maximum-likelihood fitting of the calculated conformational ensembles to the experimental ensembles of the training system(s). The maximum-likelihood function is composed of logarithms of the Boltzmann probabilities of the experimental conformations, calculated with the current energy function. Because the theoretical distribution is given only in the form of simulated conformations, the contributions from all of the simulated conformations, with Gaussian weights in the distances from a given experimental conformation, are added to give the contribution of that conformation to the target function. In contrast to earlier methods for force-field calibration, the approach does not suffer from the arbitrariness of dividing the decoy set into native-like and non-native structures; however, if such a division is made instead of using Gaussian weights, application of the maximum-likelihood method results in the well-known energy-gap maximization. The computational procedure consists of cycles of decoy generation and maximum-likelihood-function optimization, which are iterated until convergence is reached. The method was tested with Gaussian distributions and then applied to the physics-based coarse-grained UNRES force field for proteins. The NMR structures of the tryptophan cage, a small α-helical protein, determined at three temperatures (T = 280, 305, and 313 K) by Hałabis et al. (J. Phys. Chem. B 2012, 116, 6898–6907), were used. Multiplexed replica-exchange molecular dynamics was used to generate the decoys. The iterative procedure exhibited steady convergence. Three variants of optimization were tried: optimization of the energy-term weights alone and use of the experimental ensemble of the folded protein only at T = 280 K (run 1); optimization of the energy-term weights and use of experimental ensembles at all three temperatures (run 2
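
    The target function has a compact form: each experimental conformation contributes the logarithm of a Gaussian-weighted sum of decoy Boltzmann probabilities. A sketch with synthetic decoys, assuming (as in energy-term-weight fits) an energy linear in the weights:

      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import logsumexp

      # Synthetic stand-ins: per-decoy energy components and decoy-to-experimental
      # distances. Real inputs would come from MD decoys and NMR ensembles.
      rng = np.random.default_rng(3)
      kT = 0.6                                          # arbitrary energy units
      n_decoys, n_terms, n_exp = 400, 4, 25
      terms = rng.normal(size=(n_decoys, n_terms))      # energy components per decoy
      dist = rng.uniform(0, 8, size=(n_exp, n_decoys))  # distance to each exp. conf.
      sigma = 1.5                                       # Gaussian weight width

      def neg_loglik(w):
          e = terms @ w                                 # decoy energies for weights w
          logp = -e / kT - logsumexp(-e / kT)           # log Boltzmann probabilities
          log_gauss = -dist**2 / (2 * sigma**2)         # Gaussian weights, log space
          # log-likelihood of experimental conformation i: log sum_k g_ik * p_k
          return -logsumexp(log_gauss + logp, axis=1).sum()

      res = minimize(neg_loglik, x0=np.ones(n_terms))
      print("fitted energy-term weights:", np.round(res.x, 2))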

  14. Factors Associated With the Likelihood of Hospitalization Following Emergency Department Visits for Behavioral Health Conditions.

    PubMed

    Hamilton, Jane E; Desai, Pratikkumar V; Hoot, Nathan R; Gearing, Robin E; Jeong, Shin; Meyer, Thomas D; Soares, Jair C; Begley, Charles E

    2016-11-01

    Behavioral health-related emergency department (ED) visits have been linked with ED overcrowding, an increased demand on limited resources, and a longer length of stay (LOS) due in part to patients being admitted to the hospital but waiting for an inpatient bed. This study examines factors associated with the likelihood of hospital admission for ED patients with behavioral health conditions at 16 hospital-based EDs in a large urban area in the southern United States. Using Andersen's Behavioral Model of Health Service Use for guidance, the study examined the relationship between predisposing (characteristics of the individual, i.e., age, sex, race/ethnicity), enabling (system or structural factors affecting healthcare access), and need (clinical) factors and the likelihood of hospitalization following ED visits for behavioral health conditions (n = 28,716 ED visits). In the adjusted analysis, a logistic fixed-effects model with blockwise entry was used to estimate the relative importance of predisposing, enabling, and need variables added separately as blocks while controlling for variation in unobserved hospital-specific practices across hospitals and time in years. Significant predisposing factors associated with an increased likelihood of hospitalization following an ED visit included increasing age, while African American race was associated with a lower likelihood of hospitalization. Among enabling factors, arrival by emergency transport and a longer ED LOS were associated with a greater likelihood of hospitalization while being uninsured and the availability of community-based behavioral health services within 5 miles of the ED were associated with lower odds. Among need factors, having a discharge diagnosis of schizophrenia/psychotic spectrum disorder, an affective disorder, a personality disorder, dementia, or an impulse control disorder as well as secondary diagnoses of suicidal ideation and/or suicidal behavior increased the likelihood of hospitalization

  15. Extended maximum likelihood halo-independent analysis of dark matter direct detection data

    DOE PAGES

    Gelmini, Graciela B.; Georgescu, Andreea; Gondolo, Paolo; ...

    2015-11-24

    We extend and correct a recently proposed maximum-likelihood halo-independent method to analyze unbinned direct dark matter detection data. Instead of the recoil energy as independent variable we use the minimum speed a dark matter particle must have to impart a given recoil energy to a nucleus. This has the advantage of allowing us to apply the method to any type of target composition and interaction, e.g. with general momentum and velocity dependence, and with elastic or inelastic scattering. We prove the method and provide a rigorous statistical interpretation of the results. As first applications, we find that for dark matter particles with elastic spin-independent interactions and neutron-to-proton coupling ratio fn/fp = −0.7, the WIMP interpretation of the signal observed by CDMS-II-Si is compatible with the constraints imposed by all other experiments with null results. We also find a similar compatibility for exothermic inelastic spin-independent interactions with fn/fp = −0.8.
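
    The change of variable at the heart of the method is the standard kinematic relation between recoil energy and minimum speed; a sketch (not code from the paper), with a mass splitting delta covering the inelastic case:

      import numpy as np

      C = 2.99792458e5                            # speed of light (km/s)

      def v_min(E_R_keV, m_N_GeV, m_chi_GeV, delta_keV=0.0):
          """Minimum WIMP speed (km/s) that can impart recoil energy E_R to a
          nucleus of mass m_N; delta is the inelastic splitting (0 = elastic)."""
          mu = m_N_GeV * m_chi_GeV / (m_N_GeV + m_chi_GeV)   # reduced mass (GeV)
          E_R = E_R_keV * 1e-6                               # keV -> GeV
          delta = delta_keV * 1e-6
          return C * abs(m_N_GeV * E_R / mu + delta) / np.sqrt(2 * m_N_GeV * E_R)

      # e.g. a 9 GeV WIMP scattering elastically on a silicon nucleus (~26.2 GeV):
      print(f"v_min = {v_min(3.0, 26.2, 9.0):.0f} km/s")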

  16. Early rheumatoid disease. II. Patterns of joint involvement.

    PubMed Central

    Fleming, A; Benn, R T; Corbett, M; Wood, P H

    1976-01-01

    Data from the first research clinic visit (Fleming and others, 1976) have been subjected to factor analysis to identify early patterns of joint involvement. Nine patterns emerged. Two patterns, if present early, were found to have prognostic significance. An eventually more severe disease was associated with a pattern of large joint involvement (shoulder, elbow, wrist, knee) and a pattern based on metatarsophalangeal joints I and III. PMID:970995

  17. Computing Maximum Likelihood Estimates of Loglinear Models from Marginal Sums with Special Attention to Loglinear Item Response Theory.

    ERIC Educational Resources Information Center

    Kelderman, Henk

    1992-01-01

    Describes algorithms used in the computer program LOGIMO for obtaining maximum likelihood estimates of the parameters in loglinear models. These algorithms are also useful for the analysis of loglinear item-response theory models. Presents modified versions of the iterative proportional fitting and Newton-Raphson algorithms. Simulated data…
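
    Iterative proportional fitting is short enough to show whole; the sketch below fits the independence loglinear model to a made-up 2×3 contingency table by matching one set of margins at a time.

      import numpy as np

      observed = np.array([[25.0, 15.0, 10.0],
                           [ 5.0, 20.0, 25.0]])  # made-up 2x3 table
      fitted = np.ones_like(observed)

      for _ in range(100):                        # far more cycles than needed here
          fitted *= (observed.sum(axis=1, keepdims=True)
                     / fitted.sum(axis=1, keepdims=True))   # match row margins
          fitted *= (observed.sum(axis=0, keepdims=True)
                     / fitted.sum(axis=0, keepdims=True))   # match column margins

      # For independence, IPF reproduces the closed-form row*column/total MLE:
      expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
      print(np.allclose(fitted, expected))        # True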

  18. Analysis of Cracking in Jointed Plain Concrete Pavements

    DOT National Transportation Integrated Search

    2017-03-01

    This paper investigates the trends of longitudinal and transverse cracking in jointed concrete pavements based on Long-Term Pavement Performance (LTPP) Program Strategic Study of Structural Factors for Rigid Pavements (SPS-2) data. The impacts of sla...

  19. The Standard Joint Unit.

    PubMed

    Casajuana Kögel, Cristina; Balcells-Olivero, María Mercedes; López-Pelayo, Hugo; Miquel, Laia; Teixidó, Lídia; Colom, Joan; Nutt, David John; Rehm, Jürgen; Gual, Antoni

    2017-07-01

    Reliable data on cannabis quantities are required to improve the assessment of cannabis consumption for epidemiological analysis and clinical assessment; consequently, a Standard Joint Unit (SJU) based on the quantity of Δ9-tetrahydrocannabinol (Δ9-THC) has been established. Naturalistic study of a convenience sample recruited from February 2015 to June 2016 in universities, leisure spaces, mental health services and cannabis clubs in Barcelona. Adults reporting cannabis use in the last 60 days, without cognitive impairment or language barriers, answered a questionnaire on cannabis use and were asked to donate a joint so that its Δ9-THC and cannabidiol (CBD) content could be determined. 492 participants donated 315 valid joints. Donors were on average 29 years old, mostly men (77%), single (75%), with at least secondary studies (73%) and in active employment (63%). Marijuana joints (N = 232) contained a median of 6.56 mg of Δ9-THC (interquartile range, IQR = 10.22) and 0.02 mg of CBD (IQR = 0.02); hashish joints (N = 83) contained a median of 7.94 mg of Δ9-THC (IQR = 10.61) and 3.24 mg of CBD (IQR = 3.21). Participants rolled 4 joints per gram of cannabis and paid 5€ per gram (median values). The consistent Δ9-THC content of joints led to an SJU of 7 mg of Δ9-THC, the integer closest to the median values shared by both cannabis types. Whether marijuana or hashish, 1 SJU = 1 joint = 0.25 g of cannabis = 7 mg of Δ9-THC. For CBD, only hashish SJUs contained relevant levels. Similarly to the Standard Drink Unit for alcohol, the SJU is useful for clinical, epidemiological and research purposes. Copyright © 2017 Elsevier B.V. All rights reserved.
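
    The conversions implied by the SJU are trivial to encode; the consumption figure below is hypothetical.

      # 1 SJU = 1 joint = 0.25 g of cannabis = 7 mg of Delta-9-THC.
      GRAMS_PER_JOINT = 0.25
      THC_MG_PER_SJU = 7.0

      def grams_to_sju(grams):
          return grams / GRAMS_PER_JOINT

      def sju_to_thc_mg(sju):
          return sju * THC_MG_PER_SJU

      weekly_grams = 3.5                          # hypothetical self-reported use
      sju = grams_to_sju(weekly_grams)
      print(f"{weekly_grams} g/week ~ {sju:.0f} SJU ~ {sju_to_thc_mg(sju):.0f} mg THC")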

  20. Joint venture versus outreach: a financial analysis of case studies.

    PubMed

    Forsman, R W

    2001-01-01

    Medical centers across the country are facing cost challenges, and national commercial laboratories are experiencing financial declines that necessitate their capturing market share in any way possible. Many laboratories are turning to joint ventures or partnerships for financial relief. However, it often is in the best interest of the patient and the medical center to integrate laboratory services across the continuum of care. This article analyzes two hypothetical joint ventures involving a laboratory management agreement and full laboratory outsourcing.