Sample records for liability-normal mixture model

  1. Hepatic 3D spheroid models for the detection and study of compounds with cholestatic liability

    PubMed Central

    Hendriks, Delilah F. G.; Fredriksson Puigvert, Lisa; Messner, Simon; Moritz, Wolfgang; Ingelman-Sundberg, Magnus

    2016-01-01

    Drug-induced cholestasis (DIC) is poorly understood and its preclinical prediction is mainly limited to assessing the compound’s potential to inhibit the bile salt export pump (BSEP). Here, we evaluated two 3D spheroid models, one from primary human hepatocytes (PHH) and one from HepaRG cells, for the detection of compounds with cholestatic liability. By repeatedly co-exposing both models for 8 days to a non-toxic concentrated bile acid (BA) mixture together with a set of compounds with different mechanisms of hepatotoxicity, we observed selective synergistic toxicity between the BA mixture and compounds known to cause cholestatic or mixed cholestatic/hepatocellular toxicity, compared with exposure to the compounds alone; this synergism was more pronounced when the exposure time was extended to 14 days. In contrast, no such synergism was observed after both 8 and 14 days of exposure to the BA mixture for compounds that cause non-cholestatic hepatotoxicity. Mechanisms behind the toxicity of the cholestatic compound chlorpromazine were accurately detected in both spheroid models, including intracellular BA accumulation, inhibition of ABCB11 expression and disruption of the F-actin cytoskeleton. Furthermore, the observed synergistic toxicity of chlorpromazine and BA was associated with increased oxidative stress and modulation of death receptor signalling. Combined, our results demonstrate that the hepatic spheroid models presented here can be used to detect and study compounds with cholestatic liability. PMID:27759057

  2. Clinical multiple sclerosis occurs at one end of a spectrum of CNS pathology: a modified threshold liability model leads to new ways of thinking about the cause of clinical multiple sclerosis.

    PubMed

    Haegert, David G

    2005-01-01

    Multiple sclerosis (MS) is a complex trait, the causes of which are elusive. A threshold liability model influences thinking about the causes of this disorder. According to this model, a population has a normal distribution of genetic liability to MS. In addition, a threshold exists, so that MS begins when an individual's liability exceeds the MS threshold; environmental and other causative factors may increase or decrease an individual's MS liability. It is argued here, however, that this model is misleading, as it is based on the incorrect assumption that MS is a disorder that one either has or does not have. This paper hypothesizes, instead, that patients with a diagnosis of MS share identical CNS pathology, termed MS pathology, with some individuals who have a diagnosis of possible MS and with some apparently healthy individuals, who may never have a diagnosis of MS. In order to accommodate this hypothesis, the current threshold liability model is modified as follows. (1) In addition to a normal distribution of MS liability within a population, a spectrum of MS pathology occurs in some who have a high MS liability. (2) A clinical MS threshold exists at a point on this liability distribution, where the burden and distribution of MS pathology permits a diagnosis of clinical MS. (3) Additional thresholds exist that correspond to a lower MS liability and a lesser burden of MS pathology than occur at the clinical MS threshold. This modified threshold model leads to the postulate that causes act at various time points to increase MS liability and induce MS pathology. The accumulation of MS pathology sometimes leads to a diagnosis of clinical MS. One implication of this model is that the MS pathology in clinical MS and in some with possible MS differs only in the extent but not in the type of CNS injury. Thus, it may be possible to obtain insight into the causative environmental factors that increase MS liability and induce MS pathology by focusing on patients who have clinical MS; some environmental factors that induce new lesions in patients with clinical MS may be identical to those that induce MS pathology in genetically susceptible individuals who do not have clinical MS. Identification of these causative factors has importance, as specific treatment may prevent the accumulation of MS pathology that leads to the significant CNS damage associated with clinical MS.

  3. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    NASA Astrophysics Data System (ADS)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    The normal mixture distribution model has been applied successfully in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return on the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 to July 2010 using a two-component univariate normal mixture model. First, we present the application of the normal mixture model in empirical finance, fitting it to the real data. Second, we present its application in risk analysis, using the fitted mixture to evaluate VaR and CVaR, with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture model fits the data well and performs better in estimating VaR and CVaR, as it captures the stylized facts of non-normality and leptokurtosis in the return distribution.
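
    The two-component normal mixture approach described above lends itself to a compact computational sketch. The following snippet is purely illustrative and is not the authors' code: it fits a two-component univariate normal mixture to simulated returns with scikit-learn and reads VaR and CVaR off the fitted mixture, using a closed-form normal tail expectation for CVaR. The data, seeds and parameter choices are all assumptions made for the example.

    ```python
    # Hypothetical sketch (not the authors' code): fit a two-component normal mixture
    # to simulated returns and read VaR / CVaR off the fitted mixture distribution.
    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import brentq
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Simulated returns: a calm regime mixed with a volatile regime (illustrative only).
    returns = np.concatenate([rng.normal(0.01, 0.03, 800), rng.normal(-0.02, 0.10, 200)])

    gm = GaussianMixture(n_components=2, random_state=0).fit(returns.reshape(-1, 1))
    w = gm.weights_
    mu = gm.means_.ravel()
    sd = np.sqrt(gm.covariances_).ravel()

    def mixture_cdf(x):
        """CDF of the fitted two-component normal mixture."""
        return float(np.sum(w * norm.cdf(x, mu, sd)))

    alpha = 0.05
    # VaR: the alpha-quantile of the fitted return distribution (reported as a positive loss).
    q = brentq(lambda x: mixture_cdf(x) - alpha, returns.min() - 1.0, returns.max() + 1.0)
    var_alpha = -q

    # CVaR: expected loss beyond VaR; for a normal mixture the tail expectation is closed form,
    # E[R * 1{R <= q}] = sum_i w_i * (mu_i * Phi(z_i) - sd_i * phi(z_i)),  z_i = (q - mu_i) / sd_i.
    z = (q - mu) / sd
    tail_expectation = np.sum(w * (mu * norm.cdf(z) - sd * norm.pdf(z)))
    cvar_alpha = -tail_expectation / alpha

    print(f"VaR(5%)  = {var_alpha:.4f}")
    print(f"CVaR(5%) = {cvar_alpha:.4f}")
    ```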

  4. 21 CFR 1310.12 - Exempt chemical mixtures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Exempt chemical mixtures. 1310.12 Section 1310.12... CHEMICALS AND CERTAIN MACHINES § 1310.12 Exempt chemical mixtures. (a) The chemical mixtures meeting the... importation of listed chemicals contained in the exempt chemical mixture or the civil liability for unlawful...

  5. 21 CFR 1310.12 - Exempt chemical mixtures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 9 2011-04-01 2011-04-01 false Exempt chemical mixtures. 1310.12 Section 1310.12... CHEMICALS AND CERTAIN MACHINES § 1310.12 Exempt chemical mixtures. (a) The chemical mixtures meeting the... importation of listed chemicals contained in the exempt chemical mixture or the civil liability for unlawful...

  6. 21 CFR 1310.12 - Exempt chemical mixtures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 9 2014-04-01 2014-04-01 false Exempt chemical mixtures. 1310.12 Section 1310.12... CHEMICALS AND CERTAIN MACHINES § 1310.12 Exempt chemical mixtures. (a) The chemical mixtures meeting the..., or importation of listed chemicals contained in the exempt chemical mixture or the civil liability...

  7. 21 CFR 1310.12 - Exempt chemical mixtures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 9 2013-04-01 2013-04-01 false Exempt chemical mixtures. 1310.12 Section 1310.12... CHEMICALS AND CERTAIN MACHINES § 1310.12 Exempt chemical mixtures. (a) The chemical mixtures meeting the..., or importation of listed chemicals contained in the exempt chemical mixture or the civil liability...

  8. 21 CFR 1310.12 - Exempt chemical mixtures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 9 2012-04-01 2012-04-01 false Exempt chemical mixtures. 1310.12 Section 1310.12... CHEMICALS AND CERTAIN MACHINES § 1310.12 Exempt chemical mixtures. (a) The chemical mixtures meeting the..., or importation of listed chemicals contained in the exempt chemical mixture or the civil liability...

  9. A quantitative trait locus mixture model that avoids spurious LOD score peaks.

    PubMed Central

    Feenstra, Bjarke; Skovgaard, Ib M

    2004-01-01

    In standard interval mapping of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. At any given location in the genome, the evidence of a putative QTL is measured by the likelihood ratio of the mixture model compared to a single normal distribution (the LOD score). This approach can occasionally produce spurious LOD score peaks in regions of low genotype information (e.g., widely spaced markers), especially if the phenotype distribution deviates markedly from a normal distribution. Such peaks are not indicative of a QTL effect; rather, they are caused by the fact that a mixture of normals always produces a better fit than a single normal distribution. In this study, a mixture model for QTL mapping that avoids the problems of such spurious LOD score peaks is presented. PMID:15238544
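
    To make the LOD computation described above concrete, the toy sketch below (an illustration under simplified assumptions, not the authors' method) compares the log-likelihood of a single normal fit against a two-component normal mixture whose per-individual mixing weights play the role of genotype probabilities, and converts the likelihood ratio to a LOD score. The simulated phenotypes and the fixed 0.5/0.5 genotype probabilities are assumptions for illustration only.

    ```python
    # Toy sketch (not the paper's implementation): the LOD score in interval mapping is
    # log10 of the likelihood ratio between a normal mixture (QTL model) and a single
    # normal distribution (no-QTL model). Data and genotype probabilities are simulated.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    n = 200
    # Simulated backcross phenotypes: two QTL genotype classes with different means.
    geno = rng.integers(0, 2, n)
    y = rng.normal(10.0 + 1.0 * geno, 1.0)

    # Prior genotype probabilities at the tested position (here simply 0.5/0.5,
    # as at a location with no nearby marker information).
    p = np.full((n, 2), 0.5)

    # Null model: single normal distribution fitted by maximum likelihood.
    mu0, sd0 = y.mean(), y.std()
    loglik_null = norm.logpdf(y, mu0, sd0).sum()

    # QTL model: two-component normal mixture with genotype-specific means and a
    # common variance, fitted by a few EM iterations.
    mu = np.array([y.mean() - 0.5, y.mean() + 0.5])
    sd = y.std()
    for _ in range(100):
        # E-step: posterior probability of each genotype class given the phenotype.
        dens = p * norm.pdf(y[:, None], mu, sd)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update genotype means and the common standard deviation.
        mu = (resp * y[:, None]).sum(axis=0) / resp.sum(axis=0)
        sd = np.sqrt((resp * (y[:, None] - mu) ** 2).sum() / n)

    loglik_qtl = np.log((p * norm.pdf(y[:, None], mu, sd)).sum(axis=1)).sum()
    lod = (loglik_qtl - loglik_null) / np.log(10)
    print(f"LOD = {lod:.2f}")
    ```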

  10. A quantitative trait locus mixture model that avoids spurious LOD score peaks.

    PubMed

    Feenstra, Bjarke; Skovgaard, Ib M

    2004-06-01

    In standard interval mapping of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. At any given location in the genome, the evidence of a putative QTL is measured by the likelihood ratio of the mixture model compared to a single normal distribution (the LOD score). This approach can occasionally produce spurious LOD score peaks in regions of low genotype information (e.g., widely spaced markers), especially if the phenotype distribution deviates markedly from a normal distribution. Such peaks are not indicative of a QTL effect; rather, they are caused by the fact that a mixture of normals always produces a better fit than a single normal distribution. In this study, a mixture model for QTL mapping that avoids the problems of such spurious LOD score peaks is presented.

  11. Extracting Spurious Latent Classes in Growth Mixture Modeling with Nonnormal Errors

    ERIC Educational Resources Information Center

    Guerra-Peña, Kiero; Steinley, Douglas

    2016-01-01

    Growth mixture modeling is generally used for two purposes: (1) to identify mixtures of normal subgroups and (2) to approximate oddly shaped distributions by a mixture of normal components. Often in applied research this methodology is applied to both of these situations indistinctly: using the same fit statistics and likelihood ratio tests. This…

  12. Flexible mixture modeling via the multivariate t distribution with the Box-Cox transformation: an alternative to the skew-t distribution

    PubMed Central

    Lo, Kenneth

    2011-01-01

    Cluster analysis is the automated search for groups of homogeneous observations in a data set. A popular modeling approach for clustering is based on finite normal mixture models, which assume that each cluster is modeled as a multivariate normal distribution. However, the normality assumption that each component is symmetric is often unrealistic. Furthermore, normal mixture models are not robust against outliers; they often require extra components for modeling outliers and/or give a poor representation of the data. To address these issues, we propose a new class of distributions, multivariate t distributions with the Box-Cox transformation, for mixture modeling. This class of distributions generalizes the normal distribution with the more heavy-tailed t distribution, and introduces skewness via the Box-Cox transformation. As a result, this provides a unified framework to simultaneously handle outlier identification and data transformation, two interrelated issues. We describe an Expectation-Maximization algorithm for parameter estimation along with transformation selection. We demonstrate the proposed methodology with three real data sets and simulation studies. Compared with a wealth of approaches including the skew-t mixture model, the proposed t mixture model with the Box-Cox transformation performs favorably in terms of accuracy in the assignment of observations, robustness against model misspecification, and selection of the number of components. PMID:22125375

  13. Flexible mixture modeling via the multivariate t distribution with the Box-Cox transformation: an alternative to the skew-t distribution.

    PubMed

    Lo, Kenneth; Gottardo, Raphael

    2012-01-01

    Cluster analysis is the automated search for groups of homogeneous observations in a data set. A popular modeling approach for clustering is based on finite normal mixture models, which assume that each cluster is modeled as a multivariate normal distribution. However, the normality assumption that each component is symmetric is often unrealistic. Furthermore, normal mixture models are not robust against outliers; they often require extra components for modeling outliers and/or give a poor representation of the data. To address these issues, we propose a new class of distributions, multivariate t distributions with the Box-Cox transformation, for mixture modeling. This class of distributions generalizes the normal distribution with the more heavy-tailed t distribution, and introduces skewness via the Box-Cox transformation. As a result, this provides a unified framework to simultaneously handle outlier identification and data transformation, two interrelated issues. We describe an Expectation-Maximization algorithm for parameter estimation along with transformation selection. We demonstrate the proposed methodology with three real data sets and simulation studies. Compared with a wealth of approaches including the skew-t mixture model, the proposed t mixture model with the Box-Cox transformation performs favorably in terms of accuracy in the assignment of observations, robustness against model misspecification, and selection of the number of components.

  14. 49 CFR 375.201 - What is my normal liability for loss and damage when I accept goods from an individual shipper?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...; CONSUMER PROTECTION REGULATIONS Before Offering Services to My Customers Liability Considerations § 375.201... Protection Obligation—In general, your liability is for the household goods that are lost, damaged, destroyed...

  15. A Skew-Normal Mixture Regression Model

    ERIC Educational Resources Information Center

    Liu, Min; Lin, Tsung-I

    2014-01-01

    A challenge associated with traditional mixture regression models (MRMs), which rest on the assumption of normally distributed errors, is determining the number of unobserved groups. Specifically, even slight deviations from normality can lead to the detection of spurious classes. The current work aims to (a) examine how sensitive the commonly…

  16. Robust Bayesian Analysis of Heavy-tailed Stochastic Volatility Models using Scale Mixtures of Normal Distributions

    PubMed Central

    Abanto-Valle, C. A.; Bandyopadhyay, D.; Lachos, V. H.; Enriquez, I.

    2009-01-01

    A Bayesian analysis of stochastic volatility (SV) models using the class of symmetric scale mixtures of normal (SMN) distributions is considered. In the face of non-normality, this provides an appealing robust alternative to the routine use of the normal distribution. Specific distributions examined include the normal, Student-t, slash and variance gamma distributions. Using a Bayesian paradigm, an efficient Markov chain Monte Carlo (MCMC) algorithm is introduced for parameter estimation. Moreover, the mixing parameters obtained as a by-product of the scale mixture representation can be used to identify outliers. The methods developed are applied to analyze daily stock return data on the S&P500 index. Bayesian model selection criteria as well as out-of-sample forecasting results reveal that the SV models based on heavy-tailed SMN distributions provide significant improvement in model fit as well as prediction to the S&P500 index data over the usual normal model. PMID:20730043
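
    The scale-mixture-of-normals (SMN) construction underlying this record can be illustrated with a few lines of code. The sketch below is not the paper's MCMC sampler; it only demonstrates, under assumed parameter values, how Student-t draws arise as normals whose variances are inflated by latent Gamma mixing parameters, and how small mixing values flag candidate outliers.

    ```python
    # Illustrative sketch of the scale-mixture-of-normals representation of the Student-t:
    # if lambda_i ~ Gamma(nu/2, rate nu/2) and e_i | lambda_i ~ N(0, sigma^2 / lambda_i),
    # then marginally e_i ~ t_nu(0, sigma^2). Small lambda_i indicate inflated variance,
    # i.e. candidate outliers. This is not the paper's SV sampler, just the SMN idea.
    import numpy as np

    rng = np.random.default_rng(42)
    nu, sigma, n = 5.0, 1.0, 10_000

    lam = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)   # latent mixing parameters
    e = rng.normal(0.0, sigma / np.sqrt(lam))                 # conditionally normal draws

    # Compare tail behaviour with a plain normal of the same scale.
    gauss = rng.normal(0.0, sigma, n)
    print("P(|e| > 3):     ", np.mean(np.abs(e) > 3))         # heavier tail
    print("P(|gauss| > 3): ", np.mean(np.abs(gauss) > 3))

    # Observations generated with the smallest mixing parameters are the ones the
    # model would treat as outliers.
    flagged = np.argsort(lam)[:5]
    print("indices with smallest lambda (outlier candidates):", flagged)
    print("corresponding draws:", np.round(e[flagged], 2))
    ```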

  17. Scale Mixture Models with Applications to Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Qin, Zhaohui S.; Damien, Paul; Walker, Stephen

    2003-11-01

    Scale mixtures of uniform distributions are used to model non-normal data in time series and econometrics in a Bayesian framework. Heteroscedastic and skewed data models are also tackled using scale mixtures of uniform distributions.

  18. 9 CFR 203.10 - Statement with respect to insolvency; definition of current assets and current liabilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... cycle of the business, which is considered to be one year. (2) Current liabilities means obligations... consumed in the normal operating cycle of the business; (7) accounts due from employees, if collectable; (8... classifiable as current assets or the creation of other current liabilities during the one year operating cycle...

  19. Not Quite Normal: Consequences of Violating the Assumption of Normality in Regression Mixture Models

    ERIC Educational Resources Information Center

    Van Horn, M. Lee; Smith, Jessalyn; Fagan, Abigail A.; Jaki, Thomas; Feaster, Daniel J.; Masyn, Katherine; Hawkins, J. David; Howe, George

    2012-01-01

    Regression mixture models, which have only recently begun to be used in applied research, are a new approach for finding differential effects. This approach comes at the cost of the assumption that error terms are normally distributed within classes. This study uses Monte Carlo simulations to explore the effects of relatively minor violations of…

  20. 49 CFR 375.203 - What actions of an individual shipper may limit or reduce my normal liability?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... PROTECTION REGULATIONS Before Offering Services to My Customers Liability Considerations § 375.203 What... includes perishable, dangerous, or hazardous articles in the shipment without your knowledge, you need not...

  1. Using partially labeled data for normal mixture identification with application to class definition

    NASA Technical Reports Server (NTRS)

    Shahshahani, Behzad M.; Landgrebe, David A.

    1992-01-01

    The problem of estimating the parameters of a normal mixture density when, in addition to the unlabeled samples, sets of partially labeled samples are available is addressed. The density of the multidimensional feature space is modeled with a normal mixture. It is assumed that the set of components of the mixture can be partitioned into several classes and that training samples are available from each class. Since for any training sample the class of origin is known but the exact component of origin within the corresponding class is unknown, the training samples are considered to be partially labeled. The EM iterative equations are derived for estimating the parameters of the normal mixture in the presence of partially labeled samples. These equations can be used to combine the supervised and unsupervised learning processes.
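
    A minimal sketch of EM with partially labeled samples is given below. It simplifies the setting in the record by assuming one mixture component per class (the paper allows several components per class) and uses simulated one-dimensional data; everything in the snippet is an illustrative assumption rather than the authors' implementation.

    ```python
    # Minimal sketch of EM for a normal mixture with partially labeled samples
    # (one component per class for simplicity; the paper's setting allows several
    # components per class). Not the authors' code; data are simulated.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(7)

    # Two classes; a few labeled samples per class plus many unlabeled samples.
    x_lab = np.concatenate([rng.normal(0.0, 1.0, 10), rng.normal(4.0, 1.5, 10)])
    y_lab = np.concatenate([np.zeros(10, dtype=int), np.ones(10, dtype=int)])
    x_unl = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(4.0, 1.5, 300)])

    x = np.concatenate([x_lab, x_unl])
    K = 2
    pi = np.full(K, 1.0 / K)
    mu = np.array([x.min(), x.max()])
    sd = np.full(K, x.std())

    for _ in range(200):
        # E-step: responsibilities. Labeled samples have their responsibility fixed
        # to their known class; unlabeled samples get the usual posterior weights.
        resp = pi * norm.pdf(x[:, None], mu, sd)
        resp /= resp.sum(axis=1, keepdims=True)
        resp[:len(x_lab)] = np.eye(K)[y_lab]
        # M-step: weighted updates of mixing proportions, means and variances.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

    print("mixing proportions:", np.round(pi, 3))
    print("means:", np.round(mu, 3))
    print("std devs:", np.round(sd, 3))
    ```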

  2. Up-cycling waste glass to minimal water adsorption/absorption lightweight aggregate by rapid low temperature sintering: optimization by dual process-mixture response surface methodology.

    PubMed

    Velis, Costas A; Franco-Salinas, Claudia; O'Sullivan, Catherine; Najorka, Jens; Boccaccini, Aldo R; Cheeseman, Christopher R

    2014-07-01

    Mixed color waste glass extracted from municipal solid waste is either not recycled, in which case it is an environmental and financial liability, or it is used in relatively low value applications such as normal weight aggregate. Here, we report on converting it into a novel glass-ceramic lightweight aggregate (LWA), potentially suitable for high added value applications in structural concrete (upcycling). The artificial LWA particles were formed by rapidly sintering (<10 min) waste glass powder with clay mixes using sodium silicate as binder and borate salt as flux. Composition and processing were optimized using response surface methodology (RSM) modeling, and specifically (i) a combined process-mixture dual RSM, and (ii) multiobjective optimization functions. The optimization considered raw materials and energy costs. Mineralogical and physical transformations occur during sintering and a cellular vesicular glass-ceramic composite microstructure is formed, with strong correlations existing between bloating/shrinkage during sintering, density and water adsorption/absorption. The diametrical expansion could be effectively modeled via the RSM and controlled to meet a wide range of specifications; here we optimized for LWA structural concrete. The optimally designed LWA is sintered in comparatively low temperatures (825-835 °C), thus potentially saving costs and lowering emissions; it had exceptionally low water adsorption/absorption (6.1-7.2% w/wd; optimization target: 1.5-7.5% w/wd); while remaining substantially lightweight (density: 1.24-1.28 g.cm(-3); target: 0.9-1.3 g.cm(-3)). This is a considerable advancement for designing effective environmentally friendly lightweight concrete constructions, and boosting resource efficiency of waste glass flows.

  3. Multivariate Bayesian analysis of Gaussian, right censored Gaussian, ordered categorical and binary traits using Gibbs sampling

    PubMed Central

    Korsgaard, Inge Riis; Lund, Mogens Sandø; Sorensen, Daniel; Gianola, Daniel; Madsen, Per; Jensen, Just

    2003-01-01

    A fully Bayesian analysis using Gibbs sampling and data augmentation in a multivariate model of Gaussian, right censored, and grouped Gaussian traits is described. The grouped Gaussian traits are either ordered categorical traits (with more than two categories) or binary traits, where the grouping is determined via thresholds on the underlying Gaussian scale, the liability scale. Allowances are made for unequal models, unknown covariance matrices and missing data. Having outlined the theory, strategies for implementation are reviewed. These include joint sampling of location parameters; efficient sampling from the fully conditional posterior distribution of augmented data, a multivariate truncated normal distribution; and sampling from the conditional inverse Wishart distribution, the fully conditional posterior distribution of the residual covariance matrix. Finally, a simulated dataset was analysed to illustrate the methodology. This paper concentrates on a model where residuals associated with liabilities of the binary traits are assumed to be independent. A Bayesian analysis using Gibbs sampling is outlined for the model where this assumption is relaxed. PMID:12633531

  4. Enhanced neurocognitive functioning and positive temperament in twins discordant for bipolar disorder.

    PubMed

    Higier, Rachel G; Jimenez, Amy M; Hultman, Christina M; Borg, Jacqueline; Roman, Cristina; Kizling, Isabelle; Larsson, Henrik; Cannon, Tyrone D

    2014-11-01

    Based on evidence linking creativity and bipolar disorder, a model has been proposed whereby factors influencing liability to bipolar disorder confer certain traits with positive effects on reproductive fitness. The authors tested this model by examining key traits known to be associated with evolutionary fitness, namely, temperament and neurocognition, in individuals carrying liability for bipolar disorder. Schizophrenia probands and their co-twins were included as psychiatric controls. Twin pairs discordant for bipolar disorder and schizophrenia and control pairs were identified through the Swedish Twin Registry. The authors administered a neuropsychological test battery and temperament questionnaires to samples of bipolar probands, bipolar co-twins, schizophrenia probands, schizophrenia co-twins, and controls. Multivariate mixed-model analyses of variance were conducted to compare groups on temperament and neurocognitive scores. Bipolar co-twins showed elevated scores on a "positivity" temperament scale compared with controls and bipolar probands, while bipolar probands scored higher on a "negativity" scale compared with their co-twins and controls, who did not differ. Additionally, bipolar co-twins showed superior performance compared with controls on tests of verbal learning and fluency, while bipolar probands showed performance decrements across all neurocognitive domains. In contrast, schizophrenia co-twins showed attenuated impairments in positivity and overall neurocognitive functioning relative to their ill proband counterparts. These findings suggest that supra-normal levels of sociability and verbal functioning may be associated with liability for bipolar disorder. These effects were specific to liability for bipolar disorder and did not apply to schizophrenia. Such benefits may provide a partial explanation for the persistence of bipolar illness in the population.

  5. Mapping of quantitative trait loci using the skew-normal distribution.

    PubMed

    Fernandes, Elisabete; Pacheco, António; Penha-Gonçalves, Carlos

    2007-11-01

    In standard interval mapping (IM) of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. When this assumption of normality is violated, the most commonly adopted strategy is to use the previous model after data transformation. However, an appropriate transformation may not exist or may be difficult to find. Also this approach can raise interpretation issues. An interesting alternative is to consider a skew-normal mixture model in standard IM, and the resulting method is here denoted as skew-normal IM. This flexible model that includes the usual symmetric normal distribution as a special case is important, allowing continuous variation from normality to non-normality. In this paper we briefly introduce the main peculiarities of the skew-normal distribution. The maximum likelihood estimates of parameters of the skew-normal distribution are obtained by the expectation-maximization (EM) algorithm. The proposed model is illustrated with real data from an intercross experiment that shows a significant departure from the normality assumption. The performance of the skew-normal IM is assessed via stochastic simulation. The results indicate that the skew-normal IM has higher power for QTL detection and better precision of QTL location as compared to standard IM and nonparametric IM.

  6. Investigation into the performance of different models for predicting stutter.

    PubMed

    Bright, Jo-Anne; Curran, James M; Buckleton, John S

    2013-07-01

    In this paper we have examined five possible models for the behaviour of the stutter ratio, SR. These were two log-normal models, two gamma models, and a two-component normal mixture model. A two-component normal mixture model was chosen with different behaviours of variance; at each locus SR was described with two distributions, both with the same mean. The distributions have different variances: one for the majority of the observations and a second for the less well-behaved ones. We apply each model to a set of known single-source Identifiler™, NGM SElect™ and PowerPlex® 21 DNA profiles to show the applicability of our findings to different data sets. SR values determined from the single-source profiles were compared with the SR values calculated after application of the models. The model performance was tested by calculating the log-likelihoods and comparing the difference in Akaike information criterion (AIC). The two-component normal mixture model systematically outperformed all others, despite the increase in the number of parameters. This model, as well as performing well statistically, has intuitive appeal for forensic biologists and could be implemented in an expert system with a continuous method for DNA interpretation. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
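
    The AIC comparison described above can be mimicked on toy data. The sketch below is not the authors' code and uses simulated stutter ratios rather than real profiles: it fits a log-normal model and a two-component normal mixture with a common mean and two variances by maximum likelihood, then compares the models via AIC = 2k − 2·logL.

    ```python
    # Toy sketch (simulated data, not the paper's profiles): compare a log-normal model
    # with a two-component normal mixture having a common mean but two variances,
    # using AIC = 2k - 2*logL.
    import numpy as np
    from scipy.stats import norm, lognorm
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)
    # Simulated stutter ratios: mostly well-behaved, a minority with larger spread.
    sr = np.concatenate([rng.normal(0.08, 0.01, 450), rng.normal(0.08, 0.02, 50)])
    sr = np.clip(sr, 1e-4, None)                  # guard: stutter ratios are positive

    # Model 1: log-normal, fitted by scipy's built-in MLE (location fixed at 0).
    shape, _, scale = lognorm.fit(sr, floc=0)
    ll_lognorm = lognorm.logpdf(sr, shape, 0, scale).sum()
    aic_lognorm = 2 * 2 - 2 * ll_lognorm          # k = 2 parameters

    # Model 2: two-component normal mixture, common mean, different variances.
    def neg_loglik(params):
        mu, log_s1, log_s2, logit_w = params
        w = 1.0 / (1.0 + np.exp(-logit_w))
        dens = w * norm.pdf(sr, mu, np.exp(log_s1)) + (1 - w) * norm.pdf(sr, mu, np.exp(log_s2))
        return -np.log(dens).sum()

    res = minimize(neg_loglik, x0=[sr.mean(), np.log(sr.std()), np.log(2 * sr.std()), 0.0],
                   method="Nelder-Mead")
    aic_mixture = 2 * 4 - 2 * (-res.fun)          # k = 4 parameters

    print(f"AIC log-normal: {aic_lognorm:.1f}")
    print(f"AIC 2-component mixture: {aic_mixture:.1f}")
    ```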

  7. Mu/Kappa Opioid Interactions in Rhesus Monkeys: Implications for Analgesia and Abuse Liability

    PubMed Central

    Negus, S. Stevens; Katrina Schrode, KA; Stevenson, Glenn W.

    2008-01-01

    Mu opioid receptor agonists are clinically valuable as analgesics; however, their use is limited by high abuse liability. Kappa opioid agonists also produce antinociception, but they do not produce mu agonist-like abuse-related effects, suggesting that they may enhance the antinociceptive effects and/or attenuate the abuse-related effects of mu agonists. To evaluate this hypothesis, the present study examined interactions between the mu agonist fentanyl and the kappa agonist U69,593 in three behavioral assays in rhesus monkeys. In an assay of schedule-controlled responding, monkeys responded under a fixed-ratio 30 (FR 30) schedule of food presentation. Fentanyl and U69,593 each produced rate-decreasing effects when administered alone, and mixtures of 0.22:1, 0.65:1 and 1.96:1 U69,593/fentanyl usually produced subadditive effects. In an assay of thermal nociception, tail withdrawal latencies were measured from water heated to 50°C. Fentanyl and U69,593 each produced dose-dependent antinociception, and effects were additive for all mixtures. In an assay of drug self-administration, rhesus monkeys responded for i.v. drug injection, and both dose and FR values were manipulated. Fentanyl maintained self-administration, whereas U69,593 did not. Addition of U69,593 to fentanyl produced a proportion-dependent decrease in both rates of fentanyl self-administration and behavioral economic measures of the reinforcing efficacy of fentanyl. Taken together, these results suggest that simultaneous activation of mu and kappa receptors, either with a mixture of selective drugs or with a single drug that targets both receptors, may reduce abuse liability without reducing analgesic effects relative to selective mu agonists administered alone. PMID:18837635

  8. Discrete Velocity Models for Polyatomic Molecules Without Nonphysical Collision Invariants

    NASA Astrophysics Data System (ADS)

    Bernhoff, Niclas

    2018-05-01

    An important aspect of constructing discrete velocity models (DVMs) for the Boltzmann equation is to obtain the right number of collision invariants. Unlike for the Boltzmann equation, DVMs can have extra, so-called spurious, collision invariants in addition to the physical ones. A DVM with only physical collision invariants, and hence without spurious ones, is called normal. The construction of such normal DVMs has been studied extensively in the literature for single species, but also for binary mixtures and, recently, for multicomponent mixtures. In this paper, we address ways of constructing normal DVMs for polyatomic molecules (here represented by assigning each molecule an internal energy, to account for non-translational energies, which can change during collisions), under the assumption that the set of allowed internal energies is finite. We present general algorithms for constructing such models, and we also give concrete examples of such constructions. This approach can also be combined with similar constructions for multicomponent mixtures to obtain multicomponent mixtures with polyatomic molecules, which is also briefly outlined. Chemical reactions can then be added as well.

  9. Maximum likelihood estimation of finite mixture model for economic data

    NASA Astrophysics Data System (ADS)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

    A finite mixture model is a mixture model with a finite number of components. These models provide a natural representation of heterogeneity across a finite number of latent classes; they are also known as latent class models or unsupervised learning models. Recently, fitting finite mixture models by maximum likelihood estimation has drawn considerable attention from statisticians. The main reason is that maximum likelihood estimation is a powerful statistical method that provides consistent estimates as the sample size increases to infinity. Thus, maximum likelihood estimation is used in the present paper to fit a finite mixture model in order to explore relationships in nonlinear economic data. A two-component normal mixture model is fitted by maximum likelihood estimation to investigate the relationship between stock market prices and rubber prices for the sampled countries. The results show a negative relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia.

  10. Pathophysiology of Temperature Regulation.

    ERIC Educational Resources Information Center

    Mitchell, D.; Laburn, Helen P.

    1985-01-01

    Discusses: (1) measurement of body temperature; (2) normal deviations from normal body temperature; (3) temperature in the very young and the very old; (4) abnormal liability of thermoregulation; (5) hyperthermia; (6) fever; and (7) hypothermia. (JN)

  11. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    PubMed

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected functionals and values of covariates. The software is illustrated through the BNP regression analysis of real data.

  12. Robust nonlinear system identification: Bayesian mixture of experts using the t-distribution

    NASA Astrophysics Data System (ADS)

    Baldacchino, Tara; Worden, Keith; Rowson, Jennifer

    2017-02-01

    A novel variational Bayesian mixture of experts model for robust regression of bifurcating and piece-wise continuous processes is introduced. The mixture of experts model is a powerful model which probabilistically splits the input space allowing different models to operate in the separate regions. However, current methods have no fail-safe against outliers. In this paper, a robust mixture of experts model is proposed which consists of Student-t mixture models at the gates and Student-t distributed experts, trained via Bayesian inference. The Student-t distribution has heavier tails than the Gaussian distribution, and so it is more robust to outliers, noise and non-normality in the data. Using both simulated data and real data obtained from the Z24 bridge this robust mixture of experts performs better than its Gaussian counterpart when outliers are present. In particular, it provides robustness to outliers in two forms: unbiased parameter regression models, and robustness to overfitting/complex models.

  13. ADHD and the externalizing spectrum: direct comparison of categorical, continuous, and hybrid models of liability in a nationally representative sample.

    PubMed

    Carragher, Natacha; Krueger, Robert F; Eaton, Nicholas R; Markon, Kristian E; Keyes, Katherine M; Blanco, Carlos; Saha, Tulshi D; Hasin, Deborah S

    2014-08-01

    Alcohol use disorders, substance use disorders, and antisocial personality disorder share a common externalizing liability, which may also include attention-deficit hyperactivity disorder (ADHD). However, few studies have compared formal quantitative models of externalizing liability, with the aim of delineating the categorical and/or continuous nature of this liability in the community. This study compares categorical, continuous, and hybrid models of externalizing liability. Data were derived from the 2004-2005 National Epidemiologic Survey on Alcohol and Related Conditions (N = 34,653). Seven disorders were modeled: childhood ADHD and lifetime diagnoses of antisocial personality disorder (ASPD), nicotine dependence, alcohol dependence, marijuana dependence, cocaine dependence, and other substance dependence. The continuous latent trait model provided the best fit to the data. Measurement invariance analyses supported the fit of the model across genders, with females displaying a significantly lower probability of experiencing externalizing disorders. Cocaine dependence, marijuana dependence, other substance dependence, alcohol dependence, ASPD, nicotine dependence, and ADHD provided the greatest information, respectively, about the underlying externalizing continuum. Liability to externalizing disorders is continuous and dimensional in severity. The findings have important implications for the organizational structure of externalizing psychopathology in psychiatric nomenclatures.

  14. Apparent Transition in the Human Height Distribution Caused by Age-Dependent Variation during Puberty Period

    NASA Astrophysics Data System (ADS)

    Iwata, Takaki; Yamazaki, Yoshihiro; Kuninaka, Hiroto

    2013-08-01

    In this study, we examine the validity of the transition of the human height distribution from the log-normal distribution to the normal distribution during puberty, as suggested in an earlier study [Kuninaka et al.: J. Phys. Soc. Jpn. 78 (2009) 125001]. Our data analysis reveals that, in late puberty, the variation in height decreases as children grow. Thus, the classification of a height dataset by age at this stage leads us to analyze a mixture of distributions with larger means and smaller variations. This mixture distribution has a negative skewness and is consequently closer to the normal distribution than to the log-normal distribution. The opposite case occurs in early puberty and the mixture distribution is positively skewed, which resembles the log-normal distribution rather than the normal distribution. Thus, this scenario mimics the transition during puberty. Additionally, our scenario is realized through a numerical simulation based on a statistical model. The present study does not support the transition suggested by the earlier study.

  15. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters.

  16. Specific Immunotherapy of Experimental Myasthenia Gravis by A Novel Mechanism

    PubMed Central

    Luo, Jie; Kuryatov, Alexander; Lindstrom, Jon

    2009-01-01

    Objective Myasthenia gravis (MG) and its animal model, experimental autoimmune myasthenia gravis (EAMG), are antibody-mediated autoimmune diseases, in which autoantibodies bind to and cause loss of muscle nicotinic acetylcholine receptors (AChRs) at the neuromuscular junction. To develop a specific immunotherapy of MG, we treated rats with ongoing EAMG by intraperitoneal injection of bacterially-expressed human muscle AChR constructs. Methods Rats with ongoing EAMG received intraperitoneal treatment with the constructs weekly for 5 weeks beginning after the acute phase. Autoantibody concentration, subclassification, and specificity were analyzed to address underlying therapeutic mechanism. Results EAMG was specifically suppressed by diverting autoantibody production away from pathologically relevant specificities directed at epitopes on the extracellular surface of muscle AChRs toward pathologically irrelevant epitopes on the cytoplasmic domain. A mixture of subunit cytoplasmic domains was more effective than a mixture containing both extracellular and cytoplasmic domains or than only the extracellular domain of α1 subunits. Interpretation Therapy using only cytoplasmic domains, which lack pathologically relevant epitopes, avoids the potential liability of boosting the pathological response. Use of a mixture of bacterially-expressed human muscle AChR cytoplasmic domains for antigen-specific immunosuppression of myasthenia gravis has the potential to be specific, robust, and safe. PMID:20437579

  17. Mixture Factor Analysis for Approximating a Nonnormally Distributed Continuous Latent Factor with Continuous and Dichotomous Observed Variables

    ERIC Educational Resources Information Center

    Wall, Melanie M.; Guo, Jia; Amemiya, Yasuo

    2012-01-01

    Mixture factor analysis is examined as a means of flexibly estimating nonnormally distributed continuous latent factors in the presence of both continuous and dichotomous observed variables. A simulation study compares mixture factor analysis with normal maximum likelihood (ML) latent factor modeling. Different results emerge for continuous versus…

  18. Mixture modeling methods for the assessment of normal and abnormal personality, part II: longitudinal models.

    PubMed

    Wright, Aidan G C; Hallquist, Michael N

    2014-01-01

    Studying personality and its pathology as it changes, develops, or remains stable over time offers exciting insight into the nature of individual differences. Researchers interested in examining personal characteristics over time have a number of time-honored analytic approaches at their disposal. In recent years there have also been considerable advances in person-oriented analytic approaches, particularly longitudinal mixture models. In this methodological primer we focus on mixture modeling approaches to the study of normative and individual change in the form of growth mixture models and ipsative change in the form of latent transition analysis. We describe the conceptual underpinnings of each of these models, outline approaches for their implementation, and provide accessible examples for researchers studying personality and its assessment.

  19. Cost vs. Risk: Determining the Correct Liability Insurance Limit.

    ERIC Educational Resources Information Center

    Klinksiek, Glenn

    1996-01-01

    Presents a model for evaluating liability insurance limits and selecting the correct limit for an individual institution. Argues that many colleges and universities may be making overly conservative decisions that lead to the purchase of too much liability insurance. Also discusses the financial consequences of an uninsured large liability loss.…

  20. Weibull mixture regression for marginal inference in zero-heavy continuous outcomes.

    PubMed

    Gebregziabher, Mulugeta; Voronca, Delia; Teklehaimanot, Abeba; Santa Ana, Elizabeth J

    2017-06-01

    Continuous outcomes with a preponderance of zero values are ubiquitous in data that arise from biomedical studies, for example studies of addictive disorders. This is known to lead to violation of standard assumptions in parametric inference and enhances the risk of misleading conclusions unless managed properly. Two-part models are commonly used to deal with this problem. However, standard two-part models have limitations with respect to obtaining parameter estimates that have a marginal interpretation of covariate effects, which is important in many biomedical applications. Recently, marginalized two-part models have been proposed, but their development is limited to log-normal and log-skew-normal distributions. Thus, in this paper, we propose a finite mixture approach, with Weibull mixture regression as a special case, to deal with the problem. We use an extensive simulation study to assess the performance of the proposed model in finite samples and to make comparisons with other families of models via statistical information and mean squared error criteria. We demonstrate its application on real data from a randomized controlled trial of addictive disorders. Our results show that a two-component Weibull mixture model is preferred for modeling zero-heavy continuous data when the non-zero part is simulated from a Weibull or similar distribution such as the Gamma or truncated Gaussian.

  1. [Clinical practice guidelines: juridical and medico legal issues in health care malpractice liability].

    PubMed

    Moreschi, Carlo; Broi, Ugo Da

    2014-01-01

    Clinical Practice Guidelines are clinical tools addressed to medical and other health professionals and are normally employed to improve the quality and safety of diagnostic and therapeutic procedures, but they may sometimes limit the autonomy of medical and other health care professionals. Adherence to Clinical Practice Guidelines should not be the sole criterion for evaluating liability and compliance with standards of care in medico-legal investigations, since each clinical case is highly specific. Medical liability and compliance with standards of care should be evaluated with the support of Clinical Practice Guidelines together with a thorough examination of all the specific features, professional background and experience required to treat each individual patient.

  2. Bayesian Regularization for Normal Mixture Estimation and Model-Based Clustering

    DTIC Science & Technology

    2005-08-04

    describe a four-band magnetic resonance image (MRI) consisting of 23,712 pixels of a brain with a tumor. Because of the size of the dataset, it is not...

  3. PLEMT: A NOVEL PSEUDOLIKELIHOOD BASED EM TEST FOR HOMOGENEITY IN GENERALIZED EXPONENTIAL TILT MIXTURE MODELS.

    PubMed

    Hong, Chuan; Chen, Yong; Ning, Yang; Wang, Shuang; Wu, Hao; Carroll, Raymond J

    2017-01-01

    Motivated by analyses of DNA methylation data, we propose a semiparametric mixture model, namely the generalized exponential tilt mixture model, to account for heterogeneity between differentially methylated and non-differentially methylated subjects in the cancer group, and capture the differences in higher order moments (e.g. mean and variance) between subjects in cancer and normal groups. A pairwise pseudolikelihood is constructed to eliminate the unknown nuisance function. To circumvent boundary and non-identifiability problems as in parametric mixture models, we modify the pseudolikelihood by adding a penalty function. In addition, a test with a simple asymptotic distribution has computational advantages over permutation-based tests for high-dimensional genetic or epigenetic data. We propose a pseudolikelihood-based expectation-maximization test and show that the proposed test follows a simple chi-squared limiting distribution. Simulation studies show that the proposed test controls Type I errors well and has better power compared to several current tests. In particular, the proposed test outperforms the commonly used tests under all simulation settings considered, especially when there are variance differences between the two groups. The proposed test is applied to a real data set to identify differentially methylated sites between ovarian cancer subjects and normal subjects.

  4. Development of reversible jump Markov Chain Monte Carlo algorithm in the Bayesian mixture modeling for microarray data in Indonesia

    NASA Astrophysics Data System (ADS)

    Astuti, Ani Budi; Iriawan, Nur; Irhamah, Kuswanto, Heri

    2017-12-01

    Bayesian mixture modeling requires a stage in which the most appropriate number of mixture components is identified, so that the resulting mixture model fits the data in a data-driven manner. Reversible Jump Markov Chain Monte Carlo (RJMCMC) combines the reversible jump (RJ) concept with the Markov Chain Monte Carlo (MCMC) concept and has been used by some researchers to identify the number of mixture components when that number is not known with certainty. In its application, RJMCMC uses the birth/death and split-merge concepts with six types of moves: w updating, θ updating, z updating, hyperparameter β updating, split-merge of components, and birth/death of empty components. The RJMCMC algorithm needs to be developed according to the case under study. The purpose of this study is to assess the performance of the developed RJMCMC algorithm in identifying the unknown number of mixture components in Bayesian mixture modeling for microarray data in Indonesia. The results show that the developed RJMCMC algorithm is able to properly identify the number of mixture components in a Bayesian normal mixture model for the Indonesian microarray data, where the number of components is not known in advance.

  5. Toward enhancing estimates of Kentucky's heavy truck tax liabilities.

    DOT National Transportation Integrated Search

    2002-08-01

    The focus of this report is the effectiveness and reliability of the current models employed to calculate the weight-distance tax and fuel surtax liabilities. This report examines the current methodology utilized to estimate potential tax liabilities...

  6. heterogeneous mixture distributions for multi-source extreme rainfall

    NASA Astrophysics Data System (ADS)

    Ouarda, T.; Shin, J.; Lee, T. S.

    2013-12-01

    Mixture distributions have been used to model hydro-meteorological variables that show mixture distributional characteristics, e.g. bimodality. Homogeneous mixture (HOM) distributions (e.g. Normal-Normal and Gumbel-Gumbel) have traditionally been applied to hydro-meteorological variables. However, there is no reason to restrict the mixture distribution to combinations of a single distribution type. It might be beneficial to characterize the statistical behavior of hydro-meteorological variables through heterogeneous mixture (HTM) distributions such as Normal-Gamma. In the present work, we focus on assessing the suitability of HTM distributions for the frequency analysis of hydro-meteorological variables. To estimate the parameters of the HTM distributions, a meta-heuristic algorithm (a genetic algorithm) is employed to maximize the likelihood function. A number of distributions are compared, including the Gamma-Extreme value type-one (EV1) HTM distribution, the EV1-EV1 HOM distribution, and the EV1 distribution. The proposed distribution models are applied to annual maximum precipitation data in South Korea. The Akaike Information Criterion (AIC), the root mean squared error (RMSE) and the log-likelihood are used as measures of goodness-of-fit of the tested distributions. Results indicate that the HTM distribution (Gamma-EV1) provides the best fit and shows significant improvement in the estimation of quantiles corresponding to the 20-year return period. It is shown that extreme rainfall in the coastal region of South Korea presents strong heterogeneous mixture distributional characteristics. Results indicate that HTM distributions are a good alternative for the frequency analysis of hydro-meteorological variables when disparate statistical characteristics are present.
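
    As a rough illustration of fitting a heterogeneous mixture by likelihood maximization with an evolutionary optimizer, the sketch below fits a Gamma-Gumbel (EV1) mixture to simulated annual maxima using SciPy's differential evolution, which stands in for the genetic algorithm mentioned in the abstract; the data, bounds and seeds are assumptions for the example, not the study's rainfall series.

    ```python
    # Illustrative sketch (simulated data, not the study's rainfall series): fit a
    # heterogeneous Gamma-Gumbel (EV1) mixture by maximizing the log-likelihood with
    # differential evolution, an evolutionary optimizer standing in for the genetic
    # algorithm mentioned in the abstract.
    import numpy as np
    from scipy.stats import gamma, gumbel_r
    from scipy.optimize import differential_evolution

    rng = np.random.default_rng(11)
    x = np.concatenate([gamma.rvs(a=4, scale=20, size=150, random_state=rng),
                        gumbel_r.rvs(loc=180, scale=30, size=50, random_state=rng)])

    def neg_loglik(theta):
        w, a, s_gamma, loc_ev1, s_ev1 = theta
        dens = (w * gamma.pdf(x, a, scale=s_gamma)
                + (1 - w) * gumbel_r.pdf(x, loc=loc_ev1, scale=s_ev1))
        return -np.sum(np.log(dens + 1e-300))   # guard against log(0)

    bounds = [(0.01, 0.99), (0.5, 20), (1, 100), (50, 400), (1, 100)]
    result = differential_evolution(neg_loglik, bounds, seed=0, tol=1e-7)
    w, a, s_gamma, loc_ev1, s_ev1 = result.x
    print(f"weight={w:.2f}, Gamma(shape={a:.2f}, scale={s_gamma:.1f}), "
          f"EV1(loc={loc_ev1:.1f}, scale={s_ev1:.1f})")
    ```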

  7. Mixture modelling for cluster analysis.

    PubMed

    McLachlan, G J; Chang, S U

    2004-10-01

    Cluster analysis via a finite mixture model approach is considered. With this approach to clustering, the data can be partitioned into a specified number of clusters g by first fitting a mixture model with g components. An outright clustering of the data is then obtained by assigning an observation to the component to which it has the highest estimated posterior probability of belonging; that is, the ith cluster consists of those observations assigned to the ith component (i = 1,..., g). The focus is on the use of mixtures of normal components for the cluster analysis of data that can be regarded as being continuous. But attention is also given to the case of mixed data, where the observations consist of both continuous and discrete variables.
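
    A minimal sketch of this cluster-by-highest-posterior idea is shown below, using scikit-learn's GaussianMixture on simulated two-dimensional data; the library choice and the data are assumptions for illustration, not part of the record.

    ```python
    # Minimal sketch: cluster continuous data by fitting a g-component normal mixture
    # and assigning each observation to the component with the highest estimated
    # posterior probability. scikit-learn is an illustrative choice, not the record's software.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(5)
    # Two simulated clusters in two dimensions.
    data = np.vstack([rng.normal([0, 0], 1.0, size=(100, 2)),
                      rng.normal([5, 5], 1.5, size=(100, 2))])

    g = 2
    gm = GaussianMixture(n_components=g, random_state=0).fit(data)
    posteriors = gm.predict_proba(data)      # estimated posterior membership probabilities
    clusters = posteriors.argmax(axis=1)     # outright clustering: highest posterior wins

    print("cluster sizes:", np.bincount(clusters, minlength=g))
    print("component means:\n", np.round(gm.means_, 2))
    ```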

  8. 30 CFR 800.23 - Self-bonding.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... normal operating cycle of the business. Current liabilities means obligations which are reasonably expected to be paid or liquidated within one year or within the normal operating cycle of the business... applicant has been in continuous operation as a business entity for a period of not less than 5 years...

  9. On an interface of the online system for a stochastic analysis of the varied information flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorshenin, Andrey K.; MIREA, MGUPI; Kuzmin, Victor Yu.

    The article describes a possible approach to the construction of an interface of an online asynchronous system that allows researchers to analyse varied information flows. The implemented stochastic methods are based on the mixture models and the method of moving separation of mixtures. The general ideas of the system functionality are demonstrated on an example for some moments of a finite normal mixture.

  10. Systemic Losses Due to Counterparty Risk in a Stylized Banking System

    NASA Astrophysics Data System (ADS)

    Birch, Annika; Aste, Tomaso

    2014-09-01

    We report a study of a stylized banking cascade model investigating systemic risk caused by counterparty failure, using liabilities and assets to define banks' balance sheets. In our stylized system, banks can be in two states, normally operating or distressed, and the state of a bank changes from normally operating to distressed whenever its liabilities are larger than its assets. The banks are connected through an interbank lending network and, whenever a bank is distressed, its creditors cannot expect their loans to the distressed bank to be repaid, potentially becoming distressed themselves. We solve the problem analytically for a homogeneous system and test the robustness and generality of the results with simulations of more complex systems. We investigate the parameter space and the corresponding distribution of operating banks, mapping the conditions under which the whole system is stable or unstable. This allows us to determine how the financial stability of a banking system is influenced by regulatory decisions, such as leverage; we discuss the effect of central bank actions, such as quantitative easing, and we determine the cost of rescuing a distressed banking system using re-capitalisation. Finally, we estimate the stability of the UK and US banking systems by comparing the years 2007 and 2012 using real data.

  11. Assessing variation in life-history tactics within a population using mixture regression models: a practical guide for evolutionary ecologists.

    PubMed

    Hamel, Sandra; Yoccoz, Nigel G; Gaillard, Jean-Michel

    2017-05-01

    Mixed models are now well-established methods in ecology and evolution because they allow accounting for and quantifying within- and between-individual variation. However, the required normal distribution of the random effects can often be violated by the presence of clusters among subjects, which leads to multi-modal distributions. In such cases, using what is known as mixture regression models might offer a more appropriate approach. These models are widely used in psychology, sociology, and medicine to describe the diversity of trajectories occurring within a population over time (e.g. psychological development, growth). In ecology and evolution, however, these models are seldom used even though understanding changes in individual trajectories is an active area of research in life-history studies. Our aim is to demonstrate the value of using mixture models to describe variation in individual life-history tactics within a population, and hence to promote the use of these models by ecologists and evolutionary ecologists. We first ran a set of simulations to determine whether and when a mixture model allows teasing apart latent clustering, and to contrast the precision and accuracy of estimates obtained from mixture models versus mixed models under a wide range of ecological contexts. We then used empirical data from long-term studies of large mammals to illustrate the potential of using mixture models for assessing within-population variation in life-history tactics. Mixture models performed well in most cases, except for variables following a Bernoulli distribution and when sample size was small. The four selection criteria we evaluated [Akaike information criterion (AIC), Bayesian information criterion (BIC), and two bootstrap methods] performed similarly well, selecting the right number of clusters in most ecological situations. We then showed that the normality of random effects implicitly assumed by evolutionary ecologists when using mixed models was often violated in life-history data. Mixed models were quite robust to this violation in the sense that fixed effects were unbiased at the population level. However, fixed effects at the cluster level and random effects were better estimated using mixture models. Our empirical analyses demonstrated that using mixture models facilitates the identification of the diversity of growth and reproductive tactics occurring within a population. Therefore, using this modelling framework allows testing for the presence of clusters and, when clusters occur, provides reliable estimates of fixed and random effects for each cluster of the population. In the presence or expectation of clusters, using mixture models offers a suitable extension of mixed models, particularly when evolutionary ecologists aim at identifying how ecological and evolutionary processes change within a population. Mixture regression models therefore provide a valuable addition to the statistical toolbox of evolutionary ecologists. As these models are complex and have their own limitations, we provide recommendations to guide future users. © 2016 Cambridge Philosophical Society.
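
    As a simplified stand-in for the mixture regression setting discussed above, the sketch below shows the basic cluster-enumeration step: fitting normal mixtures with an increasing number of components and comparing AIC and BIC. The simulated data and the candidate range of cluster numbers are illustrative.

        # Sketch (simplified stand-in): choosing the number of latent clusters with
        # AIC/BIC for a normal mixture. The review concerns mixture regression models
        # for trajectories; the same selection logic applies, but a plain Gaussian
        # mixture is used here for brevity.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        y = np.concatenate([rng.normal(-2, 1, 150), rng.normal(3, 1, 100)]).reshape(-1, 1)

        for g in range(1, 5):
            fit = GaussianMixture(n_components=g, n_init=5, random_state=0).fit(y)
            print(f"g={g}  AIC={fit.aic(y):8.1f}  BIC={fit.bic(y):8.1f}")
        # The number of clusters with the lowest AIC/BIC is retained.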

  12. Reinterpreting Comorbidity: A Model-Based Approach to Understanding and Classifying Psychopathology

    PubMed Central

    Krueger, Robert F.; Markon, Kristian E.

    2008-01-01

    Comorbidity has presented a persistent puzzle for psychopathology research. We review recent literature indicating that the puzzle of comorbidity is being solved by research fitting explicit quantitative models to data on comorbidity. We present a meta-analysis of a liability spectrum model of comorbidity, in which specific mental disorders are understood as manifestations of latent liability factors that explain comorbidity by virtue of their impact on multiple disorders. Nosological, structural, etiological, and psychological aspects of this liability spectrum approach to understanding comorbidity are discussed. PMID:17716066

  13. Atmospheric Photochemical Modeling of Turbine Engine Fuels and Exhausts. Phase 2. Computer Model Development. Volume 2

    DTIC Science & Technology

    1988-05-01

    Emitted organics included in all models: CO (carbon monoxide), C:C (ethene), HCHO (formaldehyde), CCHO (acetaldehyde), RCHO (propionaldehyde and other...) of species in the mixture, and for proper use of this program, these files should be "normalized," i.e., the number of carbons in the mixture should...scenario in memory. Valid parmtypes are SCEN, PHYS, CHEM, VP, NSP, OUTP, SCHEDS. LIST ALLCOMP lists all available composition filenames. LIST ALLSCE...

  14. Differential models of twin correlations in skew for body-mass index (BMI).

    PubMed

    Tsang, Siny; Duncan, Glen E; Dinescu, Diana; Turkheimer, Eric

    2018-01-01

    Body Mass Index (BMI), like most human phenotypes, is substantially heritable. However, BMI is not normally distributed; the skew appears to be structural, and increases as a function of age. Moreover, twin correlations for BMI commonly violate the assumptions of the most common variety of the classical twin model, with the MZ twin correlation greater than twice the DZ correlation. This study aimed to decompose twin correlations for BMI using more general skew-t distributions. Same sex MZ and DZ twin pairs (N = 7,086) from the community-based Washington State Twin Registry were included. We used latent profile analysis (LPA) to decompose twin correlations for BMI into multiple mixture distributions. LPA was performed using the default normal mixture distribution and the skew-t mixture distribution. Similar analyses were performed for height as a comparison. Our analyses are then replicated in an independent dataset. A two-class solution under the skew-t mixture distribution fits the BMI distribution for both genders. The first class consists of a relatively normally distributed, highly heritable BMI with a mean in the normal range. The second class is a positively skewed BMI in the overweight and obese range, with lower twin correlations. In contrast, height is normally distributed, highly heritable, and is well-fit by a single latent class. Results in the replication dataset were highly similar. Our findings suggest that two distinct processes underlie the skew of the BMI distribution. The contrast between height and weight is in accord with subjective psychological experience: both are under obvious genetic influence, but BMI is also subject to behavioral control, whereas height is not.

  15. Strelka: accurate somatic small-variant calling from sequenced tumor-normal sample pairs.

    PubMed

    Saunders, Christopher T; Wong, Wendy S W; Swamy, Sajani; Becq, Jennifer; Murray, Lisa J; Cheetham, R Keira

    2012-07-15

    Whole genome and exome sequencing of matched tumor-normal sample pairs is becoming routine in cancer research. The consequent increased demand for somatic variant analysis of paired samples requires methods specialized to model this problem so as to sensitively call variants at any practical level of tumor impurity. We describe Strelka, a method for somatic SNV and small indel detection from sequencing data of matched tumor-normal samples. The method uses a novel Bayesian approach which represents continuous allele frequencies for both tumor and normal samples, while leveraging the expected genotype structure of the normal. This is achieved by representing the normal sample as a mixture of germline variation with noise, and representing the tumor sample as a mixture of the normal sample with somatic variation. A natural consequence of the model structure is that sensitivity can be maintained at high tumor impurity without requiring purity estimates. We demonstrate that the method has superior accuracy and sensitivity on impure samples compared with approaches based on either diploid genotype likelihoods or general allele-frequency tests. The Strelka workflow source code is available at ftp://strelka@ftp.illumina.com/. csaunders@illumina.com
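
    The toy sketch below illustrates (but does not reproduce) the core idea of representing the tumor sample as a mixture of the normal sample with somatic variation: a candidate somatic SNV is scored by marginalizing the somatic allele fraction over a grid, so no explicit tumor-purity estimate is required. The error rate, fraction grid, and read counts are illustrative assumptions, not Strelka's actual model.

        # Toy sketch (not the actual Strelka model): score a candidate somatic SNV by
        # modelling the tumor as a mixture of the normal sample with a somatic allele.
        import numpy as np
        from scipy.stats import binom

        def somatic_log_odds(alt_t, depth_t, alt_n, depth_n, err=0.001):
            # Null: both samples draw alt reads only at the sequencing error rate
            # (site is homozygous reference in normal and tumor).
            log_null = (binom.logpmf(alt_n, depth_n, err)
                        + binom.logpmf(alt_t, depth_t, err))
            # Alternative: normal is reference; tumor alt fraction f reflects a mixture
            # of reference and somatic reads; marginalise f over a uniform grid.
            f = np.linspace(0.01, 0.5, 50)
            lik_t = binom.pmf(alt_t, depth_t, f).mean()
            log_alt = binom.logpmf(alt_n, depth_n, err) + np.log(lik_t)
            return log_alt - log_null

        print(somatic_log_odds(alt_t=12, depth_t=60, alt_n=0, depth_n=40))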

  16. Mixture models for estimating the size of a closed population when capture rates vary among individuals

    USGS Publications Warehouse

    Dorazio, R.M.; Royle, J. Andrew

    2003-01-01

    We develop a parameterization of the beta-binomial mixture that provides sensible inferences about the size of a closed population when probabilities of capture or detection vary among individuals. Three classes of mixture models (beta-binomial, logistic-normal, and latent-class) are fitted to recaptures of snowshoe hares for estimating abundance and to counts of bird species for estimating species richness. In both sets of data, rates of detection appear to vary more among individuals (animals or species) than among sampling occasions or locations. The estimates of population size and species richness are sensitive to model-specific assumptions about the latent distribution of individual rates of detection. We demonstrate using simulation experiments that conventional diagnostics for assessing model adequacy, such as deviance, cannot be relied on for selecting classes of mixture models that produce valid inferences about population size. Prior knowledge about sources of individual heterogeneity in detection rates, if available, should be used to help select among classes of mixture models that are to be used for inference.
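
    The sketch below gives a minimal maximum-likelihood version of the beta-binomial mixture idea for a closed population: individual capture probabilities follow a Beta(a, b) distribution, observed capture frequencies follow the resulting beta-binomial, and the population size N is profiled over the likelihood. The capture-frequency data and the search range for N are illustrative, not the hare or bird data analysed in the paper.

        # Sketch: maximum-likelihood abundance estimation under a beta-binomial
        # mixture for heterogeneous individual capture probabilities.
        import numpy as np
        from scipy.special import gammaln, betaln
        from scipy.optimize import minimize

        T = 6                                   # trapping occasions
        n_k = np.array([30, 15, 8, 4, 2, 1])    # animals caught 1..T times (illustrative)
        D = n_k.sum()                           # distinct animals observed
        k = np.arange(0, T + 1)

        def log_bb(a, b):
            # log P(K = k) under Beta-Binomial(T, a, b), for k = 0..T
            return (gammaln(T + 1) - gammaln(k + 1) - gammaln(T - k + 1)
                    + betaln(a + k, b + T - k) - betaln(a, b))

        def neg_profile_ll(pars, N):
            a, b = np.exp(pars)                 # optimise on the log scale
            lp = log_bb(a, b)
            return -(gammaln(N + 1) - gammaln(N - D + 1)
                     + (N - D) * lp[0] + np.sum(n_k * lp[1:]))

        best = max(((N, -minimize(neg_profile_ll, [0.0, 0.0], args=(N,)).fun)
                    for N in range(D, 400)), key=lambda t: t[1])
        print("estimated population size:", best[0])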

  17. An a priori DNS study of the shadow-position mixing model

    DOE PAGES

    Zhao, Xin -Yu; Bhagatwala, Ankit; Chen, Jacqueline H.; ...

    2016-01-15

    The modeling of mixing by molecular diffusion is a central aspect of transported probability density function (tPDF) methods. In this paper, the newly proposed shadow position mixing model (SPMM) is examined using a DNS database for a temporally evolving di-methyl ether slot jet flame. Two methods that invoke different levels of approximation are proposed to extract the shadow displacement (equivalent to shadow position) from the DNS database. An approach for a priori analysis of the mixing-model performance is developed. The shadow displacement is highly correlated with both mixture fraction and velocity, and the peak correlation coefficient of the shadow displacement and mixture fraction is higher than that of the shadow displacement and velocity. This suggests that the composition-space localness is reasonably well enforced by the model, with appropriate choices of model constants. The conditional diffusion of mixture fraction and major species from DNS and from SPMM are then compared, using mixing rates derived by matching the mixture fraction scalar dissipation rates. Good qualitative agreement is found for the locations of zero and maximum/minimum conditional diffusion for mixture fraction and individual species. Similar comparisons are performed for DNS and the IECM (interaction by exchange with the conditional mean) model. The agreement between SPMM and DNS is better than that between IECM and DNS, in terms of conditional diffusion iso-contour similarities and global normalized residual levels. It is found that a suitable value for the model constant c that controls the mixing frequency can be derived using the local normalized scalar variance, and that the model constant a controls the localness of the model. A higher-Reynolds-number test case is anticipated to be more appropriate to evaluate the mixing models, and stand-alone transported PDF simulations are required to more fully enforce localness and to assess model performance.

  18. A nonlinear isobologram model with Box-Cox transformation to both sides for chemical mixtures.

    PubMed

    Chen, D G; Pounds, J G

    1998-12-01

    The linear logistical isobologram is a commonly used and powerful graphical and statistical tool for analyzing the combined effects of simple chemical mixtures. In this paper a nonlinear isobologram model is proposed to analyze the joint action of chemical mixtures for quantitative dose-response relationships. This nonlinear isobologram model incorporates two additional new parameters, Ymin and Ymax, to facilitate analysis of response data that are not constrained between 0 and 1, where parameters Ymin and Ymax represent the minimal and the maximal observed toxic response. This nonlinear isobologram model for binary mixtures can be expressed as [formula: see text] In addition, a Box-Cox transformation to both sides is introduced to improve the goodness of fit and to provide a more robust model for achieving homogeneity and normality of the residuals. Finally, a confidence band is proposed for selected isobols, e.g., the median effective dose, to facilitate graphical and statistical analysis of the isobologram. The versatility of this approach is demonstrated using published data describing the toxicity of the binary mixtures of citrinin and ochratoxin as well as a new experimental data from our laboratory for mixtures of mercury and cadmium.

  19. A nonlinear isobologram model with Box-Cox transformation to both sides for chemical mixtures.

    PubMed Central

    Chen, D G; Pounds, J G

    1998-01-01

    The linear logistical isobologram is a commonly used and powerful graphical and statistical tool for analyzing the combined effects of simple chemical mixtures. In this paper a nonlinear isobologram model is proposed to analyze the joint action of chemical mixtures for quantitative dose-response relationships. This nonlinear isobologram model incorporates two additional new parameters, Ymin and Ymax, to facilitate analysis of response data that are not constrained between 0 and 1, where parameters Ymin and Ymax represent the minimal and the maximal observed toxic response. This nonlinear isobologram model for binary mixtures can be expressed as [formula: see text] In addition, a Box-Cox transformation to both sides is introduced to improve the goodness of fit and to provide a more robust model for achieving homogeneity and normality of the residuals. Finally, a confidence band is proposed for selected isobols, e.g., the median effective dose, to facilitate graphical and statistical analysis of the isobologram. The versatility of this approach is demonstrated using published data describing the toxicity of the binary mixtures of citrinin and ochratoxin as well as a new experimental data from our laboratory for mixtures of mercury and cadmium. PMID:9860894

  20. 18 CFR 154.305 - Tax normalization.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... State (including franchise taxes). (4) Income tax component means that part of the cost-of-service that... deferred taxes becomes deficient in, or in excess of, amounts necessary to meet future tax liabilities. (2...

  2. The Use of Growth Mixture Modeling for Studying Resilience to Major Life Stressors in Adulthood and Old Age: Lessons for Class Size and Identification and Model Selection.

    PubMed

    Infurna, Frank J; Grimm, Kevin J

    2017-12-15

    Growth mixture modeling (GMM) combines latent growth curve and mixture modeling approaches and is typically used to identify discrete trajectories following major life stressors (MLS). However, GMM is often applied to data that do not meet the statistical assumptions of the model (e.g., within-class normality), and researchers often do not test additional model constraints (e.g., homogeneity of variance across classes), which can lead to incorrect conclusions regarding the number and nature of the trajectories. We evaluate how these methodological assumptions influence trajectory size and identification in the study of resilience to MLS. We use data on changes in subjective well-being and depressive symptoms following spousal loss from the HILDA and HRS. Findings differ drastically when constraining the variances to be homogeneous versus heterogeneous across trajectories, with overextraction being more common when the variances are constrained to be homogeneous. When the data are non-normally distributed, assuming normally distributed data increases the number of latent classes extracted. Our findings show that the assumptions typically underlying GMM are not tenable, influencing trajectory size and identification and, most importantly, misinforming conceptual models of resilience. The discussion focuses on how GMM can be leveraged to effectively examine trajectories of adaptation following MLS and on avenues for future research. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
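
    As a simplified cross-sectional stand-in for the growth mixture setting above, the sketch below shows how the homogeneity-of-variance constraint can change class enumeration: the same data are fitted with class variances constrained to be equal ("tied") and class-specific ("full"), and the BIC-selected number of classes is reported for each. The simulated data are illustrative.

        # Sketch (simplified stand-in): effect of constraining variances to be equal
        # across classes on the number of classes selected by BIC.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(7)
        # one narrow class and one wide class
        y = np.concatenate([rng.normal(0, 0.5, 200), rng.normal(1.5, 3.0, 200)]).reshape(-1, 1)

        for cov in ("tied", "full"):   # "tied": homogeneous variance; "full": class-specific
            bics = [GaussianMixture(g, covariance_type=cov, n_init=5,
                                    random_state=0).fit(y).bic(y)
                    for g in range(1, 6)]
            print(cov, "-> best number of classes =", int(np.argmin(bics)) + 1)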

  3. Models of comorbidity for multifactorial disorders.

    PubMed Central

    Neale, M C; Kendler, K S

    1995-01-01

    We develop several formal models for comorbidity between multifactorial disorders. Based on the work of D. N. Klein and L. P. Riso, the models include (i) alternate forms, where the two disorders have the same underlying continuum of liability; (ii) random multiformity, in which affection status on one disorder abruptly increases risk for the second; (iii) extreme multiformity, where only extreme cases have an abruptly increased risk for the second disorder; (iv) three independent disorders, in which excess comorbid cases are due to a separate, third disorder; (v) correlated liabilities, where the risk factors for the two disorders correlate; and (vi) direct causal models, where the liability for one disorder is a cause of the other disorder. These models are used to make quantitative predictions about the relative proportions of pairs of relatives who are classified according to whether each relative has neither disorder, disorder A but not B, disorder B but not A, or both A and B. For illustration, we analyze data on major depression (MD) and generalized anxiety disorder (GAD) assessed in adult female MZ and DZ twins, which enable estimation of the relative impact of genetic and environmental factors. Several models are rejected--that comorbid cases are due to chance; multiformity of GAD; a third independent disorder; and GAD being a cause of MD. Of the models that fit the data, correlated liabilities, MD causes GAD, and reciprocal causation seem best. MD appears to be a source of liability for GAD. Possible extensions to the models are discussed. PMID:7573055

  4. Infinite von Mises-Fisher Mixture Modeling of Whole Brain fMRI Data.

    PubMed

    Røge, Rasmus E; Madsen, Kristoffer H; Schmidt, Mikkel N; Mørup, Morten

    2017-10-01

    Cluster analysis of functional magnetic resonance imaging (fMRI) data is often performed using gaussian mixture models, but when the time series are standardized such that the data reside on a hypersphere, this modeling assumption is questionable. The consequences of ignoring the underlying spherical manifold are rarely analyzed, in part due to the computational challenges imposed by directional statistics. In this letter, we discuss a Bayesian von Mises-Fisher (vMF) mixture model for data on the unit hypersphere and present an efficient inference procedure based on collapsed Markov chain Monte Carlo sampling. Comparing the vMF and gaussian mixture models on synthetic data, we demonstrate that the vMF model has a slight advantage inferring the true underlying clustering when compared to gaussian-based models on data generated from both a mixture of vMFs and a mixture of gaussians subsequently normalized. Thus, when performing model selection, the two models are not in agreement. Analyzing multisubject whole brain resting-state fMRI data from healthy adult subjects, we find that the vMF mixture model is considerably more reliable than the gaussian mixture model when comparing solutions across models trained on different groups of subjects, and again we find that the two models disagree on the optimal number of components. The analysis indicates that the fMRI data support more than a thousand clusters, and we confirm this is not a result of overfitting by demonstrating better prediction on data from held-out subjects. Our results highlight the utility of using directional statistics to model standardized fMRI data and demonstrate that whole brain segmentation of fMRI data requires a very large number of functional units in order to adequately account for the discernible statistical patterns in the data.

  5. Exponential model normalization for electrical capacitance tomography with external electrodes under gap permittivity conditions

    NASA Astrophysics Data System (ADS)

    Baidillah, Marlin R.; Takei, Masahiro

    2017-06-01

    A nonlinear normalization model, called the exponential model, has been developed for electrical capacitance tomography (ECT) with external electrodes under gap permittivity conditions. The exponential model normalization is proposed based on the inherently nonlinear relationship between the mixture permittivity and the measured capacitance due to the gap permittivity of the inner wall. The parameters of the exponential equation are derived by fitting an exponential curve to simulation results, and a scaling function is added to adjust for the experimental system conditions. The exponential model normalization was applied to two-dimensional low- and high-contrast dielectric distribution phantoms in both simulation and experimental studies. The proposed normalization model has been compared with other normalization models, i.e. the Parallel, Series, Maxwell and Böttcher models. Based on the comparison of image reconstruction results, the exponential model reliably predicts the nonlinear normalization of measured capacitance for both low- and high-contrast dielectric distributions.

  6. A simple implementation of a normal mixture approach to differential gene expression in multiclass microarrays.

    PubMed

    McLachlan, G J; Bean, R W; Jones, L Ben-Tovim

    2006-07-01

    An important problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. We provide a straightforward and easily implemented method for estimating the posterior probability that an individual gene is null. The problem can be expressed in a two-component mixture framework, using an empirical Bayes approach. Current methods of implementing this approach either have some limitations due to the minimal assumptions made or with more specific assumptions are computationally intensive. By converting to a z-score the value of the test statistic used to test the significance of each gene, we propose a simple two-component normal mixture that models adequately the distribution of this score. The usefulness of our approach is demonstrated on three real datasets.
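
    A minimal sketch of the two-component normal mixture on gene-wise z-scores, fitted by EM, with the per-gene posterior probability of the null component as output; the simulated scores and starting values are illustrative assumptions.

        # Sketch: empirical-Bayes two-component normal mixture on gene-wise z-scores.
        # Component 0 approximates the null genes; the posterior probability of the
        # null component is computed per gene.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        z = np.concatenate([rng.normal(0, 1, 900), rng.normal(3, 1, 100)])  # simulated scores

        # EM for pi0*N(mu0, s0^2) + (1 - pi0)*N(mu1, s1^2)
        pi0, mu0, s0, mu1, s1 = 0.9, 0.0, 1.0, 2.0, 1.0
        for _ in range(200):
            p0 = pi0 * norm.pdf(z, mu0, s0)
            p1 = (1 - pi0) * norm.pdf(z, mu1, s1)
            tau0 = p0 / (p0 + p1)                     # posterior probability of being null
            pi0 = tau0.mean()
            mu0 = np.average(z, weights=tau0)
            s0 = np.sqrt(np.average((z - mu0) ** 2, weights=tau0))
            mu1 = np.average(z, weights=1 - tau0)
            s1 = np.sqrt(np.average((z - mu1) ** 2, weights=1 - tau0))

        print(f"estimated null proportion: {pi0:.3f}")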

  7. Mixture EMOS model for calibrating ensemble forecasts of wind speed.

    PubMed

    Baran, S; Lerch, S

    2016-03-01

    Ensemble model output statistics (EMOS) is a statistical tool for post-processing forecast ensembles of weather variables obtained from multiple runs of numerical weather prediction models in order to produce calibrated predictive probability density functions. The EMOS predictive probability density function is given by a parametric distribution with parameters depending on the ensemble forecasts. We propose an EMOS model for calibrating wind speed forecasts based on weighted mixtures of truncated normal (TN) and log-normal (LN) distributions, where model parameters and component weights are estimated by optimizing the values of proper scoring rules over a rolling training period. The new model is tested on wind speed forecasts of the 50-member European Centre for Medium-range Weather Forecasts ensemble, the 11-member Aire Limitée Adaptation dynamique Développement International-Hungary Ensemble Prediction System ensemble of the Hungarian Meteorological Service, and the eight-member University of Washington mesoscale ensemble, and its predictive performance is compared with that of various benchmark EMOS models based on single parametric families and combinations thereof. The results indicate improved calibration of probabilistic forecasts and improved accuracy of point forecasts in comparison with the raw ensemble and climatological forecasts. The mixture EMOS model significantly outperforms the TN and LN EMOS methods; moreover, it provides better calibrated forecasts than the TN-LN combination model and offers increased flexibility while avoiding covariate selection problems. © 2016 The Authors. Environmetrics Published by John Wiley & Sons Ltd.
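
    The sketch below evaluates a weighted truncated-normal/log-normal mixture density of the kind used in the mixture EMOS model; the weight and distribution parameters are illustrative constants, whereas in EMOS they would be linked to the ensemble forecasts and estimated by optimizing a proper scoring rule over the training period.

        # Sketch: weighted mixture of a zero-truncated normal (TN) and a log-normal (LN)
        # predictive density for wind speed (parameters illustrative only).
        import numpy as np
        from scipy.stats import truncnorm, lognorm

        def tn_ln_mixture_pdf(y, w, mu, sigma, ln_shape, ln_scale):
            a = (0.0 - mu) / sigma                     # truncate the normal at zero
            tn = truncnorm.pdf(y, a, np.inf, loc=mu, scale=sigma)
            ln = lognorm.pdf(y, s=ln_shape, scale=ln_scale)
            return w * tn + (1.0 - w) * ln

        y = np.linspace(0.1, 25.0, 200)
        pdf = tn_ln_mixture_pdf(y, w=0.6, mu=6.0, sigma=2.5, ln_shape=0.5, ln_scale=6.0)
        print(np.trapz(pdf, y))                        # should be close to 1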

  8. NTP technical report on the toxicity studies of Pesticide/Fertilizer Mixtures Administered in Drinking Water to F344/N Rats and B6C3F1 Mice.

    PubMed

    Yang, R.

    1993-08-01

    Toxicity studies were performed with pesticide and fertilizer mixtures representative of groundwater contamination found in California and Iowa. The California mixture was composed of aldicarb, atrazine, 1,2-dibromo-3-chloropropane, 1,2- dichloropropane, ethylene dibromide, simazine, and ammonium nitrate. The Iowa mixture contained alachlor, atrazine, cyanazine, metolachlor, metribuzin, and ammonium nitrate. The mixtures were administered in drinking water (with 512 ppm propylene glycol) to F344/N rats and B6C3F1 mice of each sex at concentrations ranging from 0.1x to 100x, where 1x represented the median concentrations of the individual chemicals found in studies of groundwater contamination from normal agricultural activities. This report focuses primarily on 26-week toxicity studies describing histopathology, clinical pathology, neurobehavior/neuropathology, and reproductive system effects. The genetic toxicity of the mixtures was assessed by determining the frequency of micronuclei in peripheral blood of mice and evaluating micronuclei and sister chromatid exchanges in splenocytes from female mice and male rats. Additional studies with these mixtures that are briefly reviewed in this report include teratology studies with Sprague-Dawley rats and continuous breeding studies with CD-1 Swiss mice. In 26-week drinking water studies of the California and the Iowa mixtures, all rats (10 per sex and group) survived to the end of the studies, and there were no significant effects on body weight gains. Water consumption was not affected by the pesticide/fertilizer contaminants, and there were no clinical signs of toxicity or neurobehavioral effects as measured by a functional observational battery, motor activity evaluations, thermal sensitivity evaluations, and startle response. There were no clear adverse effects noted in clinical pathology (including serum cholinesterase activity), organ weight, reproductive system, or histopathologic evaluations, although absolute and relative liver weights were marginally increased with increasing exposure concentration in both male and female rats consuming the Iowa mixture. In 26-week drinking water studies in mice, one male receiving the California mixture at 100x died during the study, and one control female and one female in the 100x group in the Iowa mixture study also died early. It could not be determined if the death of either of the mice in the 100x groups was related to consumption of the pesticide/fertilizer mixtures. Water consumption and body weight gains were not affected in these studies, and no signs of toxicity were noted in clinical observations or in neurobehavioral assessments. No clear adverse effects were noted in clinical pathology, reproductive system, organ weight, or histopathologic evaluations of exposed mice. The pesticide/fertilizer mixtures, when tested over a concentration range similar to that used in the 26-week studies, were found to have no effects in teratology studies or in a continuous breeding assay examining reproductive and developmental toxicity. The California and Iowa pesticide mixtures were tested for induction of micronuclei in peripheral blood erythrocytes of female mice. Results of tests with the California mixture were negative. Significant increases in micronucleated normochromatic erythrocytes were seen at the two-highest concentrations (10x and 100x) of the Iowa mixture, but the increases were within the normal range of micronuclei in historical control animals. 
Splenocytes of male rats and female mice exposed to these mixtures were examined for micronucleus and sister chromatid exchange frequencies. Sister chromatid exchange frequencies were marginally increased in rats and mice receiving the California mixture, but neither species exhibited increased frequencies of micronucleated splenocytes. None of these changes were considered to have biological importance. In summary, studies of the potential toxicity associated with the consumption of mixtures of pesticides and a fertilizer representative of groundwater contamination in agricultural areas of Iowa and California failed to demonstrate any significant adverse effects in rats or mice receiving the mixtures in drinking water at concentrations as high as 100 times the median concentrations of the individual chemicals determined by groundwater surveys. NOTE: These studies were supported in part by funds from the Comprehensive Environmental Response, Compensation, and Liability Act trust fund (Superfund) by an interagency agreement with the Agency for Toxic Substances and Disease Registry, U.S. Public Health Service.

  9. Detection of mastitis in dairy cattle by use of mixture models for repeated somatic cell scores: a Bayesian approach via Gibbs sampling.

    PubMed

    Odegård, J; Jensen, J; Madsen, P; Gianola, D; Klemetsdal, G; Heringstad, B

    2003-11-01

    The distribution of somatic cell scores could be regarded as a mixture of at least two components depending on a cow's udder health status. A heteroscedastic two-component Bayesian normal mixture model with random effects was developed and implemented via Gibbs sampling. The model was evaluated using datasets consisting of simulated somatic cell score records. Somatic cell score was simulated as a mixture representing two alternative udder health statuses ("healthy" or "diseased"). Animals were assigned randomly to the two components according to the probability of group membership (Pm). Random effects (additive genetic and permanent environment), when included, had identical distributions across mixture components. Posterior probabilities of putative mastitis were estimated for all observations, and model adequacy was evaluated using measures of sensitivity, specificity, and posterior probability of misclassification. Fitting different residual variances in the two mixture components caused some bias in estimation of parameters. When the components were difficult to disentangle, so were their residual variances, causing bias in estimation of Pm and of location parameters of the two underlying distributions. When all variance components were identical across mixture components, the mixture model analyses returned parameter estimates essentially without bias and with a high degree of precision. Including random effects in the model increased the probability of correct classification substantially. No sizable differences in probability of correct classification were found between models in which a single cow effect (ignoring relationships) was fitted and models where this effect was split into genetic and permanent environmental components, utilizing relationship information. When genetic and permanent environmental effects were fitted, the between-replicate variance of estimates of posterior means was smaller because the model accounted for random genetic drift.
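
    A minimal Gibbs sampler for a heteroscedastic two-component normal mixture is sketched below, omitting the genetic and permanent-environment random effects of the full model; the priors, hyperparameters, and simulated somatic cell scores are illustrative assumptions.

        # Sketch: Gibbs sampler for a two-component normal mixture with component-
        # specific means and variances (random effects of the full model omitted).
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(11)
        y = np.concatenate([rng.normal(3.0, 1.0, 300),      # "healthy" scores
                            rng.normal(6.0, 1.5, 100)])     # "diseased" scores

        m0, v0, a0, b0 = y.mean(), 10.0, 2.0, 2.0            # vague priors (illustrative)
        mu = np.array([y.mean() - 1.0, y.mean() + 1.0])
        sig2 = np.array([y.var(), y.var()])
        pm = 0.5                                             # probability of "diseased" group
        keep = []

        for it in range(2000):
            # 1) allocate records to components
            w1 = pm * norm.pdf(y, mu[1], np.sqrt(sig2[1]))
            w0 = (1 - pm) * norm.pdf(y, mu[0], np.sqrt(sig2[0]))
            z = rng.random(y.size) < w1 / (w0 + w1)          # True -> "diseased"
            # 2) mixing proportion
            pm = rng.beta(1 + z.sum(), 1 + (~z).sum())
            # 3) component means and variances (conjugate updates)
            for k, idx in enumerate([~z, z]):
                n_k = idx.sum()
                prec = 1.0 / v0 + n_k / sig2[k]
                mean = (m0 / v0 + y[idx].sum() / sig2[k]) / prec
                mu[k] = rng.normal(mean, np.sqrt(1.0 / prec))
                b_post = b0 + 0.5 * np.sum((y[idx] - mu[k]) ** 2)
                sig2[k] = 1.0 / rng.gamma(a0 + n_k / 2.0, 1.0 / b_post)
            if it >= 500:                                    # discard burn-in
                keep.append([pm, mu[0], mu[1]])

        print(np.mean(keep, axis=0))   # posterior means of Pm and the two component means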

  10. Quantiles for Finite Mixtures of Normal Distributions

    ERIC Educational Resources Information Center

    Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.

    2006-01-01

    Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
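
    A short sketch of the computation: the quantile of a finite normal mixture is obtained by numerically inverting its CDF, and is contrasted with the quantile of the corresponding linear combination of independent normal random variables (which is itself normal). Weights, means, and standard deviations are illustrative.

        # Sketch: quantile of a finite normal mixture versus quantile of a linear
        # combination of independent normal random variables.
        import numpy as np
        from scipy.stats import norm
        from scipy.optimize import brentq

        w = np.array([0.3, 0.7])
        mu = np.array([0.0, 5.0])
        sd = np.array([1.0, 2.0])

        def mixture_quantile(p):
            cdf_minus_p = lambda x: np.sum(w * norm.cdf(x, mu, sd)) - p
            return brentq(cdf_minus_p, -50, 50)

        print(mixture_quantile(0.95))
        print(norm.ppf(0.95, loc=np.sum(w * mu), scale=np.sqrt(np.sum(w**2 * sd**2))))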

  11. Two-component mixture model: Application to palm oil and exchange rate

    NASA Astrophysics Data System (ADS)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir; Hamzah, Firdaus Mohamad

    2014-12-01

    Palm oil is a seed crop widely used for food and non-food products such as cookies, vegetable oil, cosmetics and household products. Palm oil is grown mainly in Malaysia and Indonesia. However, demand for palm oil has grown rapidly over the years, and this phenomenon has caused illegal logging of trees and destruction of natural habitat. Hence, the present paper investigates the relationship between the exchange rate and the palm oil price in Malaysia by using maximum likelihood estimation via the Newton-Raphson algorithm to fit a two-component mixture model. In addition, this paper proposes a mixture of normal distributions to accommodate the asymmetric and platykurtic characteristics of the time series data.

  12. 38 CFR 36.4225 - Authority to close manufactured home loans on the automatic basis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... accounting or business cycle. Current liabilities are defined as obligations that would be paid within a year on a normal accounting or business cycle. The lender's latest financial statements (profit and loss...

  13. 26 CFR 1.901-2 - Income, war profits, or excess profits tax paid or accrued.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... section, the foreign tax is likely to reach net gain in the normal circumstances in which it applies, (ii... income tax liability to another country. (b) Net gain—(1) In general. A foreign tax is likely to reach net gain in the normal circumstances in which it applies if and only if the tax, judged on the basis...

  14. 26 CFR 1.901-2 - Income, war profits, or excess profits tax paid or accrued.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... section, the foreign tax is likely to reach net gain in the normal circumstances in which it applies, (ii... income tax liability to another country. (b) Net gain—(1) In general. A foreign tax is likely to reach net gain in the normal circumstances in which it applies if and only if the tax, judged on the basis...

  15. Latent Partially Ordered Classification Models and Normal Mixtures

    ERIC Educational Resources Information Center

    Tatsuoka, Curtis; Varadi, Ferenc; Jaeger, Judith

    2013-01-01

    Latent partially ordered sets (posets) can be employed in modeling cognitive functioning, such as in the analysis of neuropsychological (NP) and educational test data. Posets are cognitively diagnostic in the sense that classification states in these models are associated with detailed profiles of cognitive functioning. These profiles allow for…

  16. Finding Groups Using Model-Based Cluster Analysis: Heterogeneous Emotional Self-Regulatory Processes and Heavy Alcohol Use Risk

    ERIC Educational Resources Information Center

    Mun, Eun Young; von Eye, Alexander; Bates, Marsha E.; Vaschillo, Evgeny G.

    2008-01-01

    Model-based cluster analysis is a new clustering procedure to investigate population heterogeneity utilizing finite mixture multivariate normal densities. It is an inferentially based, statistically principled procedure that allows comparison of nonnested models using the Bayesian information criterion to compare multiple models and identify the…

  17. Abuse liability assessment of an e-cigarette refill liquid using intracranial self-stimulation and self-administration models in rats

    PubMed Central

    LeSage, MG; Staley, M; Muelken, P; Smethells, JR; Stepanov, I; Vogel, RI; Pentel, PR; Harris, AC

    2016-01-01

    Background The popularity of electronic cigarettes (ECs) has increased dramatically despite their unknown health consequences. Because the abuse liability of ECs is one of the leading concerns of the Food and Drug Administration (FDA), models to assess it are urgently needed to inform FDA regulatory decisions regarding these products. The purpose of this study was to assess the relative abuse liability of an EC liquid compared to nicotine alone in rats. Because this EC liquid contains non-nicotine constituents that may enhance its abuse liability, we hypothesized that it would have greater abuse liability than nicotine alone. Methods Nicotine alone and nicotine dose-equivalent concentrations of EC liquid were compared in terms of their acute effects on intracranial self-stimulation (ICSS) thresholds, acquisition of self-administration, reinforcing efficacy (i.e., elasticity of demand), blockade of these behavioral effects by mecamylamine, nicotine pharmacokinetics and nicotinic acetylcholine receptor binding and activation. Results There were no significant differences between formulations on any measure, except that EC liquid produced less of an elevation in ICSS thresholds at high nicotine doses. Conclusions Collectively, these findings suggest that the relative abuse liability of this EC liquid is similar to that of nicotine alone in terms of its reinforcing and reinforcement-enhancing effects, but that it may have less aversive/anhedonic effects at high doses. The present methods may be useful for assessing the abuse liability of other ECs to inform potential FDA regulation of those products. PMID:27627814

  18. Disclosure, Apology, and Offer Programs: Stakeholders' Views of Barriers to and Strategies for Broad Implementation

    PubMed Central

    Bell, Sigall K; Smulowitz, Peter B; Woodward, Alan C; Mello, Michelle M; Duva, Anjali Mitter; Boothman, Richard C; Sands, Kenneth

    2012-01-01

    Context The Disclosure, Apology, and Offer (DA&O) model, a response to patient injuries caused by medical care, is an innovative approach receiving national attention for its early success as an alternative to the existing inherently adversarial, inefficient, and inequitable medical liability system. Examples of DA&O programs, however, are few. Methods Through key informant interviews, we investigated the potential for more widespread implementation of this model by provider organizations and liability insurers, defining barriers to implementation and strategies for overcoming them. Our study focused on Massachusetts, but we also explored themes that are broadly generalizable to other states. Findings We found strong support for the DA&O model among key stakeholders, who cited its benefits for both the liability system and patient safety. The respondents did not perceive any insurmountable barriers to broad implementation, and they identified strategies that could be pursued relatively quickly. Such solutions would permit a range of organizations to implement the model without legislative hurdles. Conclusions Although more data are needed about the outcomes of DA&O programs, the model holds considerable promise for transforming the current approach to medical liability and patient safety. PMID:23216427

  19. Disclosure, apology, and offer programs: stakeholders' views of barriers to and strategies for broad implementation.

    PubMed

    Bell, Sigall K; Smulowitz, Peter B; Woodward, Alan C; Mello, Michelle M; Duva, Anjali Mitter; Boothman, Richard C; Sands, Kenneth

    2012-12-01

    The Disclosure, Apology, and Offer (DA&O) model, a response to patient injuries caused by medical care, is an innovative approach receiving national attention for its early success as an alternative to the existing inherently adversarial, inefficient, and inequitable medical liability system. Examples of DA&O programs, however, are few. Through key informant interviews, we investigated the potential for more widespread implementation of this model by provider organizations and liability insurers, defining barriers to implementation and strategies for overcoming them. Our study focused on Massachusetts, but we also explored themes that are broadly generalizable to other states. We found strong support for the DA&O model among key stakeholders, who cited its benefits for both the liability system and patient safety. The respondents did not perceive any insurmountable barriers to broad implementation, and they identified strategies that could be pursued relatively quickly. Such solutions would permit a range of organizations to implement the model without legislative hurdles. Although more data are needed about the outcomes of DA&O programs, the model holds considerable promise for transforming the current approach to medical liability and patient safety. © 2012 Milbank Memorial Fund.

  20. 12 CFR 614.4900 - Foreign exchange.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... with relationship to the customer's financial capability to bear the financial risks assumed. The bank..., where such transactions or positions normally reduce risks in the conduct and management of..., liabilities, and foreign exchange contracts. (3) Outstanding contracts with individual customers and banks. (4...

  1. 12 CFR 614.4900 - Foreign exchange.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... with relationship to the customer's financial capability to bear the financial risks assumed. The bank..., where such transactions or positions normally reduce risks in the conduct and management of..., liabilities, and foreign exchange contracts. (3) Outstanding contracts with individual customers and banks. (4...

  2. 12 CFR 614.4900 - Foreign exchange.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... with relationship to the customer's financial capability to bear the financial risks assumed. The bank..., where such transactions or positions normally reduce risks in the conduct and management of..., liabilities, and foreign exchange contracts. (3) Outstanding contracts with individual customers and banks. (4...

  3. Mixture model normalization for non-targeted gas chromatography/mass spectrometry metabolomics data.

    PubMed

    Reisetter, Anna C; Muehlbauer, Michael J; Bain, James R; Nodzenski, Michael; Stevens, Robert D; Ilkayeva, Olga; Metzger, Boyd E; Newgard, Christopher B; Lowe, William L; Scholtens, Denise M

    2017-02-02

    Metabolomics offers a unique integrative perspective for health research, reflecting genetic and environmental contributions to disease-related phenotypes. Identifying robust associations in population-based or large-scale clinical studies demands large numbers of subjects and therefore sample batching for gas-chromatography/mass spectrometry (GC/MS) non-targeted assays. When run over weeks or months, technical noise due to batch and run-order threatens data interpretability. Application of existing normalization methods to metabolomics is challenged by unsatisfied modeling assumptions and, notably, failure to address batch-specific truncation of low abundance compounds. To curtail technical noise and make GC/MS metabolomics data amenable to analyses describing biologically relevant variability, we propose mixture model normalization (mixnorm) that accommodates truncated data and estimates per-metabolite batch and run-order effects using quality control samples. Mixnorm outperforms other approaches across many metrics, including improved correlation of non-targeted and targeted measurements and superior performance when metabolite detectability varies according to batch. For some metrics, particularly when truncation is less frequent for a metabolite, mean centering and median scaling demonstrate comparable performance to mixnorm. When quality control samples are systematically included in batches, mixnorm is uniquely suited to normalizing non-targeted GC/MS metabolomics data due to explicit accommodation of batch effects, run order and varying thresholds of detectability. Especially in large-scale studies, normalization is crucial for drawing accurate conclusions from non-targeted GC/MS metabolomics data.

  4. Estimation of Microbial Contamination of Food from Prevalence and Concentration Data: Application to Listeria monocytogenes in Fresh Vegetables▿

    PubMed Central

    Crépet, Amélie; Albert, Isabelle; Dervin, Catherine; Carlin, Frédéric

    2007-01-01

    A normal distribution and a mixture model of two normal distributions in a Bayesian approach using prevalence and concentration data were used to establish the distribution of contamination of the food-borne pathogenic bacteria Listeria monocytogenes in unprocessed and minimally processed fresh vegetables. A total of 165 prevalence studies, including 15 studies with concentration data, were taken from the scientific literature and from technical reports and used for statistical analysis. The predicted mean of the normal distribution of the logarithms of viable L. monocytogenes per gram of fresh vegetables was −2.63 log viable L. monocytogenes organisms/g, and its standard deviation was 1.48 log viable L. monocytogenes organisms/g. These values were determined by considering one contaminated sample in prevalence studies in which samples are in fact negative. This deliberate overestimation is necessary to complete calculations. With the mixture model, the predicted mean of the distribution of the logarithm of viable L. monocytogenes per gram of fresh vegetables was −3.38 log viable L. monocytogenes organisms/g and its standard deviation was 1.46 log viable L. monocytogenes organisms/g. The probabilities of fresh unprocessed and minimally processed vegetables being contaminated with concentrations higher than 1, 2, and 3 log viable L. monocytogenes organisms/g were 1.44, 0.63, and 0.17%, respectively. Introducing a sensitivity rate of 80 or 95% in the mixture model had a small effect on the estimation of the contamination. In contrast, introducing a low sensitivity rate (40%) resulted in marked differences, especially for high percentiles. There was a significantly lower estimation of contamination in the papers and reports of 2000 to 2005 than in those of 1988 to 1999 and a lower estimation of contamination of leafy salads than that of sprouts and other vegetables. The interest of the mixture model for the estimation of microbial contamination is discussed. PMID:17098926
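
    The sketch below shows how exceedance probabilities of the kind reported above follow from a two-component normal mixture on log10 counts; the weights, means, and standard deviations used here are illustrative placeholders, not the estimates obtained in the study.

        # Sketch: exceedance probabilities from a two-component normal mixture fitted
        # to log10 counts (parameters below are hypothetical).
        from scipy.stats import norm

        w, mu, sd = [0.8, 0.2], [-4.0, -1.5], [1.2, 1.0]

        def prob_exceed(threshold_log10):
            return sum(wk * norm.sf(threshold_log10, mk, sk)
                       for wk, mk, sk in zip(w, mu, sd))

        for t in (1, 2, 3):
            print(f"P(> {t} log10 CFU/g) = {prob_exceed(t):.4f}")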

  5. 12 CFR 27.3 - Recordkeeping requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... average or normal monthly income. Include alimony, separate maintenance and child support income... any alimony, child support or separate maintenance payments. Exclude any payments on liabilities which...) Date of application. The date on which a signed application is received by the bank. (xix) Sex of...

  6. Realized Volatility Analysis in A Spin Model of Financial Markets

    NASA Astrophysics Data System (ADS)

    Takaishi, Tetsuya

    We calculate the realized volatility of returns in the spin model of financial markets and examine the returns standardized by the realized volatility. We find that the moments of the standardized returns agree with the theoretical values for standard normal variables. This is the first evidence that the return distributions of the spin financial markets are consistent with a finite-variance mixture of normal distributions, which is also observed empirically in real financial markets.

  7. The Manhattan Frame Model-Manhattan World Inference in the Space of Surface Normals.

    PubMed

    Straub, Julian; Freifeld, Oren; Rosman, Guy; Leonard, John J; Fisher, John W

    2018-01-01

    Objects and structures within man-made environments typically exhibit a high degree of organization in the form of orthogonal and parallel planes. Traditional approaches utilize these regularities via the restrictive, and rather local, Manhattan World (MW) assumption which posits that every plane is perpendicular to one of the axes of a single coordinate system. The aforementioned regularities are especially evident in the surface normal distribution of a scene where they manifest as orthogonally-coupled clusters. This motivates the introduction of the Manhattan-Frame (MF) model which captures the notion of an MW in the surface normals space, the unit sphere, and two probabilistic MF models over this space. First, for a single MF we propose novel real-time MAP inference algorithms, evaluate their performance and their use in drift-free rotation estimation. Second, to capture the complexity of real-world scenes at a global scale, we extend the MF model to a probabilistic mixture of Manhattan Frames (MMF). For MMF inference we propose a simple MAP inference algorithm and an adaptive Markov-Chain Monte-Carlo sampling algorithm with Metropolis-Hastings split/merge moves that let us infer the unknown number of mixture components. We demonstrate the versatility of the MMF model and inference algorithm across several scales of man-made environments.

  8. Liability of physicians supervising nonphysician clinicians.

    PubMed

    Paterick, Barbara B; Waterhouse, Blake E; Paterick, Timothy E; Sanbar, Sandy S

    2014-01-01

    Physicians confront a variety of liability issues when supervising nonphysician clinicians (NPC) including: (1) direct liability resulting from a failure to meet the state-defined standards of supervision/collaboration with NPCs; (2) vicarious liability, arising from agency law, where physicians are held accountable for NPC clinical care that does not meet the national standard of care; and (3) responsibility for medical errors when the NPC and physician are co-employees of the corporate enterprise. Physician-NPC co-employee relationships are highlighted because they are new and becoming predominant in existing healthcare models. Because of their novelty, there is a paucity of judicial decisions determining liability for NPC errors in this setting. Knowledge of the existence of these risks will allow physicians to make informed decisions on what relationships they will enter with NPCs and how these relationships will be structured and monitored.

  9. Adaptive Evolution of the GDH2 Allosteric Domain Promotes Gliomagenesis by Resolving IDH1R132H-Induced Metabolic Liabilities.

    PubMed

    Waitkus, Matthew S; Pirozzi, Christopher J; Moure, Casey J; Diplas, Bill H; Hansen, Landon J; Carpenter, Austin B; Yang, Rui; Wang, Zhaohui; Ingram, Brian O; Karoly, Edward D; Mohney, Robert P; Spasojevic, Ivan; McLendon, Roger E; Friedman, Henry S; He, Yiping; Bigner, Darell D; Yan, Hai

    2018-01-01

    Hotspot mutations in the isocitrate dehydrogenase 1 (IDH1) gene occur in a number of human cancers and confer a neomorphic enzyme activity that catalyzes the conversion of α-ketoglutarate (αKG) to the oncometabolite D-(2)-hydroxyglutarate (D2HG). In malignant gliomas, IDH1R132H expression induces widespread metabolic reprogramming, possibly requiring compensatory mechanisms to sustain the normal biosynthetic requirements of actively proliferating tumor cells. We used genetically engineered mouse models of glioma and quantitative metabolomics to investigate IDH1R132H-dependent metabolic reprogramming and its potential to induce biosynthetic liabilities that can be exploited for glioma therapy. In gliomagenic neural progenitor cells, IDH1R132H expression increased the abundance of dipeptide metabolites, depleted key tricarboxylic acid cycle metabolites, and slowed progression of murine gliomas. Notably, expression of glutamate dehydrogenase GDH2, a hominoid-specific enzyme with expression relatively restricted to the brain, was critically involved in compensating for IDH1R132H-induced metabolic alterations and promoting IDH1R132H glioma growth. Indeed, we found that recently evolved amino acid substitutions in the GDH2 allosteric domain conferred its nonredundant, glioma-promoting properties in the presence of IDH1 mutation. Our results indicate that among the unique roles for GDH2 in the human forebrain is its ability to limit IDH1R132H-mediated metabolic liabilities, thus promoting glioma growth in this context. Results from this study raise the possibility that GDH2-specific inhibition may be a viable therapeutic strategy for gliomas with IDH mutations. Significance: These findings show that the hominoid-specific brain enzyme GDH2 may be essential to mitigate metabolic liabilities created by IDH1 mutations in glioma, with possible implications for leveraging its therapeutic management by IDH1 inhibitors. Cancer Res; 78(1); 36-50. ©2017 AACR. ©2017 American Association for Cancer Research.

  10. Malpractice liability, technology choice and negative defensive medicine.

    PubMed

    Feess, Eberhard

    2012-04-01

    We extend the theoretical literature on the impact of malpractice liability by allowing for two treatment technologies, a safe and a risky one. The safe technology bears no failure risk, but leads to patient-specific disutility since it cannot completely solve the health problems. By contrast, the risky technology (for instance a surgery) may entirely cure patients, but fail with some probability depending on the hospital's care level. Tight malpractice liability increases care levels if the risky technology is chosen at all, but also leads to excessively high incentives for avoiding the liability exposure by adopting the safe technology. We refer to this distortion toward the safe technology as negative defensive medicine. Taking the problem of negative defensive medicine seriously, the second best optimal liability needs to balance between the over-incentive for the safe technology in case of tough liability and the incentive to adopt little care for the risky technology in case of weak liability. In a model with errors in court, we find that gross negligence where hospitals are held liable only for very low care levels outperforms standard negligence, even though standard negligence would implement the first best efficient care level.

  11. Approximation of a radial diffusion model with a multiple-rate model for hetero-disperse particle mixtures

    PubMed Central

    Ju, Daeyoung; Young, Thomas M.; Ginn, Timothy R.

    2012-01-01

    An innovative method is proposed for approximation of the set of radial diffusion equations governing mass exchange between aqueous bulk phase and intra-particle phase for a hetero-disperse mixture of particles such as occur in suspension in surface water, in riverine/estuarine sediment beds, in soils and in aquifer materials. For this purpose the temporal variation of concentration at several uniformly distributed points within a normalized representative particle with spherical, cylindrical or planar shape is fitted with a 2-domain linear reversible mass exchange model. The approximation method is then superposed in order to generalize the model to a hetero-disperse mixture of particles. The method can reduce the computational effort needed in solving the intra-particle mass exchange of a hetero-disperse mixture of particles significantly and also the error due to the approximation is shown to be relatively small. The method is applied to describe desorption batch experiment of 1,2-Dichlorobenzene from four different soils with known particle size distributions and it could produce good agreement with experimental data. PMID:18304692

  12. Substitutability of nicotine alone and an electronic cigarette liquid using a concurrent choice assay in rats: A behavioral economic analysis.

    PubMed

    Smethells, John R; Harris, Andrew C; Burroughs, Danielle; Hursh, Steven R; LeSage, Mark G

    2018-04-01

    For the Food and Drug Administration to effectively regulate tobacco products, the contribution of non-nicotine tobacco constituents to the abuse liability of tobacco must be well understood. Our previous work compared the abuse liability of electronic cigarette refill liquids (EC liquids) and nicotine (Nic) alone when each was available in isolation and found no difference in abuse liability (i.e., demand elasticity). Another, and potentially more sensitive, measure would be to examine abuse liability in a choice context, which also provides a better model of the tobacco marketplace. Demand elasticities for Nic alone and an EC liquid were measured when only one formulation was available (alone-price demand) and when both formulations were concurrently available (own-price demand), allowing an assessment of the degree to which each formulation served as a substitute (cross-price demand) when available at a low fixed price. Own-price demand for both formulations was more elastic than alone-price demand, indicating that availability of a substitute increased demand elasticity. During concurrent access, consumption of the fixed-price formulation increased as the unit price of the other formulation increased. The rate of increase was similar between formulations, indicating that they served as symmetrical substitutes. The cross-price model reliably quantified the substitutability of both nicotine formulations and indicated that the direct CNS effects of non-nicotine constituents in EC liquid did not alter its abuse liability compared to Nic. These data highlight the sensitivity of this model and its potential utility for examining the relative abuse liability and substitutability of tobacco products. Copyright © 2018 Elsevier B.V. All rights reserved.
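
    As a hedged sketch of the demand-curve analysis this kind of study relies on, the code below fits the exponential demand equation often used in behavioral economics (Hursh and Silberberg) to hypothetical alone-price consumption data. The price and consumption values, and the fixed range constant K, are illustrative assumptions rather than the article's data or exact model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical alone-price data: unit price (responses per infusion) and consumption.
price = np.array([1, 3, 10, 30, 100, 300], dtype=float)
consumption = np.array([24.0, 22.0, 18.0, 11.0, 4.0, 1.0])

K = 3.0  # assumed range constant (log10 units), often fixed across conditions

def exponential_demand(c, log_q0, alpha):
    # Exponential demand: log10 Q = log10 Q0 + K * (exp(-alpha * Q0 * C) - 1)
    q0 = 10.0 ** log_q0
    return log_q0 + K * (np.exp(-alpha * q0 * c) - 1.0)

popt, _ = curve_fit(exponential_demand, price, np.log10(consumption),
                    p0=[np.log10(25.0), 1e-3], bounds=(0.0, [3.0, 1.0]))
log_q0_hat, alpha_hat = popt
print(f"Q0 = {10**log_q0_hat:.1f}, alpha (demand elasticity index) = {alpha_hat:.2e}")
```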

  13. Intestinal absorption of an arginine-containing peptide in cystinuria

    PubMed Central

    Asatoor, A. M.; Harrison, B. D. W.; Milne, M. D.; Prosser, D. I.

    1972-01-01

    Separate tolerance tests involving oral intake of the dipeptide, L-arginyl-L-aspartate, and of a corresponding free amino acid mixture, were carried out in a single type 2 cystinuric patient. Absorption of aspartate was within normal limits, whilst that of arginine was normal after the peptide but considerably reduced after the amino acid mixture. The results are compared with the increments of serum arginine found in eight normal subjects after the oral intake of the free amino acid mixture. Analyses of urinary pyrrolidine and of tetramethylenediamine in urine samples obtained after the two tolerance tests in the patient support the view that arginine absorption was subnormal after the amino acid mixture but within normal limits after the dipeptide. PMID:5045711

  14. A NEW METHOD OF PEAK DETECTION FOR ANALYSIS OF COMPREHENSIVE TWO-DIMENSIONAL GAS CHROMATOGRAPHY MASS SPECTROMETRY DATA.

    PubMed

    Kim, Seongho; Ouyang, Ming; Jeong, Jaesik; Shen, Changyu; Zhang, Xiang

    2014-06-01

    We develop a novel peak detection algorithm for the analysis of comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC-TOF MS) data using normal-exponential-Bernoulli (NEB) and mixture probability models. The algorithm first performs baseline correction and denoising simultaneously using the NEB model, which also defines peak regions. Peaks are then picked using a mixture of probability distributions to deal with the co-eluting peaks. Peak merging is further carried out based on the mass spectral similarities among the peaks within the same peak group. The algorithm is evaluated using experimental data to study the effect of different cut-offs of the conditional Bayes factors and the effect of different mixture models including Poisson, truncated Gaussian, Gaussian, Gamma, and exponentially modified Gaussian (EMG) distributions, and the optimal version is introduced using a trial-and-error approach. We then compare the new algorithm with two existing algorithms in terms of compound identification. Data analysis shows that the developed algorithm can detect the peaks with lower false discovery rates than the existing algorithms, and a less complicated peak picking model is a promising alternative to the more complicated and widely used EMG mixture models.
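
    The NEB/mixture algorithm itself is not reproduced here; as a simplified illustration of the co-eluting-peak step, the sketch below fits a sum of two Gaussian peak shapes to a synthetic overlapping chromatographic profile. The peak positions, heights, and noise level are assumptions for the example only.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Synthetic retention-time window containing two co-eluting Gaussian peaks plus noise.
t = np.linspace(0.0, 10.0, 200)
true = (100 * np.exp(-0.5 * ((t - 4.0) / 0.5) ** 2)
        + 60 * np.exp(-0.5 * ((t - 5.2) / 0.6) ** 2))
signal = true + rng.normal(0.0, 2.0, t.size)

def two_gaussians(x, a1, mu1, s1, a2, mu2, s2):
    # Mixture of two Gaussian peak shapes (deconvolution of co-eluting peaks).
    return (a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2)
            + a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2))

p0 = [90, 3.8, 0.4, 50, 5.5, 0.5]          # rough starting values
popt, _ = curve_fit(two_gaussians, t, signal, p0=p0)
a1, mu1, s1, a2, mu2, s2 = popt
print(f"peak 1: apex {mu1:.2f}, height {a1:.1f}")
print(f"peak 2: apex {mu2:.2f}, height {a2:.1f}")
```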

  15. Supervised extensions of chemography approaches: case studies of chemical liabilities assessment

    PubMed Central

    2014-01-01

    Chemical liabilities, such as adverse effects and toxicity, play a significant role in the modern drug discovery process. In silico assessment of chemical liabilities is an important step aimed at reducing costs and animal testing by complementing or replacing in vitro and in vivo experiments. Herein, we propose an approach combining several classification and chemography methods to predict chemical liabilities and to interpret the obtained results in terms of the impact of structural changes of compounds on their pharmacological profile. To our knowledge for the first time, a supervised extension of Generative Topographic Mapping is proposed as an effective new chemography method. A new approach for mapping new data using supervised Isomap, without rebuilding models from scratch, is also proposed. Two approaches for estimating a model's applicability domain are used in our study, to our knowledge for the first time in chemoinformatics. Structural alerts responsible for negative characteristics of the pharmacological profile of chemical compounds have been identified as a result of model interpretation. PMID:24868246

  16. Modeling of active transmembrane transport in a mixture theory framework.

    PubMed

    Ateshian, Gerard A; Morrison, Barclay; Hung, Clark T

    2010-05-01

    This study formulates governing equations for active transport across semi-permeable membranes within the framework of the theory of mixtures. In mixture theory, which models the interactions of any number of fluid and solid constituents, a supply term appears in the conservation of linear momentum to describe momentum exchanges among the constituents. In past applications, this momentum supply was used to model frictional interactions only, thereby describing passive transport processes. In this study, it is shown that active transport processes, which impart momentum to solutes or solvent, may also be incorporated in this term. By projecting the equation of conservation of linear momentum along the normal to the membrane, a jump condition is formulated for the mechano-electrochemical potential of fluid constituents which is generally applicable to nonequilibrium processes involving active transport. The resulting relations are simple and easy to use, and address an important need in the membrane transport literature.

  17. Examining the effect of initialization strategies on the performance of Gaussian mixture modeling.

    PubMed

    Shireman, Emilie; Steinley, Douglas; Brusco, Michael J

    2017-02-01

    Mixture modeling is a popular technique for identifying unobserved subpopulations (e.g., components) within a data set, with Gaussian (normal) mixture modeling being the form most widely used. Generally, the parameters of these Gaussian mixtures cannot be estimated in closed form, so estimates are typically obtained via an iterative process. The most common estimation procedure is maximum likelihood via the expectation-maximization (EM) algorithm. Like many approaches for identifying subpopulations, finite mixture modeling can suffer from locally optimal solutions, and the final parameter estimates are dependent on the initial starting values of the EM algorithm. Initial values have been shown to significantly impact the quality of the solution, and researchers have proposed several approaches for selecting the set of starting values. Five techniques for obtaining starting values that are implemented in popular software packages are compared. Their performances are assessed in terms of the following four measures: (1) the ability to find the best observed solution, (2) settling on a solution that classifies observations correctly, (3) the number of local solutions found by each technique, and (4) the speed at which the start values are obtained. On the basis of these results, a set of recommendations is provided to the user.
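
    As a hedged sketch of the kind of comparison described above, the code below contrasts two common EM initialization strategies (k-means-based versus purely random starts, each with multiple restarts) using scikit-learn's GaussianMixture on simulated data. It is a schematic example, not a reproduction of the five techniques or the evaluation measures in the study.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Simulate three overlapping Gaussian clusters in two dimensions.
X = np.vstack([rng.normal([0, 0], 1.0, (200, 2)),
               rng.normal([3, 3], 1.0, (200, 2)),
               rng.normal([0, 4], 1.0, (200, 2))])

for init in ("kmeans", "random"):
    # n_init restarts EM from several starting values and keeps the best solution,
    # which helps guard against locally optimal solutions.
    gm = GaussianMixture(n_components=3, init_params=init,
                         n_init=10, random_state=0).fit(X)
    print(f"init={init:7s}  best log-likelihood bound = {gm.lower_bound_:.3f}")
```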

  18. Mixture models in diagnostic meta-analyses--clustering summary receiver operating characteristic curves accounted for heterogeneity and correlation.

    PubMed

    Schlattmann, Peter; Verba, Maryna; Dewey, Marc; Walther, Mario

    2015-01-01

    Bivariate linear and generalized linear random-effects models are frequently used to perform diagnostic meta-analyses. The objective of this article was to apply a finite mixture model of bivariate normal distributions that can be used for the construction of componentwise summary receiver operating characteristic (sROC) curves. Bivariate linear random effects and a bivariate finite mixture model are used. The latter model is developed as an extension of a univariate finite mixture model. Two examples, computed tomography (CT) angiography for ruling out coronary artery disease and procalcitonin as a diagnostic marker for sepsis, are used to estimate mean sensitivity and mean specificity and to construct sROC curves. The suggested approach of a bivariate finite mixture model identifies two latent classes of diagnostic accuracy for the CT angiography example. Both classes show high sensitivity but two clearly different levels of specificity. For the procalcitonin example, this approach identifies three latent classes of diagnostic accuracy. Here, sensitivities and specificities differ considerably, such that sensitivity increases with decreasing specificity. Additionally, the model is used to construct componentwise sROC curves and to classify individual studies. The proposed method offers an alternative approach to model between-study heterogeneity in a diagnostic meta-analysis. Furthermore, it is possible to construct sROC curves even if a positive correlation between sensitivity and specificity is present. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Use of a glimpsing model to understand the performance of listeners with and without hearing loss in spatialized speech mixtures

    PubMed Central

    Best, Virginia; Mason, Christine R.; Swaminathan, Jayaganesh; Roverud, Elin; Kidd, Gerald

    2017-01-01

    In many situations, listeners with sensorineural hearing loss demonstrate reduced spatial release from masking compared to listeners with normal hearing. This deficit is particularly evident in the “symmetric masker” paradigm in which competing talkers are located to either side of a central target talker. However, there is some evidence that reduced target audibility (rather than a spatial deficit per se) under conditions of spatial separation may contribute to the observed deficit. In this study a simple “glimpsing” model (applied separately to each ear) was used to isolate the target information that is potentially available in binaural speech mixtures. Intelligibility of these glimpsed stimuli was then measured directly. Differences between normally hearing and hearing-impaired listeners observed in the natural binaural condition persisted for the glimpsed condition, despite the fact that the task no longer required segregation or spatial processing. This result is consistent with the idea that the performance of listeners with hearing loss in the spatialized mixture was limited by their ability to identify the target speech based on sparse glimpses, possibly as a result of some of those glimpses being inaudible. PMID:28147587
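
    As a hedged sketch of a glimpsing-style analysis (not the study's specific model), the code below computes a short-time Fourier transform of a target and a masker, marks time-frequency tiles whose local target-to-masker ratio exceeds a criterion, and re-synthesises the "glimpsed" target. The signals here are synthetic noise placeholders, and the 3 dB criterion is an assumption.

```python
import numpy as np
from scipy.signal import stft, istft

fs = 16000
rng = np.random.default_rng(0)

# Placeholder signals: in practice these would be the target talker and the
# summed maskers at one ear of the spatialized mixture.
target = rng.normal(0.0, 1.0, 2 * fs)
masker = rng.normal(0.0, 1.0, 2 * fs)

f, t, Zt = stft(target, fs=fs, nperseg=512)
_, _, Zm = stft(masker, fs=fs, nperseg=512)

# Local target-to-masker ratio per time-frequency tile.
local_tmr_db = 20 * np.log10(np.abs(Zt) / (np.abs(Zm) + 1e-12))
criterion_db = 3.0                      # assumed local criterion defining a "glimpse"
mask = local_tmr_db > criterion_db

print(f"proportion of time-frequency tiles glimpsed: {mask.mean():.2f}")

# Re-synthesise the glimpsed target (the mixture's tiles could be used instead).
_, glimpsed = istft(Zt * mask, fs=fs, nperseg=512)
```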

  20. The legal liability regime: how well is it doing in assuring quality, accounting for costs, and coping with an evolving reality in the health care marketplace?

    PubMed

    Blumstein, James F

    2002-01-01

    Professor Blumstein's timely article deals with two competing paradigms that provide the poles in the spectrum of legal liability regimes. The "professional" or "scientific" model of liability assumes a rigidly normative approach to medical practice while the second more recent paradigm reflects the principles of marketplace economics in considering cost and resource availability to determine quality of care standards. Professor Blumstein concludes that the traditional approach to determining legal liability is being eroded by both the economics of managed care and the recent emphasis on systemic management of health care to promote patient safety, and that the traditional regime will have to "bend" in order to remain legally viable.

  1. A multivariate spatial mixture model for areal data: examining regional differences in standardized test scores

    PubMed Central

    Neelon, Brian; Gelfand, Alan E.; Miranda, Marie Lynn

    2013-01-01

    Summary Researchers in the health and social sciences often wish to examine joint spatial patterns for two or more related outcomes. Examples include infant birth weight and gestational length, psychosocial and behavioral indices, and educational test scores from different cognitive domains. We propose a multivariate spatial mixture model for the joint analysis of continuous individual-level outcomes that are referenced to areal units. The responses are modeled as a finite mixture of multivariate normals, which accommodates a wide range of marginal response distributions and allows investigators to examine covariate effects within subpopulations of interest. The model has a hierarchical structure built at the individual level (i.e., individuals are nested within areal units), and thus incorporates both individual- and areal-level predictors as well as spatial random effects for each mixture component. Conditional autoregressive (CAR) priors on the random effects provide spatial smoothing and allow the shape of the multivariate distribution to vary flexibly across geographic regions. We adopt a Bayesian modeling approach and develop an efficient Markov chain Monte Carlo model fitting algorithm that relies primarily on closed-form full conditionals. We use the model to explore geographic patterns in end-of-grade math and reading test scores among school-age children in North Carolina. PMID:26401059

  2. Mixed and Mixture Regression Models for Continuous Bounded Responses Using the Beta Distribution

    ERIC Educational Resources Information Center

    Verkuilen, Jay; Smithson, Michael

    2012-01-01

    Doubly bounded continuous data are common in the social and behavioral sciences. Examples include judged probabilities, confidence ratings, derived proportions such as percent time on task, and bounded scale scores. Dependent variables of this kind are often difficult to analyze using normal theory models because their distributions may be quite…

  3. A Bayesian Beta-Mixture Model for Nonparametric IRT (BBM-IRT)

    ERIC Educational Resources Information Center

    Arenson, Ethan A.; Karabatsos, George

    2017-01-01

    Item response models typically assume that the item characteristic (step) curves follow a logistic or normal cumulative distribution function, which are strictly monotone functions of person test ability. Such assumptions can be overly-restrictive for real item response data. We propose a simple and more flexible Bayesian nonparametric IRT model…

  4. 40 CFR 265.141 - Definitions of terms as used in this subpart.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... cash or sold or consumed during the normal operating cycle of the business. Current liabilities means... nature. (h) Substantial business relationship means the extent of a business relationship necessary under.... A “substantial business relationship” must arise from a pattern of recent or ongoing business...

  5. 40 CFR 264.141 - Definitions of terms as used in this subpart.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... during the normal operating cycle of the business. Current liabilities means obligations whose... occurrence means an occurrence which is not continuous or repeated in nature. (h) Substantial business relationship means the extent of a business relationship necessary under applicable State law to make a...

  6. Multifactorial disease risk calculator: Risk prediction for multifactorial disease pedigrees.

    PubMed

    Campbell, Desmond D; Li, Yiming; Sham, Pak C

    2018-03-01

    Construction of multifactorial disease models from epidemiological findings and their application to disease pedigrees for risk prediction is nontrivial for all but the simplest of cases. Multifactorial Disease Risk Calculator is a web tool facilitating this. It provides a user-friendly interface, extending a reported methodology based on a liability-threshold model. Multifactorial disease models incorporating all the following features in combination are handled: quantitative risk factors (including polygenic scores), categorical risk factors (including major genetic risk loci), stratified age of onset curves, and the partition of the population variance in disease liability into genetic, shared, and unique environment effects. It allows the application of such models to disease pedigrees. Pedigree-related outputs are (i) individual disease risk for pedigree members, (ii) n year risk for unaffected pedigree members, and (iii) the disease pedigree's joint liability distribution. Risk prediction for each pedigree member is based on using the constructed disease model to appropriately weigh evidence on disease risk available from personal attributes and family history. Evidence is used to construct the disease pedigree's joint liability distribution. From this, lifetime and n year risk can be predicted. Example disease models and pedigrees are provided at the website and are used in accompanying tutorials to illustrate the features available. The website is built on an R package which provides the functionality for pedigree validation, disease model construction, and risk prediction. Website: http://grass.cgs.hku.hk:3838/mdrc/current. © 2017 WILEY PERIODICALS, INC.
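
    The basic liability-threshold calculation behind tools of this kind can be sketched briefly: the threshold is set from the population prevalence, and predicted risk is the normal tail probability beyond that threshold conditional on an observed, standardized score. The prevalence, variance explained, and score value below are illustrative assumptions, and this is not the web tool's code.

```python
from scipy.stats import norm

K = 0.01          # population lifetime prevalence (illustrative)
r2 = 0.10         # liability variance explained by a polygenic score (illustrative)
z = 2.0           # individual's standardized polygenic score

# Liability-threshold model: liability ~ N(0, 1); disease occurs if liability > T.
T = norm.isf(K)                         # threshold matching the prevalence

# Conditional on the score, liability is N(r * z, 1 - r^2) with r = sqrt(r2).
r = r2 ** 0.5
risk = norm.sf((T - r * z) / (1.0 - r2) ** 0.5)

print(f"threshold T = {T:.2f}, predicted lifetime risk = {risk:.3f}")
```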

  7. Evaluating sufficient similarity for drinking-water disinfection by-product (DBP) mixtures with bootstrap hypothesis test procedures.

    PubMed

    Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn

    2009-01-01

    In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.
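
    As a generic, hedged sketch of the bootstrap-test logic (pool the samples, resample groups of the original sizes with replacement, and compare a distance statistic to its resampling distribution), the code below uses simulated log-concentration profiles and a simple Euclidean distance between mean profiles. The data, distance measure, and resampling unit are placeholders, not the article's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical log-concentration profiles (rows = samples, cols = DBP constituents)
# for a reference mixture A and a candidate mixture B.
A = rng.normal(0.0, 1.0, (30, 4))
B = rng.normal(0.2, 1.0, (25, 4))

def distance(a, b):
    # Euclidean distance between mean constituent profiles.
    return np.linalg.norm(a.mean(axis=0) - b.mean(axis=0))

observed = distance(A, B)

# Null reference distribution: resample with replacement from the pooled data.
pooled = np.vstack([A, B])
n_boot, count = 2000, 0
for _ in range(n_boot):
    a_star = pooled[rng.integers(0, pooled.shape[0], len(A))]
    b_star = pooled[rng.integers(0, pooled.shape[0], len(B))]
    if distance(a_star, b_star) >= observed:
        count += 1

print(f"observed distance = {observed:.3f}, bootstrap p-value = {count / n_boot:.3f}")
```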

  8. Internal structure of shock waves in disparate mass mixtures

    NASA Technical Reports Server (NTRS)

    Chung, Chan-Hong; De Witt, Kenneth J.; Jeng, Duen-Ren; Penko, Paul F.

    1992-01-01

    The detailed flow structure of a normal shock wave for a gas mixture is investigated using the direct-simulation Monte Carlo method. A variable diameter hard-sphere (VDHS) model is employed to investigate the effect of different viscosity temperature exponents (VTE) for each species in a gas mixture. Special attention is paid to the irregular behavior in the density profiles which was previously observed in a helium-xenon experiment. It is shown that the VTE can have substantial effects in the prediction of the structure of shock waves. The variable hard-sphere model of Bird shows good agreement, but with some limitations, with the experimental data if a common VTE is chosen properly for each case. The VDHS model shows better agreement with the experimental data without adjusting the VTE. The irregular behavior of the light-gas component in shock waves of disparate mass mixtures is observed not only in the density profile, but also in the parallel temperature profile. The strength of the shock wave, the type of molecular interactions, and the mole fraction of heavy species have substantial effects on the existence and structure of the irregularities.

  9. Continuous plutonium dissolution apparatus

    DOEpatents

    Meyer, F.G.; Tesitor, C.N.

    1974-02-26

    This invention is concerned with continuous dissolution of metals such as plutonium. A high normality acid mixture is fed into a boiler vessel, vaporized, and subsequently condensed as a low normality acid mixture. The mixture is then conveyed to a dissolution vessel and contacted with the plutonium metal to dissolve the plutonium in the dissolution vessel, reacting therewith forming plutonium nitrate. The reaction products are then conveyed to the mixing vessel and maintained soluble by the high normality acid, with separation and removal of the desired constituent. (Official Gazette)

  10. A comparative study of mixture cure models with covariate

    NASA Astrophysics Data System (ADS)

    Leng, Oh Yit; Khalid, Zarina Mohd

    2017-05-01

    In survival analysis, the survival time is assumed to follow a non-negative distribution, such as the exponential, Weibull, or log-normal distribution. In some cases, the survival time is influenced by observed factors, and omitting them may lead to inaccurate estimation of the survival function. Therefore, a survival model that incorporates the influence of observed factors, included as covariates, is more appropriate in such cases. In addition, there are cases in which a group of individuals is cured, that is, never experiences the event of interest. Ignoring this cure fraction may lead to overestimation of the survival function. Thus, a mixture cure model is more suitable for modelling survival data in the presence of a cure fraction. In this study, three mixture cure survival models are used to analyse survival data with a covariate and a cure fraction. The first model includes the covariate in the parameterization of the susceptible individuals' survival function, the second model allows the cure fraction to depend on the covariate, and the third model incorporates the covariate in both the cure fraction and the survival function of susceptible individuals. This study aims to compare the performance of these models via a simulation approach. Survival data with varying sample sizes and cure fractions are therefore simulated, with the survival time assumed to follow the Weibull distribution, and the simulated data are then modelled using the three mixture cure survival models. The results show that the three mixture cure models are appropriate for modelling survival data in the presence of a cure fraction and an observed factor.
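
    As a hedged sketch of the third model described above (a covariate entering the cure fraction through a logistic link and the Weibull survival of susceptibles), the code below simulates data and fits the mixture cure likelihood by direct maximization. The simulation settings and parameter values are arbitrary illustrations, not the study's design.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
n = 500

# Simulate: covariate x, logistic cure fraction, Weibull times for susceptibles,
# administrative censoring at t = 5.
x = rng.normal(0.0, 1.0, n)
pi_cure = 1.0 / (1.0 + np.exp(-(-1.0 + 0.8 * x)))            # P(cured | x)
cured = rng.random(n) < pi_cure
shape, scale = 1.5, np.exp(0.5 - 0.3 * x)                     # Weibull for susceptibles
t_event = scale * rng.weibull(shape, n)
t_event[cured] = np.inf
t_cens = 5.0
time = np.minimum(t_event, t_cens)
event = (t_event <= t_cens).astype(float)

def negloglik(theta):
    b0, b1, g0, g1, log_k = theta
    k = np.exp(log_k)
    pi = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))                 # cure fraction
    lam = np.exp(g0 + g1 * x)                                  # Weibull scale
    z = (time / lam) ** k
    log_f = np.log(k) - np.log(lam) + (k - 1) * np.log(time / lam) - z
    S_u = np.exp(-z)
    # Mixture cure likelihood: events come from susceptibles only; censored
    # observations may be cured or susceptible-and-surviving.
    ll = event * (np.log1p(-pi) + log_f) + (1 - event) * np.log(pi + (1 - pi) * S_u)
    return -ll.sum()

fit = minimize(negloglik, x0=[0.0, 0.0, 0.0, 0.0, 0.0], method="Nelder-Mead")
print("estimates (b0, b1, g0, g1, log shape):", np.round(fit.x, 2))
```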

  11. Identification of Allelic Imbalance with a Statistical Model for Subtle Genomic Mosaicism

    PubMed Central

    Xia, Rui; Vattathil, Selina; Scheet, Paul

    2014-01-01

    Genetic heterogeneity in a mixed sample of tumor and normal DNA can confound characterization of the tumor genome. Numerous computational methods have been proposed to detect aberrations in DNA samples from tumor and normal tissue mixtures. Most of these require tumor purities to be at least 10–15%. Here, we present a statistical model to capture information, contained in the individual's germline haplotypes, about expected patterns in the B allele frequencies from SNP microarrays while fully modeling their magnitude, the first such model for SNP microarray data. Our model consists of a pair of hidden Markov models—one for the germline and one for the tumor genome—which, conditional on the observed array data and patterns of population haplotype variation, have a dependence structure induced by the relative imbalance of an individual's inherited haplotypes. Together, these hidden Markov models offer a powerful approach for dealing with mixtures of DNA where the main component represents the germline, thus suggesting natural applications for the characterization of primary clones when stromal contamination is extremely high, and for identifying lesions in rare subclones of a tumor when tumor purity is sufficient to characterize the primary lesions. Our joint model for germline haplotypes and acquired DNA aberration is flexible, allowing a large number of chromosomal alterations, including balanced and imbalanced losses and gains, copy-neutral loss-of-heterozygosity (LOH) and tetraploidy. We found our model (which we term J-LOH) to be superior for localizing rare aberrations in a simulated 3% mixture sample. More generally, our model provides a framework for full integration of the germline and tumor genomes to deal more effectively with missing or uncertain features, and thus extract maximal information from difficult scenarios where existing methods fail. PMID:25166618

  12. Combining Mixture Components for Clustering*

    PubMed Central

    Baudry, Jean-Patrick; Raftery, Adrian E.; Celeux, Gilles; Lo, Kenneth; Gottardo, Raphaël

    2010-01-01

    Model-based clustering consists of fitting a mixture model to data and identifying each cluster with one of its components. Multivariate normal distributions are typically used. The number of clusters is usually determined from the data, often using BIC. In practice, however, individual clusters can be poorly fitted by Gaussian distributions, and in that case model-based clustering tends to represent one non-Gaussian cluster by a mixture of two or more Gaussian distributions. If the number of mixture components is interpreted as the number of clusters, this can lead to overestimation of the number of clusters. This is because BIC selects the number of mixture components needed to provide a good approximation to the density, rather than the number of clusters as such. We propose first selecting the total number of Gaussian mixture components, K, using BIC and then combining them hierarchically according to an entropy criterion. This yields a unique soft clustering for each number of clusters less than or equal to K. These clusterings can be compared on substantive grounds, and we also describe an automatic way of selecting the number of clusters via a piecewise linear regression fit to the rescaled entropy plot. We illustrate the method with simulated data and a flow cytometry dataset. Supplemental Materials are available on the journal Web site and described at the end of the paper. PMID:20953302

  13. Human Language Technology: Opportunities and Challenges

    DTIC Science & Technology

    2005-01-01

    because of the connections to and reliance on signal processing. Audio diarization critically includes indexing of speakers [12], since speaker ...to reduce inter-speaker variability in training. Standard techniques include vocal-tract length normalization, adaptation of acoustic models using...maximum likelihood linear regression (MLLR), and speaker-adaptive training based on MLLR. The acoustic models are mixtures of Gaussians, typically with

  14. Determining the Number of Component Clusters in the Standard Multivariate Normal Mixture Model Using Model-Selection Criteria.

    DTIC Science & Technology

    1983-06-16

    has been advocated by Gnanadesikan and Wilk (1969), and others in the literature. This suggests that, if we use the formal significance test type...American Statistical Asso., 62, 1159-1178. Gnanadesikan, R., and Wilk, M. B. (1969). Data Analytic Methods in Multivariate Statistical Analysis. In

  15. How can we improve our understanding of cardiovascular safety liabilities to develop safer medicines?

    PubMed Central

    Laverty, HG; Benson, C; Cartwright, EJ; Cross, MJ; Garland, C; Hammond, T; Holloway, C; McMahon, N; Milligan, J; Park, BK; Pirmohamed, M; Pollard, C; Radford, J; Roome, N; Sager, P; Singh, S; Suter, T; Suter, W; Trafford, A; Volders, PGA; Wallis, R; Weaver, R; York, M; Valentin, JP

    2011-01-01

    Given that cardiovascular safety liabilities remain a major cause of drug attrition during preclinical and clinical development, adverse drug reactions, and post-approval withdrawal of medicines, the Medical Research Council Centre for Drug Safety Science hosted a workshop to discuss current challenges in determining, understanding and addressing ‘Cardiovascular Toxicity of Medicines’. This article summarizes the key discussions from the workshop that aimed to address three major questions: (i) what are the key cardiovascular safety liabilities in drug discovery, drug development and clinical practice? (ii) how good are preclinical and clinical strategies for detecting cardiovascular liabilities? and (iii) do we have a mechanistic understanding of these liabilities? It was concluded that in order to understand, address and ultimately reduce cardiovascular safety liabilities of new therapeutic agents there is an urgent need to: (i) fully characterize the incidence, prevalence and impact of drug-induced cardiovascular issues at all stages of the drug development process; (ii) ascertain the predictive value of existing non-clinical models and assays towards the clinical outcome; (iii) understand the mechanistic basis of cardiovascular liabilities by addressing areas where it is currently not possible to predict clinical outcome based on preclinical safety data; (iv) provide scientists in all disciplines with additional skills to enable them to better integrate preclinical and clinical data and to better understand the biological and clinical significance of observed changes; and (v) develop more appropriate, highly relevant and predictive tools and assays to identify and, wherever feasible, eliminate cardiovascular safety liabilities from molecules and, wherever appropriate, to develop clinically relevant and reliable safety biomarkers. PMID:21306581

  16. Strategies for Limiting Engineers' Potential Liability for Indoor Air Quality Problems.

    PubMed

    von Oppenfeld, Rolf R; Freeze, Mark E; Sabo, Sean M

    1998-10-01

    Engineers face indoor air quality (IAQ) issues at the design phase of building construction as well as during the investigation and mitigation of potential indoor air pollution problems during building operation. IAQ issues that can be identified are "building-related illnesses" that may include problems of volatile organic compounds (VOCs). IAQ issues that cannot be identified are termed "sick building syndrome." Frequently, microorganism-caused illnesses are difficult to confirm. Engineers who provide professional services that directly or indirectly impact IAQ face significant potential liability to clients and third parties when performing these duties. Potential theories supporting liability claims for IAQ problems against engineers include breach of contract and various common law tort theories such as negligence and negligent misrepresentation. Furthermore, an increasing number of federal, state, and local regulations affect IAQ issues and can directly increase the potential liability of engineers. A duty to disclose potential or actual air quality concerns to third parties may apply for engineers in given circumstances. Such a duty may arise from judicial precedent, the Model Guide for Professional Conduct for Engineers, or the Code of Ethics for Engineers. Practical strategies engineers can use to protect themselves from liability include regular training and continuing education in relevant regulatory, scientific, and case law developments; detailed documentation and recordkeeping practices; adequate insurance coverage; contractual indemnity clauses; contractual provisions limiting liability to the scope of work performed; and contractual provisions limiting the extent of liability for engineers' negligence. Furthermore, through the proper use of building materials and construction techniques, an engineer or other design professional can effectively limit the potential for IAQ liability.

  17. A NEW METHOD OF PEAK DETECTION FOR ANALYSIS OF COMPREHENSIVE TWO-DIMENSIONAL GAS CHROMATOGRAPHY MASS SPECTROMETRY DATA*

    PubMed Central

    Kim, Seongho; Ouyang, Ming; Jeong, Jaesik; Shen, Changyu; Zhang, Xiang

    2014-01-01

    We develop a novel peak detection algorithm for the analysis of comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC-TOF MS) data using normal-exponential-Bernoulli (NEB) and mixture probability models. The algorithm first performs baseline correction and denoising simultaneously using the NEB model, which also defines peak regions. Peaks are then picked using a mixture of probability distributions to deal with the co-eluting peaks. Peak merging is further carried out based on the mass spectral similarities among the peaks within the same peak group. The algorithm is evaluated using experimental data to study the effect of different cut-offs of the conditional Bayes factors and the effect of different mixture models including Poisson, truncated Gaussian, Gaussian, Gamma, and exponentially modified Gaussian (EMG) distributions, and the optimal version is introduced using a trial-and-error approach. We then compare the new algorithm with two existing algorithms in terms of compound identification. Data analysis shows that the developed algorithm can detect the peaks with lower false discovery rates than the existing algorithms, and a less complicated peak picking model is a promising alternative to the more complicated and widely used EMG mixture models. PMID:25264474

  18. A path integral approach to asset-liability management

    NASA Astrophysics Data System (ADS)

    Decamps, Marc; De Schepper, Ann; Goovaerts, Marc

    2006-05-01

    Functional integrals constitute a powerful tool in the investigation of financial models. In the recent econophysics literature, this technique was successfully used for the pricing of a number of derivative securities. In the present contribution, we introduce this approach to the field of asset-liability management. We work with a representation of cash flows by means of a two-dimensional delta-function perturbation, in the case of a Brownian model and a geometric Brownian model. We derive closed-form solutions for a finite horizon ALM policy. The results are numerically and graphically illustrated.

  19. Teleradiology: a case study of the economic and legal considerations in international trade in telemedicine.

    PubMed

    McLean, Thomas R; Richards, Edward P

    2006-01-01

    Growth in the global market for telemedical services is being driven by economics. Two operational models are already recognizable. "Nighthawk" providers are virtually indistinguishable from their domestic counterparts with respect to medical malpractice liability and price for service. Indian providers, in contrast, offer deep price discounts on services, but jurisdictional loopholes are likely to allow these providers a method to avoid medical malpractice liability. Hospitals that outsource their radiology services need to be aware of these differences, because hiring Indian telemedical providers will likely result in a shift of medical malpractice liability from providers to hospitals.

  20. Bayesian mixture modeling for blood sugar levels of diabetes mellitus patients (case study in RSUD Saiful Anwar Malang Indonesia)

    NASA Astrophysics Data System (ADS)

    Budi Astuti, Ani; Iriawan, Nur; Irhamah; Kuswanto, Heri; Sasiarini, Laksmi

    2017-10-01

    Bayesian statistics offers an approach that is very flexible with respect to sample size and data distribution. The Bayesian Mixture Model (BMM) is a Bayesian approach for multimodal data. Diabetes Mellitus (DM) is more commonly known in the Indonesian community as sweet pee. This disease is a chronic non-communicable disease, but it is very dangerous because of the complications it causes. WHO reports in 2013 ranked DM as the 6th leading cause of death worldwide. In Indonesia, the prevalence of DM continues to increase over time. This research studies the patterns of the DM data and builds BMM models for them through simulation studies, where the simulated data are based on blood sugar levels of DM patients at RSUD Saiful Anwar Malang. The results successfully demonstrate that the DM data follow a normal mixture distribution. The BMM models succeed in accommodating the real condition of the DM data based on a data-driven concept.
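
    As a hedged sketch of a Bayesian normal-mixture fit (using scikit-learn's variational BayesianGaussianMixture rather than the authors' own BMM implementation), the code below fits simulated blood-sugar values with two modes and reports the components retained. The simulated values and prior settings are placeholders.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Simulated fasting blood-sugar values (mg/dL) with two modes, standing in for
# the multimodal DM data described above.
glucose = np.concatenate([rng.normal(110, 15, 150), rng.normal(220, 30, 80)])

# Variational Bayesian Gaussian mixture; extra components are shrunk toward
# zero weight by the Dirichlet-type prior on the mixing proportions.
bgm = BayesianGaussianMixture(n_components=5, weight_concentration_prior=0.1,
                              random_state=0).fit(glucose.reshape(-1, 1))

for w, m in sorted(zip(bgm.weights_, bgm.means_.ravel()), reverse=True):
    if w > 0.05:
        print(f"component mean = {m:6.1f} mg/dL, weight = {w:.2f}")
```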

  1. Genetic Liability to Disability Pension in Women and Men: A Prospective Population-Based Twin Study

    PubMed Central

    Narusyte, Jurgita; Ropponen, Annina; Silventoinen, Karri; Alexanderson, Kristina; Kaprio, Jaakko; Samuelsson, Åsa; Svedberg, Pia

    2011-01-01

    Background Previous studies of risk factors for disability pension (DP) have mainly focused on psychosocial, or environmental, factors, while the relative importance of genetic effects has been less studied. Sex differences in biological mechanisms have not been investigated at all. Methods The study sample included 46,454 Swedish twins, consisting of 23,227 complete twin pairs, born 1928–1958, who were followed during 1993–2008. Data on DP, including diagnoses, were obtained from the National Social Insurance Agency. Within-pair similarity in liability to DP was assessed by calculating intraclass correlations. Genetic and environmental influences on liability to DP were estimated by applying discrete-time frailty modeling. Results During follow-up, 7,669 individuals were granted DP (18.8% women and 14.1% men). Intraclass correlations were generally higher in MZ pairs than DZ pairs, while DZ same-sexed pairs were more similar than opposite-sexed pairs. The best-fitting model indicated that genetic factors contributed 49% (95% CI: 39–59) to the variance in DP due to mental diagnoses, 35% (95% CI: 29–41) due to musculoskeletal diagnoses, and 27% (95% CI: 20–33) due to all other diagnoses. In both sexes, genetic effects common to all ages explained one-third, whereas age-specific factors almost two-thirds, of the total variance in liability to DP irrespective of diagnosis. Sex differences in liability to DP were indicated, in that partly different sets of genes were found to operate in women and men, even though the magnitude of genetic variance explained was equal for both sexes. Conclusions The findings of the study suggest that genetic effects are important for liability to DP due to different diagnoses. Moreover, genetic contributions to liability to DP tend to differ between women and men, even though the overall relative contribution of genetic influences does not differ by sex. Hence, the pathways leading to DP might differ between women and men. PMID:21850258

  2. Personal exposure to mixtures of volatile organic compounds: modeling and further analysis of the RIOPA data.

    PubMed

    Batterman, Stuart; Su, Feng-Chiao; Li, Shi; Mukherjee, Bhramar; Jia, Chunrong

    2014-06-01

    Emission sources of volatile organic compounds (VOCs*) are numerous and widespread in both indoor and outdoor environments. Concentrations of VOCs indoors typically exceed outdoor levels, and most people spend nearly 90% of their time indoors. Thus, indoor sources generally contribute the majority of VOC exposures for most people. VOC exposure has been associated with a wide range of acute and chronic health effects; for example, asthma, respiratory diseases, liver and kidney dysfunction, neurologic impairment, and cancer. Although exposures to most VOCs for most persons fall below health-based guidelines, and long-term trends show decreases in ambient emissions and concentrations, a subset of individuals experience much higher exposures that exceed guidelines. Thus, exposure to VOCs remains an important environmental health concern. The present understanding of VOC exposures is incomplete. With the exception of a few compounds, concentration and especially exposure data are limited; and like other environmental data, VOC exposure data can show multiple modes, low and high extreme values, and sometimes a large portion of data below method detection limits (MDLs). Field data also show considerable spatial or interpersonal variability, and although evidence is limited, temporal variability seems high. These characteristics can complicate modeling and other analyses aimed at risk assessment, policy actions, and exposure management. In addition to these analytic and statistical issues, exposure typically occurs as a mixture, and mixture components may interact or jointly contribute to adverse effects. However most pollutant regulations, guidelines, and studies remain focused on single compounds, and thus may underestimate cumulative exposures and risks arising from coexposures. In addition, the composition of VOC mixtures has not been thoroughly investigated, and mixture components show varying and complex dependencies. Finally, although many factors are known to affect VOC exposures, many personal, environmental, and socioeconomic determinants remain to be identified, and the significance and applicability of the determinants reported in the literature are uncertain. To help answer these unresolved questions and overcome limitations of previous analyses, this project used several novel and powerful statistical modeling and analysis techniques and two large data sets. The overall objectives of this project were (1) to identify and characterize exposure distributions (including extreme values), (2) evaluate mixtures (including dependencies), and (3) identify determinants of VOC exposure. METHODS VOC data were drawn from two large data sets: the Relationships of Indoor, Outdoor, and Personal Air (RIOPA) study (1999-2001) and the National Health and Nutrition Examination Survey (NHANES; 1999-2000). The RIOPA study used a convenience sample to collect outdoor, indoor, and personal exposure measurements in three cities (Elizabeth, NJ; Houston, TX; Los Angeles, CA). In each city, approximately 100 households with adults and children who did not smoke were sampled twice for 18 VOCs. In addition, information about 500 variables associated with exposure was collected. The NHANES used a nationally representative sample and included personal VOC measurements for 851 participants. NHANES sampled 10 VOCs in common with RIOPA. Both studies used similar sampling methods and study periods. Specific Aim 1. 
To estimate and model extreme value exposures, extreme value distribution models were fitted to the top 10% and 5% of VOC exposures. Health risks were estimated for individual VOCs and for three VOC mixtures. Simulated extreme value data sets, generated for each VOC and for fitted extreme value and lognormal distributions, were compared with measured concentrations (RIOPA observations) to evaluate each model's goodness of fit. Mixture distributions were fitted with the conventional finite mixture of normal distributions and the semi-parametric Dirichlet process mixture (DPM) of normal distributions for three individual VOCs (chloroform, 1,4-DCB, and styrene). Goodness of fit for these full distribution models was also evaluated using simulated data. Specific Aim 2. Mixtures in the RIOPA VOC data set were identified using positive matrix factorization (PMF) and by toxicologic mode of action. Dependency structures of a mixture's components were examined using mixture fractions and were modeled using copulas, which address correlations of multiple components across their entire distributions. Five candidate copulas (Gaussian, t, Gumbel, Clayton, and Frank) were evaluated, and the performance of fitted models was evaluated using simulation and mixture fractions. Cumulative cancer risks were calculated for mixtures, and results from copulas and multivariate lognormal models were compared with risks based on RIOPA observations. Specific Aim 3. Exposure determinants were identified using stepwise regressions and linear mixed-effects models (LMMs). Specific Aim 1. Extreme value exposures in RIOPA typically were best fitted by three-parameter generalized extreme value (GEV) distributions, and sometimes by the two-parameter Gumbel distribution. In contrast, lognormal distributions significantly underestimated both the level and likelihood of extreme values. Among the VOCs measured in RIOPA, 1,4-dichlorobenzene (1,4-DCB) was associated with the greatest cancer risks; for example, for the highest 10% of measurements of 1,4-DCB, all individuals had risk levels above 10(-4), and 13% of all participants had risk levels above 10(-2). Of the full-distribution models, the finite mixture of normal distributions with two to four clusters and the DPM of normal distributions had superior performance in comparison with the lognormal models. DPM distributions provided slightly better fit than the finite mixture distributions; the advantages of the DPM model were avoiding certain convergence issues associated with the finite mixture distributions, adaptively selecting the number of needed clusters, and providing uncertainty estimates. Although the results apply to the RIOPA data set, GEV distributions and mixture models appear more broadly applicable. These models can be used to simulate VOC distributions, which are neither normally nor lognormally distributed, and they accurately represent the highest exposures, which may have the greatest health significance. Specific Aim 2. Four VOC mixtures were identified and apportioned by PMF; they represented gasoline vapor, vehicle exhaust, chlorinated solvents and disinfection byproducts, and cleaning products and odorants. The last mixture (cleaning products and odorants) accounted for the largest fraction of an individual's total exposure (average of 42% across RIOPA participants). Often, a single compound dominated a mixture but the mixture fractions were heterogeneous; that is, the fractions of the compounds changed with the concentration of the mixture. 
Three VOC mixtures were identified by toxicologic mode of action and represented VOCs associated with hematopoietic, liver, and renal tumors. Estimated lifetime cumulative cancer risks exceeded 10(-3) for about 10% of RIOPA participants. The dependency structures of the VOC mixtures in the RIOPA data set fitted Gumbel (two mixtures) and t copulas (four mixtures). These copula types emphasize dependencies found in the upper and lower tails of a distribution. The copulas reproduced both risk predictions and exposure fractions with a high degree of accuracy and performed better than multivariate lognormal distributions. Specific Aim 3. In an analysis focused on the home environment and the outdoor (close to home) environment, home VOC concentrations dominated personal exposures (66% to 78% of the total exposure, depending on VOC); this was largely the result of the amount of time participants spent at home and the fact that indoor concentrations were much higher than outdoor concentrations for most VOCs. In a different analysis focused on the sources inside the home and outside (but close to the home), it was assumed that 100% of VOCs from outside sources would penetrate the home. Outdoor VOC sources accounted for 5% (d-limonene) to 81% (carbon tetrachloride [CTC]) of the total exposure. Personal exposure and indoor measurements had similar determinants depending on the VOC. Gasoline-related VOCs (e.g., benzene and methyl tert-butyl ether [MTBE]) were associated with city, residences with attached garages, pumping gas, wind speed, and home air exchange rate (AER). Odorant and cleaning-related VOCs (e.g., 1,4-DCB and chloroform) also were associated with city, and a residence's AER, size, and family members showering. Dry-cleaning and industry-related VOCs (e.g., tetrachloroethylene [or perchloroethylene, PERC] and trichloroethylene [TCE]) were associated with city, type of water supply to the home, and visits to the dry cleaner. These and other relationships were significant, they explained from 10% to 40% of the variance in the measurements, and are consistent with known emission sources and those reported in the literature. Outdoor concentrations of VOCs had only two determinants in common: city and wind speed. Overall, personal exposure was dominated by the home setting, although a large fraction of indoor VOC concentrations were due to outdoor sources. City of residence, personal activities, household characteristics, and meteorology were significant determinants. Concentrations in RIOPA were considerably lower than levels in the nationally representative NHANES for all VOCs except MTBE and 1,4-DCB. Differences between RIOPA and NHANES results can be explained by contrasts between the sampling designs and staging in the two studies, and by differences in the demographics, smoking, employment, occupations, and home locations. (ABSTRACT TRUNCATED)
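
    As a hedged sketch of the extreme-value step in Specific Aim 1 (fitting a generalized extreme value distribution to the upper tail of exposure measurements), the code below fits a three-parameter GEV to the top 10% of simulated lognormal concentrations. The simulated data are placeholders for a VOC, and this is not the project's analysis code.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Simulated personal-exposure concentrations (lognormal stand-in for one VOC).
conc = np.exp(rng.normal(np.log(2.0), 1.0, 600))

# Extreme-value subset: the top 10% of measurements.
top = np.sort(conc)[int(0.9 * conc.size):]

# Fit a three-parameter GEV to the extreme subset (scipy's "c" is the shape).
c, loc, scale = genextreme.fit(top)
print(f"GEV fit to top decile: shape = {c:.2f}, loc = {loc:.2f}, scale = {scale:.2f}")

# A high quantile of the fitted GEV, compared with the empirical value.
print(f"GEV 99th percentile of extremes:       {genextreme.ppf(0.99, c, loc, scale):.1f}")
print(f"empirical 99th percentile of extremes: {np.quantile(top, 0.99):.1f}")
```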

  3. Bayesian spatiotemporal crash frequency models with mixture components for space-time interactions.

    PubMed

    Cheng, Wen; Gill, Gurdiljot Singh; Zhang, Yongping; Cao, Zhong

    2018-03-01

    Traffic safety research has developed spatiotemporal models to explore variations in the spatial pattern of crash risk over time. Many studies observed notable benefits associated with the inclusion of spatial and temporal correlation and their interactions. However, the safety literature lacks sufficient research comparing different temporal treatments and their interaction with the spatial component. This study developed four spatiotemporal models of varying complexity based on different temporal treatments: (I) linear time trend; (II) quadratic time trend; (III) autoregressive-1 (AR-1); and (IV) time adjacency. Moreover, the study introduced a flexible two-component mixture for the space-time interaction, which allows greater flexibility than the traditional linear space-time interaction. The mixture component accommodates the global space-time interaction as well as departures from the overall spatial and temporal risk patterns. This study performed a comprehensive assessment of the mixture models based on diverse criteria pertaining to goodness-of-fit, cross-validation, and in-sample evaluation of the predictive accuracy of crash estimates. The assessment of model performance in terms of goodness-of-fit clearly established the superiority of the time-adjacency specification; although it is more complex because it borrows information from neighboring years, the additional parameters yielded a substantially lower posterior deviance, which benefited the overall fit to the crash data. Base models were also developed to compare the proposed mixture with traditional space-time components for each temporal model. The mixture models consistently outperformed the corresponding Base models owing to their much lower deviance. In the cross-validation comparison of predictive accuracy, the linear time trend model was judged the best, as it recorded the highest value of the log pseudo marginal likelihood (LPML). Four other evaluation criteria were considered for conventional validation using the same data used for model development. Under each criterion, observed crash counts were compared with three types of data: Bayesian estimated, normal predicted, and model replicated counts. The linear model again performed best in most scenarios, except in one case using model-replicated data and two cases involving prediction without random effects. These results indicate the mediocre performance of the linear trend when random effects are excluded from the evaluation. This might be due to the flexible mixture space-time interaction, which can efficiently absorb the residual variability that escapes the predictable part of the model. The comparison of Base and mixture models in terms of prediction accuracy further supported the superiority of the mixture models, as they generated more precise estimated crash counts across all four models, suggesting that the advantages of the mixture component in model fit transfer to prediction accuracy. Finally, the residual analysis demonstrated the consistently superior performance of random-effects models, which validates the importance of incorporating correlation structures to account for unobserved heterogeneity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Computational procedure of optimal inventory model involving controllable backorder rate and variable lead time with defective units

    NASA Astrophysics Data System (ADS)

    Lee, Wen-Chuan; Wu, Jong-Wuu; Tsou, Hsin-Hui; Lei, Chia-Ling

    2012-10-01

    This article considers that the number of defective units in an arrival order is a binomial random variable. We derive a modified mixture inventory model with backorders and lost sales, in which the order quantity and lead time are decision variables. We also assume that the backorder rate depends on the length of the lead time through the amount of shortages, and we treat the backorder rate as a control variable. In addition, we assume that the lead time demand follows a mixture of normal distributions, and then relax the assumption about the form of the mixture of distribution functions of the lead time demand and apply the minimax distribution free procedure to solve the problem. Furthermore, we develop an algorithmic procedure to obtain the optimal ordering strategy for each case. Finally, three numerical examples are given to illustrate the results.

  5. Effect of the NACA Injection Impeller on the Mixture Distribution of a Double-row Radial Aircraft Engine

    NASA Technical Reports Server (NTRS)

    Marble, Frank E.; Ritter, William K.; Miller, Mahlon A.

    1946-01-01

    For the normal range of engine power the impeller provided marked improvement over the standard spray-bar injection system. Mixture distribution at cruising was excellent, maximum cylinder temperatures were reduced about 30 degrees F, and general temperature distribution was improved. The uniform mixture distribution restored the normal response of cylinder temperature to mixture enrichment and it reduced the possibility of carburetor icing, while no serious loss in supercharger pressure rise resulted from injection of fuel near the impeller outlet. The injection impeller also furnished a convenient means of adding water to the charge mixture for internal cooling.

  6. Item selection via Bayesian IRT models.

    PubMed

    Arima, Serena

    2015-02-10

    With reference to a questionnaire that aimed to assess the quality of life for dysarthric speakers, we investigate the usefulness of a model-based procedure for reducing the number of items. We propose a mixed cumulative logit model, which is known in the psychometrics literature as the graded response model: responses to different items are modelled as a function of individual latent traits and as a function of item characteristics, such as their difficulty and their discrimination power. We jointly model the discrimination and the difficulty parameters by using a k-component mixture of normal distributions. Mixture components correspond to disjoint groups of items. Items that belong to the same groups can be considered equivalent in terms of both difficulty and discrimination power. According to decision criteria, we select a subset of items such that the reduced questionnaire is able to provide the same information that the complete questionnaire provides. The model is estimated by using a Bayesian approach, and the choice of the number of mixture components is justified according to information criteria. We illustrate the proposed approach on the basis of data that are collected for 104 dysarthric patients by local health authorities in Lecce and in Milan. Copyright © 2014 John Wiley & Sons, Ltd.

  7. Reduced-order modelling for high-pressure transient flow of hydrogen-natural gas mixture

    NASA Astrophysics Data System (ADS)

    Agaie, Baba G.; Khan, Ilyas; Alshomrani, Ali Saleh; Alqahtani, Aisha M.

    2017-05-01

    In this paper the transient flow of hydrogen compressed-natural gas (HCNG) mixture, also referred to as hydrogen-natural gas mixture, in a pipeline is computed numerically using a reduced-order modelling technique. The study of transient conditions is important because pipeline flows are normally unsteady due to the sudden opening and closing of control valves, yet most existing studies analyse only steady-state conditions. The mathematical model consists of a set of non-linear partial differential equations in conservation form. The objective of this paper is to improve the accuracy of predicted HCNG transient flow parameters using Reduced-Order Modelling (ROM). The ROM technique has been used successfully for single-gas and aerodynamic flow problems, but it has not previously been applied to gas mixtures. The study is based on the velocity change created by the operation of valves upstream and downstream of the pipeline. Results on the flow characteristics, namely the pressure, density, celerity and mass flux, are presented for variations of the mixing ratio and of valve reaction and actuation time; the computational time advantage of the ROM is also reported.
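    The abstract does not detail how its reduced-order model is built; a common construction is proper orthogonal decomposition (POD) of snapshot data via the SVD, sketched below under the assumption that transient flow states are available as columns of a snapshot matrix. The random data here are placeholders, not pipeline simulations.

```python
import numpy as np

def pod_basis(snapshots, r):
    """Build a rank-r POD basis from a snapshot matrix (n_states x n_snapshots)."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    return mean, U[:, :r], s

# Hypothetical snapshots of pipeline states (random stand-in data)
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 60))
mean, Phi, s = pod_basis(X, r=10)

# Project a full state onto the reduced space and reconstruct it
x = X[:, [0]]
a = Phi.T @ (x - mean)          # reduced coordinates
x_hat = mean + Phi @ a          # low-order reconstruction
print(np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```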

  8. Normal uniform mixture differential gene expression detection for cDNA microarrays

    PubMed Central

    Dean, Nema; Raftery, Adrian E

    2005-01-01

    Background One of the primary tasks in analysing gene expression data is finding genes that are differentially expressed in different samples. Multiple testing issues due to the thousands of tests run make some of the more popular methods for doing this problematic. Results We propose a simple method, Normal Uniform Differential Gene Expression (NUDGE) detection for finding differentially expressed genes in cDNA microarrays. The method uses a simple univariate normal-uniform mixture model, in combination with new normalization methods for spread as well as mean that extend the lowess normalization of Dudoit, Yang, Callow and Speed (2002) [1]. It takes account of multiple testing, and gives probabilities of differential expression as part of its output. It can be applied to either single-slide or replicated experiments, and it is very fast. Three datasets are analyzed using NUDGE, and the results are compared to those given by other popular methods: unadjusted and Bonferroni-adjusted t tests, Significance Analysis of Microarrays (SAM), and Empirical Bayes for microarrays (EBarrays) with both Gamma-Gamma and Lognormal-Normal models. Conclusion The method gives a high probability of differential expression to genes known/suspected a priori to be differentially expressed and a low probability to the others. In terms of known false positives and false negatives, the method outperforms all multiple-replicate methods except for the Gamma-Gamma EBarrays method to which it offers comparable results with the added advantages of greater simplicity, speed, fewer assumptions and applicability to the single replicate case. An R package called nudge to implement the methods in this paper will be made available soon at . PMID:16011807
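    To make the normal-uniform mixture idea concrete, below is a minimal EM sketch for a two-component model in which a normal component represents non-differentially-expressed genes and a uniform component the differentially expressed ones. This is a simplified stand-in, not the NUDGE implementation, and the simulated log-ratios are illustrative.

```python
import numpy as np
from scipy.stats import norm

def normal_uniform_em(x, n_iter=200):
    """EM for f(x) = p*N(mu, sigma^2) + (1-p)*Uniform(min(x), max(x))."""
    lo, hi = x.min(), x.max()
    u_dens = 1.0 / (hi - lo)                      # fixed uniform density
    p, mu, sigma = 0.9, np.median(x), x.std()
    for _ in range(n_iter):
        # E-step: responsibility of the normal (null) component
        num = p * norm.pdf(x, mu, sigma)
        resp = num / (num + (1.0 - p) * u_dens)
        # M-step: update mixing proportion and normal parameters
        p = resp.mean()
        mu = np.sum(resp * x) / np.sum(resp)
        sigma = np.sqrt(np.sum(resp * (x - mu) ** 2) / np.sum(resp))
    return p, mu, sigma, 1.0 - resp               # 1 - resp ~ prob. of differential expression

# Simulated log-ratios: 95% null around 0, 5% spread-out "differentially expressed"
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 0.3, 950), rng.uniform(-3, 3, 50)])
p, mu, sigma, prob_de = normal_uniform_em(x)
print(p, mu, sigma)
```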

  9. 76 FR 17762 - Regulations Governing the Performance of Actuarial Services Under the Employee Retirement Income...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-31

    ... following as may be appropriate in the particular case: (1) Normal cost; (2) accrued liability; (3) payment... Joint Board will address on a case-by-case basis situations involving the inability of the Executive..., communication skills, and business and general tax law. 3. Qualifying Program Requirement These regulations do...

  10. CFD Modeling of Helium Pressurant Effects on Cryogenic Tank Pressure Rise Rates in Normal Gravity

    NASA Technical Reports Server (NTRS)

    Grayson, Gary; Lopez, Alfredo; Chandler, Frank; Hastings, Leon; Hedayat, Ali; Brethour, James

    2007-01-01

    A recently developed computational fluid dynamics modeling capability for cryogenic tanks is used to simulate both self-pressurization from external heating and also depressurization from thermodynamic vent operation. Axisymmetric models using a modified version of the commercially available FLOW-3D software are used to simulate actual physical tests. The models assume an incompressible liquid phase with density that is a function of temperature only. A fully compressible formulation is used for the ullage gas mixture that contains both condensable vapor and a noncondensable gas component. The tests, conducted at the NASA Marshall Space Flight Center, include both liquid hydrogen and nitrogen in tanks with ullage gas mixtures of each liquid's vapor and helium. Pressure and temperature predictions from the model are compared to sensor measurements from the tests and a good agreement is achieved. This further establishes the accuracy of the developed FLOW-3D based modeling approach for cryogenic systems.

  11. The Regular Interaction Pattern among Odorants of the Same Type and Its Application in Odor Intensity Assessment.

    PubMed

    Yan, Luchun; Liu, Jiemin; Jiang, Shen; Wu, Chuandong; Gao, Kewei

    2017-07-13

    The olfactory evaluation function (e.g., odor intensity rating) of an e-nose remains one of the most challenging issues in research on odor pollution monitoring. Odor is normally produced by a set of stimuli, and odor interactions among constituents significantly influence the mixture's odor intensity. This study investigated the odor interaction principle in odor mixtures of aldehydes and of esters. A modified vector model (MVM) was then proposed, and it successfully demonstrated the similarity of the odor interaction pattern among odorants of the same type. Based on this regular interaction pattern, and unlike a fitted empirical model that applies only to a specific odor mixture in conventional approaches, the MVM distinctly simplifies the odor intensity prediction of odor mixtures. The MVM also provides a way of directly converting constituents' chemical concentrations into their mixture's odor intensity. By combining the MVM with the usual data-processing algorithm of an e-nose, a new e-nose system was established for odor intensity rating. Compared with instrumental analysis and human assessors, it exhibited good accuracy in both quantitative analysis (Pearson correlation coefficient 0.999 for individual aldehydes (n = 12), 0.996 for their binary mixtures (n = 36) and 0.990 for their ternary mixtures (n = 60)) and odor intensity assessment (Pearson correlation coefficient 0.980 for individual aldehydes (n = 15), 0.973 for their binary mixtures (n = 24), and 0.888 for their ternary mixtures (n = 25)). The observed regular interaction pattern is therefore considered an important foundation for accelerating the wider application of olfactory evaluation in odor pollution monitoring.

  12. On The Value at Risk Using Bayesian Mixture Laplace Autoregressive Approach for Modelling the Islamic Stock Risk Investment

    NASA Astrophysics Data System (ADS)

    Miftahurrohmah, Brina; Iriawan, Nur; Fithriasari, Kartika

    2017-06-01

    Stocks are financial instruments traded in the capital market that carry a high level of risk. Their risk is reflected in the uncertainty of the return that investors must accept in the future: the higher the risk to be faced, the higher the return to be gained. The risk therefore needs to be measured. Value at Risk (VaR), the most popular risk measurement method, frequently performs poorly when the pattern of returns is not uni-modal Normal. The calculation of risk using the VaR method with the Normal Mixture Autoregressive (MNAR) approach has been considered previously. This paper proposes a VaR method coupled with the Mixture Laplace Autoregressive (MLAR) model, implemented to analyse the returns of the three largest-capitalization Islamic stocks in JII, namely PT. Astra International Tbk (ASII), PT. Telekomunikasi Indonesia Tbk (TLMK), and PT. Unilever Indonesia Tbk (UNVR). Parameter estimation is performed by employing Bayesian Markov Chain Monte Carlo (MCMC) approaches.
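    As a rough numerical illustration of the idea (not the Bayesian MCMC estimation used in the paper), VaR can be read off as a lower quantile of a fitted two-component Laplace mixture. The mixture parameters below are assumptions for illustration, not estimates from JII data.

```python
import numpy as np

def var_from_laplace_mixture(weights, locs, scales, alpha=0.05, n_sim=200_000, seed=0):
    """Monte Carlo Value at Risk of a finite mixture of Laplace distributions:
    VaR_alpha is the alpha-quantile of the return distribution, reported as a loss."""
    rng = np.random.default_rng(seed)
    comp = rng.choice(len(weights), size=n_sim, p=weights)
    draws = rng.laplace(np.asarray(locs)[comp], np.asarray(scales)[comp])
    return -np.quantile(draws, alpha)

# Hypothetical daily-return mixture (parameters are assumptions for illustration)
print(var_from_laplace_mixture(weights=[0.8, 0.2], locs=[0.0005, -0.002],
                               scales=[0.008, 0.02], alpha=0.05))
```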

  13. Protanomaly-without-darkened-red is deuteranopia with rods

    PubMed Central

    Shevell, Steven K.; Sun, Yang; Neitz, Maureen

    2008-01-01

    The Rayleigh match, a color match between a mixture of 545+670 nm lights and 589 nm light in modern instruments, is the definitive measurement for the diagnosis of inherited red/green color defects. All trichromats, whether normal or anomalous, have a limited range of 545+670 nm mixtures they perceive to match 589 nm: a typical color-normal match-range is about 50–55% of 670 nm in the mixture (deutan mode), while deuteranomals have a range that includes mixtures with less 670 nm than normal and protanomals a range that includes mixtures with more 670 nm than normal. Further, the matching luminance of the 589 nm light for deuteranomals is the same as for normals but for protanomals is below normal. An example of an unexpected Rayleigh match, therefore, is a match range above normal (typical of protanomaly) and a normal luminance setting for 589 nm (typical of deuteranomaly), a match that Pickford (1950) called protanomaly “when the red end of the spectrum is not darkened”. In this case, Rayleigh matching does not yield a clear diagnosis. Aside from Pickford, we are aware of only one other report of a similar observer (Pokorny and Smith, 1981); this study predated modern genetic techniques that can reveal the cone photopigment(s) in the red/green range. We recently had the opportunity to conduct genetic and psychophysical tests on such an observer. Genetic results predict he is a deuteranope. His Rayleigh match is consistent with L cones and a contribution from rods. Further, with a rod-suppressing background, his Rayleigh match is characteristic of a single L-cone photopigment (deuteranopia). PMID:18423511

  14. Protanomaly without darkened red is deuteranopia with rods.

    PubMed

    Shevell, Steven K; Sun, Yang; Neitz, Maureen

    2008-11-01

    The Rayleigh match, a color match between a mixture of 545+670 nm lights and 589 nm light in modern instruments, is the definitive measurement for the diagnosis of inherited red-green color defects. All trichromats, whether normal or anomalous, have a limited range of 545+670 nm mixtures they perceive to match 589 nm: a typical color-normal match range is about 50-55% of 670 nm in the mixture (deutan mode), while deuteranomals have a range that includes mixtures with less 670 nm than normal and protanomals a range that includes mixtures with more 670 nm than normal. Further, the matching luminance of the 589 nm light for deuteranomals is the same as for normals but for protanomals is below normal. An example of an unexpected Rayleigh match, therefore, is a match range above normal (typical of protanomaly) and a normal luminance setting for 589 nm (typical of deuteranomaly), a match called protanomaly "when the red end of the spectrum is not darkened" [Pickford, R.W. (1950). Three pedigrees for color blindness. Nature, 165, 182.]. In this case, Rayleigh matching does not yield a clear diagnosis. Aside from Pickford, we are aware of only one other report of a similar observer [Pokorny, J., & Smith, V. C. (1981). A variant of red-green color defect. Vision Research, 21, 311-317]; this study predated modern genetic techniques that can reveal the cone photopigment(s) in the red-green range. We recently had the opportunity to conduct genetic and psychophysical tests on such an observer. Genetic results predict he is a deuteranope. His Rayleigh match is consistent with L cones and a contribution from rods. Further, with a rod-suppressing background, his Rayleigh match is characteristic of a single L-cone photopigment (deuteranopia).

  15. Mapping quantitative trait loci for binary trait in the F2:3 design.

    PubMed

    Zhu, Chengsong; Zhang, Yuan-Ming; Guo, Zhigang

    2008-12-01

    In the analysis of inheritance of quantitative traits with low heritability, an F(2:3) design that genotypes plants in F(2) and phenotypes plants in the F(2:3) progeny is often used in plant genetics. Although statistical approaches for mapping quantitative trait loci (QTL) in the F(2:3) design have been well developed, approaches for binary traits of biological interest and economic importance are seldom addressed. In this study, an attempt was made to map binary trait loci (BTL) in the F(2:3) design. The fundamental idea is as follows: the F(2) plants are genotyped, the binary phenotype is measured for every plant in each F(2:3) progeny, and these binary trait values together with the marker genotype information are used to detect BTL under penetrance and liability models. The proposed method was verified by a series of Monte Carlo simulation experiments. The results showed that maximum likelihood approaches under the penetrance and liability models provide accurate estimates of the effects and locations of BTL with high statistical power, even under low heritability. Moreover, the penetrance model is as efficient as the liability model, and the F(2:3) design is more efficient than the classical F(2) design, even though only a single progeny is collected from each F(2:3) family. With the maximum likelihood approaches under the penetrance and liability models developed in this study, binary traits can be mapped in the F(2:3) design just as quantitative traits can.
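    To illustrate the liability model mentioned above (a sketch of the general idea, not the authors' F2:3 likelihood), the penetrance of each genotype class can be written as the probability that an underlying normal liability exceeds a fixed threshold. The QTL effect size and threshold below are invented for illustration.

```python
from statistics import NormalDist

def liability_penetrance(genotype_effect, threshold=0.0, residual_sd=1.0):
    """P(affected | genotype) under a threshold liability model:
    liability = genotype_effect + N(0, residual_sd^2); affected if liability > threshold."""
    nd = NormalDist(mu=genotype_effect, sigma=residual_sd)
    return 1.0 - nd.cdf(threshold)

# Hypothetical additive QTL with effect a for the three F2 genotype classes QQ, Qq, qq
a = 0.8
for label, effect in [("QQ", a), ("Qq", 0.0), ("qq", -a)]:
    print(label, round(liability_penetrance(effect), 3))
```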

  16. Personal Exposure to Mixtures of Volatile Organic Compounds: Modeling and Further Analysis of the RIOPA Data

    PubMed Central

    Batterman, Stuart; Su, Feng-Chiao; Li, Shi; Mukherjee, Bhramar; Jia, Chunrong

    2015-01-01

    INTRODUCTION Emission sources of volatile organic compounds (VOCs) are numerous and widespread in both indoor and outdoor environments. Concentrations of VOCs indoors typically exceed outdoor levels, and most people spend nearly 90% of their time indoors. Thus, indoor sources generally contribute the majority of VOC exposures for most people. VOC exposure has been associated with a wide range of acute and chronic health effects; for example, asthma, respiratory diseases, liver and kidney dysfunction, neurologic impairment, and cancer. Although exposures to most VOCs for most persons fall below health-based guidelines, and long-term trends show decreases in ambient emissions and concentrations, a subset of individuals experience much higher exposures that exceed guidelines. Thus, exposure to VOCs remains an important environmental health concern. The present understanding of VOC exposures is incomplete. With the exception of a few compounds, concentration and especially exposure data are limited; and like other environmental data, VOC exposure data can show multiple modes, low and high extreme values, and sometimes a large portion of data below method detection limits (MDLs). Field data also show considerable spatial or interpersonal variability, and although evidence is limited, temporal variability seems high. These characteristics can complicate modeling and other analyses aimed at risk assessment, policy actions, and exposure management. In addition to these analytic and statistical issues, exposure typically occurs as a mixture, and mixture components may interact or jointly contribute to adverse effects. However most pollutant regulations, guidelines, and studies remain focused on single compounds, and thus may underestimate cumulative exposures and risks arising from coexposures. In addition, the composition of VOC mixtures has not been thoroughly investigated, and mixture components show varying and complex dependencies. Finally, although many factors are known to affect VOC exposures, many personal, environmental, and socioeconomic determinants remain to be identified, and the significance and applicability of the determinants reported in the literature are uncertain. To help answer these unresolved questions and overcome limitations of previous analyses, this project used several novel and powerful statistical modeling and analysis techniques and two large data sets. The overall objectives of this project were (1) to identify and characterize exposure distributions (including extreme values), (2) evaluate mixtures (including dependencies), and (3) identify determinants of VOC exposure. METHODS VOC data were drawn from two large data sets: the Relationships of Indoor, Outdoor, and Personal Air (RIOPA) study (1999–2001) and the National Health and Nutrition Examination Survey (NHANES; 1999–2000). The RIOPA study used a convenience sample to collect outdoor, indoor, and personal exposure measurements in three cities (Elizabeth, NJ; Houston, TX; Los Angeles, CA). In each city, approximately 100 households with adults and children who did not smoke were sampled twice for 18 VOCs. In addition, information about 500 variables associated with exposure was collected. The NHANES used a nationally representative sample and included personal VOC measurements for 851 participants. NHANES sampled 10 VOCs in common with RIOPA. Both studies used similar sampling methods and study periods. 
Specific Aim 1 To estimate and model extreme value exposures, extreme value distribution models were fitted to the top 10% and 5% of VOC exposures. Health risks were estimated for individual VOCs and for three VOC mixtures. Simulated extreme value data sets, generated for each VOC and for fitted extreme value and lognormal distributions, were compared with measured concentrations (RIOPA observations) to evaluate each model’s goodness of fit. Mixture distributions were fitted with the conventional finite mixture of normal distributions and the semi-parametric Dirichlet process mixture (DPM) of normal distributions for three individual VOCs (chloroform, 1,4-DCB, and styrene). Goodness of fit for these full distribution models was also evaluated using simulated data. Specific Aim 2 Mixtures in the RIOPA VOC data set were identified using positive matrix factorization (PMF) and by toxicologic mode of action. Dependency structures of a mixture’s components were examined using mixture fractions and were modeled using copulas, which address correlations of multiple components across their entire distributions. Five candidate copulas (Gaussian, t, Gumbel, Clayton, and Frank) were evaluated, and the performance of fitted models was evaluated using simulation and mixture fractions. Cumulative cancer risks were calculated for mixtures, and results from copulas and multivariate lognormal models were compared with risks based on RIOPA observations. Specific Aim 3 Exposure determinants were identified using stepwise regressions and linear mixed-effects models (LMMs). RESULTS Specific Aim 1 Extreme value exposures in RIOPA typically were best fitted by three-parameter generalized extreme value (GEV) distributions, and sometimes by the two-parameter Gumbel distribution. In contrast, lognormal distributions significantly underestimated both the level and likelihood of extreme values. Among the VOCs measured in RIOPA, 1,4-dichlorobenzene (1,4-DCB) was associated with the greatest cancer risks; for example, for the highest 10% of measurements of 1,4-DCB, all individuals had risk levels above 10−4, and 13% of all participants had risk levels above 10−2. Of the full-distribution models, the finite mixture of normal distributions with two to four clusters and the DPM of normal distributions had superior performance in comparison with the lognormal models. DPM distributions provided slightly better fit than the finite mixture distributions; the advantages of the DPM model were avoiding certain convergence issues associated with the finite mixture distributions, adaptively selecting the number of needed clusters, and providing uncertainty estimates. Although the results apply to the RIOPA data set, GEV distributions and mixture models appear more broadly applicable. These models can be used to simulate VOC distributions, which are neither normally nor lognormally distributed, and they accurately represent the highest exposures, which may have the greatest health significance. Specific Aim 2 Four VOC mixtures were identified and apportioned by PMF; they represented gasoline vapor, vehicle exhaust, chlorinated solvents and disinfection byproducts, and cleaning products and odorants. The last mixture (cleaning products and odorants) accounted for the largest fraction of an individual’s total exposure (average of 42% across RIOPA participants). 
Often, a single compound dominated a mixture but the mixture fractions were heterogeneous; that is, the fractions of the compounds changed with the concentration of the mixture. Three VOC mixtures were identified by toxicologic mode of action and represented VOCs associated with hematopoietic, liver, and renal tumors. Estimated lifetime cumulative cancer risks exceeded 10−3 for about 10% of RIOPA participants. The dependency structures of the VOC mixtures in the RIOPA data set fitted Gumbel (two mixtures) and t copulas (four mixtures). These copula types emphasize dependencies found in the upper and lower tails of a distribution. The copulas reproduced both risk predictions and exposure fractions with a high degree of accuracy and performed better than multivariate lognormal distributions. Specific Aim 3 In an analysis focused on the home environment and the outdoor (close to home) environment, home VOC concentrations dominated personal exposures (66% to 78% of the total exposure, depending on VOC); this was largely the result of the amount of time participants spent at home and the fact that indoor concentrations were much higher than outdoor concentrations for most VOCs. In a different analysis focused on the sources inside the home and outside (but close to the home), it was assumed that 100% of VOCs from outside sources would penetrate the home. Outdoor VOC sources accounted for 5% (d-limonene) to 81% (carbon tetrachloride [CTC]) of the total exposure. Personal exposure and indoor measurements had similar determinants depending on the VOC. Gasoline-related VOCs (e.g., benzene and methyl tert-butyl ether [MTBE]) were associated with city, residences with attached garages, pumping gas, wind speed, and home air exchange rate (AER). Odorant and cleaning-related VOCs (e.g., 1,4-DCB and chloroform) also were associated with city, and a residence’s AER, size, and family members showering. Dry-cleaning and industry-related VOCs (e.g., tetrachloroethylene [or perchloroethylene, PERC] and trichloroethylene [TCE]) were associated with city, type of water supply to the home, and visits to the dry cleaner. These and other relationships were significant, they explained from 10% to 40% of the variance in the measurements, and are consistent with known emission sources and those reported in the literature. Outdoor concentrations of VOCs had only two determinants in common: city and wind speed. Overall, personal exposure was dominated by the home setting, although a large fraction of indoor VOC concentrations were due to outdoor sources. City of residence, personal activities, household characteristics, and meteorology were significant determinants. Concentrations in RIOPA were considerably lower than levels in the nationally representative NHANES for all VOCs except MTBE and 1,4-DCB. Differences between RIOPA and NHANES results can be explained by contrasts between the sampling designs and staging in the two studies, and by differences in the demographics, smoking, employment, occupations, and home locations. A portion of these differences are due to the nature of the convenience (RIOPA) and representative (NHANES) sampling strategies used in the two studies. CONCLUSIONS Accurate models for exposure data, which can feature extreme values, multiple modes, data below the MDL, heterogeneous interpollutant dependency structures, and other complex characteristics, are needed to estimate exposures and risks and to develop control and management guidelines and policies. 
Conventional and novel statistical methods were applied to data drawn from two large studies to understand the nature and significance of VOC exposures. Both extreme value distributions and mixture models were found to provide excellent fit to single VOC compounds (univariate distributions), and copulas may be the method of choice for VOC mixtures (multivariate distributions), especially for the highest exposures, which fit parametric models poorly and which may represent the greatest health risk. The identification of exposure determinants, including the influence of both certain activities (e.g., pumping gas) and environments (e.g., residences), provides information that can be used to manage and reduce exposures. The results obtained using the RIOPA data set add to our understanding of VOC exposures and further investigations using a more representative population and a wider suite of VOCs are suggested to extend and generalize results. PMID:25145040
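    A minimal sketch of the Specific Aim 1 comparison is given below: fit a generalized extreme value (GEV) and a lognormal distribution to the upper tail of a concentration sample and compare log-likelihoods. The simulated concentrations are stand-ins for RIOPA measurements, and the distribution choices mirror the abstract, not the report's full procedure.

```python
import numpy as np
from scipy.stats import genextreme, lognorm

rng = np.random.default_rng(3)
conc = rng.lognormal(mean=0.5, sigma=1.0, size=1000)       # stand-in VOC concentrations
tail = np.sort(conc)[-int(0.10 * conc.size):]               # top 10% of exposures

# Fit both candidate models to the extreme-value subset
gev_params = genextreme.fit(tail)
logn_params = lognorm.fit(tail, floc=0)

ll_gev = genextreme.logpdf(tail, *gev_params).sum()
ll_logn = lognorm.logpdf(tail, *logn_params).sum()
print(f"GEV log-lik: {ll_gev:.1f}   lognormal log-lik: {ll_logn:.1f}")
```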

  17. 7 CFR 1400.204 - Limited partnerships, limited liability partnerships, limited liability companies, corporations...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Limited partnerships, limited liability partnerships..., limited liability partnerships, limited liability companies, corporations, and other similar legal entities. (a) A limited partnership, limited liability partnership, limited liability company, corporation...

  18. A Bootstrap Algorithm for Mixture Models and Interval Data in Inter-Comparisons

    DTIC Science & Technology

    2001-07-01

    parametric bootstrap. The present algorithm will be applied to a thermometric inter-comparison, where data cannot be assumed to be normally distributed. 2 Data...experimental methods, used in each laboratory) often imply that the statistical assumptions are not satisfied, as for example in several thermometric ...triangular). Indeed, in thermometric experiments these three probabilistic models can represent several common stochastic variabilities for

  19. The propulsive capability of explosives heavily loaded with inert materials

    NASA Astrophysics Data System (ADS)

    Loiseau, J.; Georges, W.; Frost, D. L.; Higgins, A. J.

    2018-01-01

    The effect of inert dilution on the accelerating ability of high explosives for both grazing and normal detonations was studied. The explosives considered were: (1) neat, amine-sensitized nitromethane (NM), (2) packed beds of glass, steel, or tungsten particles saturated with amine-sensitized NM, (3) NM gelled with PMMA containing dispersed glass microballoons, (4) NM gelled with PMMA containing glass microballoons and steel particles, and (5) C-4 containing varying mass fractions of glass or steel particles. Flyer velocity was measured via photonic Doppler velocimetry, and the results were analysed using a Gurney model augmented to include the influence of the diluent. Reduction in accelerating ability with increasing dilution for the amine-sensitized NM, gelled NM, and C-4 was measured experimentally. Variation of flyer terminal velocity with the ratio of flyer mass to charge mass (M/C) was measured for both grazing and normally incident detonations in gelled NM containing 10% microballoons by mass and for steel beads saturated with amine-sensitized NM. Finally, flyer velocity was measured in grazing versus normal loading for a number of explosive admixtures. The augmented Gurney model predicted the effect of dilution on accelerating ability and the scaling of flyer velocity with M/C for mixtures containing low-density diluents. The augmented Gurney model failed to predict the scaling of flyer velocity with M/C for mixtures heavily loaded with dense diluents. In all cases, normally incident detonations propelled flyers to higher velocity than the equivalent grazing detonations because of material velocity imparted by the incident shock wave and momentum/energy transfer from the slapper used to uniformly initiate the charge.
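    For orientation, the classic symmetric-sandwich Gurney relation (not the augmented, diluent-aware model of the paper) is sketched below. The Gurney velocity value is a generic assumption, not a measured property of the mixtures studied.

```python
def gurney_flyer_velocity(m_over_c, sqrt_2e):
    """Terminal flyer velocity for a symmetric sandwich:
    V = sqrt(2E) * (M/C + 1/3)**-0.5, where sqrt(2E) is the Gurney velocity (km/s)
    and M/C is the flyer-to-charge mass ratio."""
    return sqrt_2e * (m_over_c + 1.0 / 3.0) ** -0.5

# Assumed Gurney velocity of about 2.4 km/s (illustrative, not from the paper)
for mc in (0.2, 0.5, 1.0, 2.0):
    print(mc, round(gurney_flyer_velocity(mc, 2.4), 2), "km/s")
```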

  20. The propulsive capability of explosives heavily loaded with inert materials

    NASA Astrophysics Data System (ADS)

    Loiseau, J.; Georges, W.; Frost, D. L.; Higgins, A. J.

    2018-07-01

    The effect of inert dilution on the accelerating ability of high explosives for both grazing and normal detonations was studied. The explosives considered were: (1) neat, amine-sensitized nitromethane (NM), (2) packed beds of glass, steel, or tungsten particles saturated with amine-sensitized NM, (3) NM gelled with PMMA containing dispersed glass microballoons, (4) NM gelled with PMMA containing glass microballoons and steel particles, and (5) C-4 containing varying mass fractions of glass or steel particles. Flyer velocity was measured via photonic Doppler velocimetry, and the results were analysed using a Gurney model augmented to include the influence of the diluent. Reduction in accelerating ability with increasing dilution for the amine-sensitized NM, gelled NM, and C-4 was measured experimentally. Variation of flyer terminal velocity with the ratio of flyer mass to charge mass (M/C) was measured for both grazing and normally incident detonations in gelled NM containing 10% microballoons by mass and for steel beads saturated with amine-sensitized NM. Finally, flyer velocity was measured in grazing versus normal loading for a number of explosive admixtures. The augmented Gurney model predicted the effect of dilution on accelerating ability and the scaling of flyer velocity with M/C for mixtures containing low-density diluents. The augmented Gurney model failed to predict the scaling of flyer velocity with M/C for mixtures heavily loaded with dense diluents. In all cases, normally incident detonations propelled flyers to higher velocity than the equivalent grazing detonations because of material velocity imparted by the incident shock wave and momentum/energy transfer from the slapper used to uniformly initiate the charge.

  1. Telemetry video-electroencephalography (EEG) in rats, dogs and non-human primates: methods in follow-up safety pharmacology seizure liability assessments.

    PubMed

    Bassett, Leanne; Troncy, Eric; Pouliot, Mylene; Paquette, Dominique; Ascah, Alexis; Authier, Simon

    2014-01-01

    Non-clinical seizure liability studies typically aim to: 1) confirm the nature of EEG activity during abnormal clinical signs, 2) identify premonitory clinical signs, 3) measure plasma levels at seizure onset, 4) demonstrate that drug-induced seizures are self-limiting, 5) confirm that conventional drugs (e.g. diazepam) can treat drug-induced seizures and 6) confirm the no observed adverse effect level (NOAEL) at EEG. Our aim was to characterize several of these items in an original three-species comparative study. Cynomolgus monkeys, Beagle dogs and Sprague-Dawley rats with EEG telemetry transmitters were used to obtain EEG using the 10-20 system. Pentylenetetrazol (PTZ) was used to determine seizure threshold or as a positive seizurogenic agent. Clinical signs were recorded and premonitory signs were evaluated. As a complement, other pharmacological agents were used to illustrate various safety testing strategies. Intravenous PTZ doses required to induce clonic convulsions were 36.1 (3.8), 56.1 (12.7) and 49.4 (11.7) mg/kg in Beagle dogs, cynomolgus monkeys and Sprague-Dawley rats, respectively. Premonitory clinical signs typically included decreased physical activity, enhanced physiological tremors, hypersalivation, ataxia, emesis (except in rats) and myoclonus. In Sprague-Dawley rats, amphetamine (PO) increased high (approximately 40-120 Hz) and decreased low (1-14 Hz) frequencies. In cynomolgus monkeys, caffeine (IM) increased power in high (14-127 Hz) and attenuated power in low (1-13 Hz) frequencies. In the rat PTZ infusion seizure threshold model, yohimbine (SC and IV) and phenobarbital (IP) were confirmed to be reliable positive controls as pro- and anticonvulsants, respectively. Telemetry video-EEG for seizure liability investigations was characterized in three species. Rats represent a first-line model in seizure liability assessments. Beagle dogs are often associated with overt susceptibility to seizure and are typically used in seizure liability studies only if required by regulators. Non-human primates represent an important model in seizure liability assessments given similarities to humans and a high translational potential. Copyright © 2014. Published by Elsevier Inc.

  2. Evidence for a Heritable Brain Basis to Deviance-Promoting Deficits in Self-Control.

    PubMed

    Yancey, James R; Venables, Noah C; Hicks, Brian M; Patrick, Christopher J

    2013-01-01

    Classic criminological theories emphasize the role of impaired self-control in behavioral deviancy. Reduced amplitude of the P300 brain response is reliably observed in individuals with antisocial and substance-related problems, suggesting it may serve as a neurophysiological indicator of deficiencies in self-control that confer liability to deviancy. The current study evaluated the role of self-control capacity - operationalized by scores on a scale measure of trait disinhibition - in mediating the relationship between P300 brain response and behavioral deviancy in a sample of adult twins ( N =419) assessed for symptoms of antisocial/addictive disorders and P300 brain response. As predicted, greater disorder symptoms and higher trait disinhibition scores each predicted smaller P300 amplitude, and trait disinhibition mediated observed relations between antisocial/addictive disorders and P300 response. Further, twin modeling analyses revealed that trait disinhibition scores and disorder symptoms reflected a common genetic liability, and this genetic liability largely accounted for the observed phenotypic relationship between antisocial-addictive problems and P300 brain response. These results provide further evidence that heritable weaknesses in self-control capacity confer liability to antisocial/addictive outcomes and that P300 brain response indexes this dispositional liability.

  3. Dynamics of Aqueous Foam Drops

    NASA Technical Reports Server (NTRS)

    Akhatov, Iskander; McDaniel, J. Gregory; Holt, R. Glynn

    2001-01-01

    We develop a model for the nonlinear oscillations of spherical drops composed of aqueous foam. Beginning with a simple mixture law, and utilizing a mass-conserving bubble-in-cell scheme, we obtain a Rayleigh-Plesset-like equation for the dynamics of bubbles in a foam mixture. The dispersion relation for sound waves in a bubbly liquid is then coupled with a normal modes expansion to derive expressions for the frequencies of eigenmodal oscillations. These eigenmodal (breathing plus higher-order shape modes) frequencies are elicited as a function of the void fraction of the foam. A Mathieu-like equation is obtained for the dynamics of the higher-order shape modes and their parametric coupling to the breathing mode. The proposed model is used to explain recently obtained experimental data.
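    As context for the "Rayleigh-Plesset-like equation" mentioned above, a minimal integration of the standard Rayleigh-Plesset equation for a single gas bubble is sketched below. The foam-specific bubble-in-cell terms of the paper are not included, and all physical parameters are generic water-like values chosen for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic water-like parameters (illustrative assumptions)
rho, sigma, mu = 1000.0, 0.072, 1e-3       # density, surface tension, viscosity (SI)
p_inf, kappa = 101325.0, 1.4               # ambient pressure, polytropic exponent
R0 = 10e-6                                 # equilibrium bubble radius (m)
p_g0 = p_inf + 2 * sigma / R0              # gas pressure at equilibrium

def rayleigh_plesset(t, y):
    """Standard Rayleigh-Plesset: rho*(R*R'' + 1.5*R'^2) = p_gas - p_inf - 2*sigma/R - 4*mu*R'/R."""
    R, Rdot = y
    p_gas = p_g0 * (R0 / R) ** (3 * kappa)
    Rddot = ((p_gas - p_inf - 2 * sigma / R - 4 * mu * Rdot / R) / rho
             - 1.5 * Rdot ** 2) / R
    return [Rdot, Rddot]

# Start slightly compressed and watch the bubble oscillate about equilibrium
sol = solve_ivp(rayleigh_plesset, (0.0, 5e-6), [0.9 * R0, 0.0], max_step=1e-9)
print(sol.y[0].min(), sol.y[0].max())
```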

  4. The CLASSY clustering algorithm: Description, evaluation, and comparison with the iterative self-organizing clustering system (ISOCLS). [used for LACIE data

    NASA Technical Reports Server (NTRS)

    Lennington, R. K.; Malek, H.

    1978-01-01

    A clustering method, CLASSY, was developed, which alternates maximum likelihood iteration with a procedure for splitting, combining, and eliminating the resulting statistics. The method maximizes the fit of a mixture of normal distributions to the observed first through fourth central moments of the data and produces an estimate of the proportions, means, and covariances in this mixture. The mathematical model that is the basis for CLASSY and the actual operation of the algorithm are described. Data comparing the performance of CLASSY and ISOCLS on simulated and actual LACIE data are presented.
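    CLASSY itself is not publicly packaged; as a rough modern analogue of fitting a mixture of normals by maximum likelihood, a scikit-learn Gaussian mixture sketch is shown below, with BIC used to pick the number of components in place of CLASSY's split/combine/eliminate steps. The simulated two-class data are placeholders for LACIE-style measurements.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated two-class "pixel" data standing in for LACIE-style measurements
rng = np.random.default_rng(4)
X = np.vstack([rng.normal([0, 0], 1.0, size=(300, 2)),
               rng.normal([4, 3], 0.7, size=(200, 2))])

# Choose the number of components by BIC, then report mixing proportions and means
fits = {k: GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 6)}
best_k = min(fits, key=lambda k: fits[k].bic(X))
best = fits[best_k]
print(best_k, best.weights_.round(2), best.means_.round(2))
```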

  5. 14 CFR 23.1147 - Mixture controls.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Mixture controls. 23.1147 Section 23.1147... STANDARDS: NORMAL, UTILITY, ACROBATIC, AND COMMUTER CATEGORY AIRPLANES Powerplant Powerplant Controls and Accessories § 23.1147 Mixture controls. (a) If there are mixture controls, each engine must have a separate...

  6. 14 CFR 23.1147 - Mixture controls.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Mixture controls. 23.1147 Section 23.1147... STANDARDS: NORMAL, UTILITY, ACROBATIC, AND COMMUTER CATEGORY AIRPLANES Powerplant Powerplant Controls and Accessories § 23.1147 Mixture controls. (a) If there are mixture controls, each engine must have a separate...

  7. Mixture modeling methods for the assessment of normal and abnormal personality, part I: cross-sectional models.

    PubMed

    Hallquist, Michael N; Wright, Aidan G C

    2014-01-01

    Over the past 75 years, the study of personality and personality disorders has been informed considerably by an impressive array of psychometric instruments. Many of these tests draw on the perspective that personality features can be conceptualized in terms of latent traits that vary dimensionally across the population. A purely trait-oriented approach to personality, however, might overlook heterogeneity that is related to similarities among subgroups of people. This article describes how factor mixture modeling (FMM), which incorporates both categories and dimensions, can be used to represent person-oriented and trait-oriented variability in the latent structure of personality. We provide an overview of different forms of FMM that vary in the degree to which they emphasize trait- versus person-oriented variability. We also provide practical guidelines for applying FMM to personality data, and we illustrate model fitting and interpretation using an empirical analysis of general personality dysfunction.

  8. Estimating Mixture of Gaussian Processes by Kernel Smoothing

    PubMed Central

    Huang, Mian; Li, Runze; Wang, Hansheng; Yao, Weixin

    2014-01-01

    When the functional data are not homogeneous, e.g., there exist multiple classes of functional curves in the dataset, traditional estimation methods may fail. In this paper, we propose a new estimation procedure for the Mixture of Gaussian Processes, to incorporate both functional and inhomogeneous properties of the data. Our method can be viewed as a natural extension of high-dimensional normal mixtures. However, the key difference is that smoothed structures are imposed for both the mean and covariance functions. The model is shown to be identifiable, and can be estimated efficiently by a combination of the ideas from EM algorithm, kernel regression, and functional principal component analysis. Our methodology is empirically justified by Monte Carlo simulations and illustrated by an analysis of a supermarket dataset. PMID:24976675

  9. 14 CFR 27.1147 - Mixture controls.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Mixture controls. 27.1147 Section 27.1147... STANDARDS: NORMAL CATEGORY ROTORCRAFT Powerplant Powerplant Controls and Accessories § 27.1147 Mixture controls. If there are mixture controls, each engine must have a separate control and the controls must be...

  10. Impact of distance-based metric learning on classification and visualization model performance and structure-activity landscapes.

    PubMed

    Kireeva, Natalia V; Ovchinnikova, Svetlana I; Kuznetsov, Sergey L; Kazennov, Andrey M; Tsivadze, Aslan Yu

    2014-02-01

    This study concerns the large margin nearest neighbors classifier and its multi-metric extension as efficient approaches for metric learning, which aim to learn an appropriate distance/similarity function for the case studies considered. In recent years, many studies in data mining and pattern recognition have demonstrated that a learned metric can significantly improve performance in classification, clustering and retrieval tasks. The paper describes the application of the metric learning approach to in silico assessment of chemical liabilities. Chemical liabilities, such as adverse effects and toxicity, play a significant role in the drug discovery process, and their in silico assessment is an important step aimed at reducing costs and animal testing by complementing or replacing in vitro and in vivo experiments. Here, to our knowledge for the first time, distance-based metric learning procedures have been applied to the in silico assessment of chemical liabilities, the impact of metric learning on structure-activity landscapes and on the predictive performance of the developed models has been analyzed, and the learned metric has been used in support vector machines. The metric learning results are illustrated using linear and non-linear data visualization techniques in order to show how the change of metric affects nearest-neighbor relations and the descriptor space.

  11. Impact of distance-based metric learning on classification and visualization model performance and structure-activity landscapes

    NASA Astrophysics Data System (ADS)

    Kireeva, Natalia V.; Ovchinnikova, Svetlana I.; Kuznetsov, Sergey L.; Kazennov, Andrey M.; Tsivadze, Aslan Yu.

    2014-02-01

    This study concerns the large margin nearest neighbors classifier and its multi-metric extension as efficient approaches for metric learning, which aim to learn an appropriate distance/similarity function for the case studies considered. In recent years, many studies in data mining and pattern recognition have demonstrated that a learned metric can significantly improve performance in classification, clustering and retrieval tasks. The paper describes the application of the metric learning approach to in silico assessment of chemical liabilities. Chemical liabilities, such as adverse effects and toxicity, play a significant role in the drug discovery process, and their in silico assessment is an important step aimed at reducing costs and animal testing by complementing or replacing in vitro and in vivo experiments. Here, to our knowledge for the first time, distance-based metric learning procedures have been applied to the in silico assessment of chemical liabilities, the impact of metric learning on structure-activity landscapes and on the predictive performance of the developed models has been analyzed, and the learned metric has been used in support vector machines. The metric learning results are illustrated using linear and non-linear data visualization techniques in order to show how the change of metric affects nearest-neighbor relations and the descriptor space.
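    A minimal sketch in the spirit of the workflow described in the two records above, using scikit-learn's Neighborhood Components Analysis as a stand-in for the large-margin nearest-neighbor metric learner and feeding the learned transformation into an SVM. The random data are placeholders for chemical descriptors and liability labels, not a real dataset.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NeighborhoodComponentsAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder descriptor matrix and binary liability labels (assumed data)
X, y = make_classification(n_samples=400, n_features=30, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Learn a linear metric (NCA) and classify in the transformed descriptor space
model = make_pipeline(StandardScaler(),
                      NeighborhoodComponentsAnalysis(n_components=10, random_state=0),
                      SVC(kernel="rbf", C=1.0))
model.fit(X_tr, y_tr)
print("held-out accuracy:", round(model.score(X_te, y_te), 3))
```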

  12. Decision support system and medical liability.

    PubMed Central

    Allaërt, F. A.; Dusserre, L.

    1992-01-01

    Expert systems, which are becoming an essential tool in medicine, are evolving in terms of both the sophistication of knowledge representation and the types of reasoning models used. The more efficient they are, the more often they will be used, and professional liability will be engaged. After a short survey of the configuration and operation of expert systems, the authors examine the liabilities of the people who build and who use expert systems with respect to various possible malfunctions. Expert systems should be considered only as support for human decisions and should not hold any authority of their own; physicians must therefore keep in mind that the responsibility remains their own and must retain their judgment and critical sense. Other professionals could nevertheless be held liable if they participated in building the expert system. The different liabilities and the burden of proof are discussed for several possible malfunctions. In any case, the final proof lies within the expert system itself through re-computation of the data. PMID:1482972

  13. Technological Advances in Cardiovascular Safety Assessment Decrease Preclinical Animal Use and Improve Clinical Relevance.

    PubMed

    Berridge, Brian R; Schultze, A Eric; Heyen, Jon R; Searfoss, George H; Sarazan, R Dustan

    2016-12-01

    Cardiovascular (CV) safety liabilities are significant concerns for drug developers and preclinical animal studies are predominately where those liabilities are characterized before patient exposures. Steady progress in technology and laboratory capabilities is enabling a more refined and informative use of animals in those studies. The application of surgically implantable and telemetered instrumentation in the acute assessment of drug effects on CV function has significantly improved historical approaches that involved anesthetized or restrained animals. More chronically instrumented animals and application of common clinical imaging assessments like echocardiography and MRI extend functional and in-life structural assessments into the repeat-dose setting. A growing portfolio of circulating CV biomarkers is allowing longitudinal and repeated measures of cardiac and vascular injury and dysfunction better informing an understanding of temporal pathogenesis and allowing earlier detection of undesirable effects. In vitro modeling systems of the past were limited by their lack of biological relevance to the in vivo human condition. Advances in stem cell technology and more complex in vitro modeling platforms are quickly creating more opportunity to supplant animals in our earliest assessments for liabilities. Continuing improvement in our capabilities in both animal and nonanimal modeling should support a steady decrease in animal use for primary liability identification and optimize the translational relevance of the animal studies we continue to do. © The Author 2016. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  14. Use of an Amino Acid Mixture in Treatment of Phenylketonuria

    PubMed Central

    Bentovim, A.; Clayton, Barbara E.; Francis, Dorothy E. M.; Shepherd, Jean; Wolff, O. H.

    1970-01-01

    Twelve children with phenylketonuria diagnosed and treated from the first few weeks of life were grouped into pairs. Before the trial all of them were receiving a commercial preparation containing a protein hydrolysate low in phenylalanine (Cymogran, Allen and Hanburys Ltd.) as a substitute for natural protein. One of each pair was given an amino acid mixture instead of Cymogran for about 6 months. Use of the mixture involved considerable modification of the diet, and in particular the inclusion of greater amounts of phenylalanine-free foods. All six accepted the new mixture without difficulty, food problems were greatly reduced, parents welcomed the new preparation, and the quality of family life improved. Normal growth was maintained and with a mixture of l amino acids the plasma and urinary amino acid levels were normal. Further studies are needed before the mixture can be recommended for children under 20 months of age. PMID:5477678

  15. Unsupervised Gaussian Mixture-Model With Expectation Maximization for Detecting Glaucomatous Progression in Standard Automated Perimetry Visual Fields.

    PubMed

    Yousefi, Siamak; Balasubramanian, Madhusudhanan; Goldbaum, Michael H; Medeiros, Felipe A; Zangwill, Linda M; Weinreb, Robert N; Liebmann, Jeffrey M; Girkin, Christopher A; Bowd, Christopher

    2016-05-01

    To validate Gaussian mixture-model with expectation maximization (GEM) and variational Bayesian independent component analysis mixture-models (VIM) for detecting glaucomatous progression along visual field (VF) defect patterns (GEM-progression of patterns (POP) and VIM-POP). To compare GEM-POP and VIM-POP with other methods. GEM and VIM models separated cross-sectional abnormal VFs from 859 eyes and normal VFs from 1117 eyes into abnormal and normal clusters. Clusters were decomposed into independent axes. The confidence limit (CL) of stability was established for each axis with a set of 84 stable eyes. Sensitivity for detecting progression was assessed in a sample of 83 eyes with known progressive glaucomatous optic neuropathy (PGON). Eyes were classified as progressed if any defect pattern progressed beyond the CL of stability. Performance of GEM-POP and VIM-POP was compared to point-wise linear regression (PLR), permutation analysis of PLR (PoPLR), and linear regression (LR) of mean deviation (MD), and visual field index (VFI). Sensitivity and specificity for detecting glaucomatous VFs were 89.9% and 93.8%, respectively, for GEM and 93.0% and 97.0%, respectively, for VIM. Receiver operating characteristic (ROC) curve areas for classifying progressed eyes were 0.82 for VIM-POP, 0.86 for GEM-POP, 0.81 for PoPLR, 0.69 for LR of MD, and 0.76 for LR of VFI. GEM-POP was significantly more sensitive to PGON than PoPLR and linear regression of MD and VFI in our sample, while providing localized progression information. Detection of glaucomatous progression can be improved by assessing longitudinal changes in localized patterns of glaucomatous defect identified by unsupervised machine learning.

  16. Selecting statistical model and optimum maintenance policy: a case study of hydraulic pump.

    PubMed

    Ruhi, S; Karim, M R

    2016-01-01

    A proper maintenance policy can play a vital role in the effective investigation of product reliability. Every engineered object, such as a product, plant or infrastructure, needs preventive and corrective maintenance. In this paper we look at a real case study dealing with the maintenance of hydraulic pumps used in excavators by a mining company. We obtained the data that the owner had collected and carried out an analysis, building models for pump failures. The data consist of both failure and censored lifetimes of the hydraulic pump. Different competing mixture models are applied to analyze a set of maintenance data for a hydraulic pump. Various characteristics of the mixture models, such as the cumulative distribution function, reliability function and mean time to failure, are estimated to assess the reliability of the pump. The Akaike Information Criterion, the adjusted Anderson-Darling test statistic, the Kolmogorov-Smirnov test statistic and the root mean square error are used to select a suitable model among a set of competing models. The maximum likelihood estimation method, via the EM algorithm, is applied to estimate the parameters of the models and the reliability-related quantities. In this study, it is found that a threefold mixture model (Weibull-Normal-Exponential) fits the hydraulic pump failure data well. The paper also illustrates how a suitable statistical model can be applied to estimate the optimum maintenance period at minimum cost for a hydraulic pump.
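    To make the threefold mixture concrete, a small sketch of the reliability (survival) function of a Weibull-Normal-Exponential mixture is given below. The mixing weights and component parameters are invented for illustration, not the estimates obtained from the pump data.

```python
import numpy as np
from scipy.stats import weibull_min, norm, expon

def mixture_reliability(t, w, weib, normal, exp_scale):
    """Survival function of a Weibull-Normal-Exponential mixture at time(s) t."""
    t = np.asarray(t, dtype=float)
    shape, scale = weib
    mu, sd = normal
    return (w[0] * weibull_min.sf(t, shape, scale=scale)
            + w[1] * norm.sf(t, loc=mu, scale=sd)
            + w[2] * expon.sf(t, scale=exp_scale))

# Hypothetical parameters in operating hours; weights sum to 1
t_grid = np.array([500, 1000, 2000, 4000])
print(mixture_reliability(t_grid, w=[0.5, 0.3, 0.2],
                          weib=(1.8, 3000.0), normal=(2500.0, 600.0), exp_scale=5000.0))
```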

  17. Determining prescription durations based on the parametric waiting time distribution.

    PubMed

    Støvring, Henrik; Pottegård, Anton; Hallas, Jesper

    2016-12-01

    The purpose of the study is to develop a method to estimate the duration of single prescriptions in pharmacoepidemiological studies when the single prescription duration is not available. We developed an estimation algorithm based on maximum likelihood estimation of a parametric two-component mixture model for the waiting time distribution (WTD). The distribution component for prevalent users estimates the forward recurrence density (FRD), which is related to the distribution of time between subsequent prescription redemptions, the inter-arrival density (IAD), for users in continued treatment. We exploited this to estimate percentiles of the IAD by inversion of the estimated FRD and defined the duration of a prescription as the time within which 80% of current users will have presented themselves again. Statistical properties were examined in simulation studies, and the method was applied to empirical data for four model drugs: non-steroidal anti-inflammatory drugs (NSAIDs), warfarin, bendroflumethiazide, and levothyroxine. Simulation studies found negligible bias when the data-generating model for the IAD coincided with the FRD used in the WTD estimation (Log-Normal). When the IAD consisted of a mixture of two Log-Normal distributions, but was analyzed with a single Log-Normal distribution, relative bias did not exceed 9%. Using a Log-Normal FRD, we estimated prescription durations of 117, 91, 137, and 118 days for NSAIDs, warfarin, bendroflumethiazide, and levothyroxine, respectively. Similar results were found with a Weibull FRD. The algorithm allows valid estimation of single prescription durations, especially when the WTD reliably separates current users from incident users, and may replace ad-hoc decision rules in automated implementations. Copyright © 2016 John Wiley & Sons, Ltd.
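    For intuition, the renewal-theory relationship behind the method is that the forward recurrence density equals the survival function of the inter-arrival density divided by its mean. A small sketch under a Log-Normal IAD assumption is shown below, with the 80% rule applied to the IAD to define a prescription duration; the parameters are illustrative, not estimates from the registry data.

```python
from scipy.stats import lognorm

# Assumed Log-Normal inter-arrival density (IAD) of times between redemptions (days)
sigma_log, median_days = 0.5, 100.0
iad = lognorm(s=sigma_log, scale=median_days)

def forward_recurrence_pdf(t, dist):
    """f_FRD(t) = S_IAD(t) / E[IAD]  (renewal-theory forward recurrence density)."""
    return dist.sf(t) / dist.mean()

# Prescription duration: time within which 80% of current users redeem again
duration = iad.ppf(0.80)
print(round(duration, 1), "days;  FRD density at t=30:",
      round(forward_recurrence_pdf(30.0, iad), 5))
```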

  18. Medical teams and the standard of care in negligence.

    PubMed

    Sappideen, Carolyn

    2015-09-01

    Medical teams are essential to the delivery of modern, patient-centred health care in hospitals. A collective model of responsibility envisaged by team care is inconsistent with common law tort liability which focuses on the individual rather than the team. There is no basis upon which a team can be liable as a collective at common law. Nor does the common law countenance liability for the conduct of other team members absent some form of agency, vicarious liability or non-delegable duty. Despite the barriers to the adoption of a team standard of care in negligence, there is scope for team factors to have a role in determining the standard of care so that being a team player is part and parcel of what it is to be a competent professional. If this is the case, the skill set, and the standard of care expected of the individual professional, includes skills based on team models of communication, cross-monitoring and trust.

  19. Evidence for a Heritable Brain Basis to Deviance-Promoting Deficits in Self-Control

    PubMed Central

    Yancey, James R.; Venables, Noah C.; Hicks, Brian M.; Patrick, Christopher J.

    2013-01-01

    Purpose Classic criminological theories emphasize the role of impaired self-control in behavioral deviancy. Reduced amplitude of the P300 brain response is reliably observed in individuals with antisocial and substance-related problems, suggesting it may serve as a neurophysiological indicator of deficiencies in self-control that confer liability to deviancy. Methods The current study evaluated the role of self-control capacity — operationalized by scores on a scale measure of trait disinhibition — in mediating the relationship between P300 brain response and behavioral deviancy in a sample of adult twins (N=419) assessed for symptoms of antisocial/addictive disorders and P300 brain response. Results As predicted, greater disorder symptoms and higher trait disinhibition scores each predicted smaller P300 amplitude, and trait disinhibition mediated observed relations between antisocial/addictive disorders and P300 response. Further, twin modeling analyses revealed that trait disinhibition scores and disorder symptoms reflected a common genetic liability, and this genetic liability largely accounted for the observed phenotypic relationship between antisocial-addictive problems and P300 brain response. Conclusions These results provide further evidence that heritable weaknesses in self-control capacity confer liability to antisocial/addictive outcomes and that P300 brain response indexes this dispositional liability. PMID:24187392

  20. A Preliminary Comparison of the Effectiveness of Cluster Analysis Weighting Procedures for Within-Group Covariance Structure.

    ERIC Educational Resources Information Center

    Donoghue, John R.

    A Monte Carlo study compared the usefulness of six variable weighting methods for cluster analysis. Data were 100 bivariate observations from 2 subgroups, generated according to a finite normal mixture model. Subgroup size, within-group correlation, within-group variance, and distance between subgroup centroids were manipulated. Of the clustering…

  1. Similar precipitated withdrawal effects on intracranial self-stimulation during chronic infusion of an e-cigarette liquid or nicotine alone.

    PubMed

    Harris, A C; Muelken, P; Smethells, J R; Krueger, M; LeSage, M G

    2017-10-01

    The FDA recently extended their regulatory authority to electronic cigarettes (ECs). Because the abuse liability of ECs is a leading concern of the FDA, animal models are urgently needed to identify factors that influence the relative abuse liability of these products. The ability of tobacco products to induce nicotine dependence, defined by the emergence of anhedonia and other symptoms of nicotine withdrawal following cessation of their use, contributes to tobacco abuse liability. The present study compared the severity of precipitated withdrawal during chronic infusion of nicotine alone or nicotine-dose equivalent concentrations of three different EC refill liquids in rats, as indicated by elevations in intracranial self-stimulation (ICSS) thresholds (anhedonia-like behavior). Because these EC liquids contain constituents that may enhance their abuse liability (e.g., minor alkaloids), we hypothesized that they would be associated with greater withdrawal effects than nicotine alone. Results indicated that the nicotinic acetylcholine receptor antagonist mecamylamine precipitated elevations in ICSS thresholds in rats receiving a chronic infusion of nicotine alone or EC liquids (3.2mg/kg/day, via osmotic pump). Magnitude of this effect did not differ between formulations. Our findings indicate that nicotine alone is the primary CNS determinant of the ability of ECs to engender dependence. Combined with our previous findings that nicotine alone and these EC liquids do not differ in other preclinical addiction models, these data suggest that product standards set by the FDA to reduce EC abuse liability should primarily target nicotine, other constituents with peripheral sensory effects (e.g. flavorants), and factors that influence product appeal (e.g., marketing). Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Soft Mixer Assignment in a Hierarchical Generative Model of Natural Scene Statistics

    PubMed Central

    Schwartz, Odelia; Sejnowski, Terrence J.; Dayan, Peter

    2010-01-01

    Gaussian scale mixture models offer a top-down description of signal generation that captures key bottom-up statistical characteristics of filter responses to images. However, the pattern of dependence among the filters for this class of models is prespecified. We propose a novel extension to the gaussian scale mixture model that learns the pattern of dependence from observed inputs and thereby induces a hierarchical representation of these inputs. Specifically, we propose that inputs are generated by gaussian variables (modeling local filter structure), multiplied by a mixer variable that is assigned probabilistically to each input from a set of possible mixers. We demonstrate inference of both components of the generative model, for synthesized data and for different classes of natural images, such as a generic ensemble and faces. For natural images, the mixer variable assignments show invariances resembling those of complex cells in visual cortex; the statistics of the gaussian components of the model are in accord with the outputs of divisive normalization models. We also show how our model helps interrelate a wide range of models of image statistics and cortical processing. PMID:16999575
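
    A minimal generative sketch of the Gaussian scale mixture idea described above, with a small discrete set of mixer values assigned probabilistically; the mixer values and probabilities are assumptions for illustration, and the authors' inference scheme is not reproduced.

      # Toy Gaussian scale mixture: each filter response x_i = v * g_i, where g_i is
      # Gaussian (local structure) and v is a mixer shared within a patch.
      import numpy as np
      from scipy.stats import kurtosis

      rng = np.random.default_rng(1)
      mixers = np.array([0.5, 1.0, 3.0])        # possible mixer values (assumed)
      mixer_probs = np.array([0.5, 0.3, 0.2])   # assignment probabilities (assumed)

      def sample_patch(n_filters=4, n_samples=5000):
          """Draw filter-response vectors; all filters in a patch share one mixer."""
          v = rng.choice(mixers, size=n_samples, p=mixer_probs)   # shared mixer per sample
          g = rng.normal(size=(n_samples, n_filters))             # Gaussian components
          return v[:, None] * g

      x = sample_patch()
      # Sharing a mixer induces the heavy tails and joint dependence seen in
      # natural-image filter responses: excess kurtosis well above 0 for each filter.
      print("excess kurtosis per filter:", np.round(kurtosis(x, axis=0), 2))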

  3. A Zero- and K-Inflated Mixture Model for Health Questionnaire Data

    PubMed Central

    Finkelman, Matthew D.; Green, Jennifer Greif; Gruber, Michael J.; Zaslavsky, Alan M.

    2011-01-01

    In psychiatric assessment, Item Response Theory (IRT) is a popular tool to formalize the relation between the severity of a disorder and associated responses to questionnaire items. Practitioners of IRT sometimes make the assumption of normally distributed severities within a population; while convenient, this assumption is often violated when measuring psychiatric disorders. Specifically, there may be a sizable group of respondents whose answers place them at an extreme of the latent trait spectrum. In this article, a zero- and K-inflated mixture model is developed to account for the presence of such respondents. The model is fitted using an expectation-maximization (E-M) algorithm to estimate the percentage of the population at each end of the continuum, concurrently analyzing the remaining “graded component” via IRT. A method to perform factor analysis for only the graded component is introduced. In assessments of oppositional defiant disorder and conduct disorder, the zero- and K-inflated model exhibited better fit than the standard IRT model. PMID:21365673
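
    A highly simplified sketch of the inflation mechanics described above: total scores on K binary items are modeled as a mixture of a point mass at 0, a point mass at K, and a binomial "graded" component fitted by EM. The article's graded component is a full IRT model; the binomial here is only a stand-in, and all quantities are simulated.

      import numpy as np
      from scipy.stats import binom

      rng = np.random.default_rng(2)
      K = 10
      # Simulate: 30% structural zeros, 10% structural K's, 60% graded respondents
      comp = rng.choice(3, size=2000, p=[0.3, 0.1, 0.6])
      s = np.where(comp == 0, 0, np.where(comp == 1, K, rng.binomial(K, 0.35, 2000)))

      pi = np.array([1 / 3, 1 / 3, 1 / 3])   # mixing weights: zero, K, graded
      theta = 0.5
      for _ in range(200):
          # E-step: responsibilities of the zero, K and graded components
          lik = np.column_stack([
              (s == 0).astype(float),
              (s == K).astype(float),
              binom.pmf(s, K, theta),
          ]) * pi
          resp = lik / lik.sum(axis=1, keepdims=True)
          # M-step: update mixing weights and the graded component's success probability
          pi = resp.mean(axis=0)
          theta = np.sum(resp[:, 2] * s) / (K * np.sum(resp[:, 2]))

      print("estimated weights (zero, K, graded):", np.round(pi, 3), "theta:", round(theta, 3))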

  4. Impact of Lead Time and Safety Factor in Mixed Inventory Models with Backorder Discounts

    NASA Astrophysics Data System (ADS)

    Lo, Ming-Cheng; Chao-Hsien Pan, Jason; Lin, Kai-Cing; Hsu, Jia-Wei

    This study investigates the impact of safety factor on the continuous review inventory model involving controllable lead time with mixture of backorder discount and partial lost sales. The objective is to minimize the expected total annual cost with respect to order quantity, backorder price discount, safety factor and lead time. A model with normal demand is also discussed. Numerical examples are presented to illustrate the procedures of the algorithms and the effects of parameters on the result of the proposed models are analyzed.

  5. Nutritional support contributes to recuperation in a rat model of aplastic anemia by enhancing mitochondrial function.

    PubMed

    Yang, Guang; Zhao, Lifen; Liu, Bing; Shan, Yujia; Li, Yang; Zhou, Huimin; Jia, Li

    2018-02-01

    Acquired aplastic anemia (AA) is a hematopoietic stem cell disease that leads to hematopoietic disorder and peripheral blood pancytopenia. We investigated whether nutritional support is helpful to AA recovery. We established a rat model with AA. A nutrient mixture was administered to rats with AA by gavage at different doses once per day for 55 d. Animals in this study were assigned to one of five groups: normal control (NC; group includes normal rats); AA (rats with AA); high dose (AA + nutritional mixture, 2266.95 mg/kg/d); medium dose (1511.3 mg/kg/d); and low dose (1057.91 mg/kg/d). The effects of nutrition administration on general status and mitochondrial function of rats with AA were evaluated. The nutrient mixture with which the rats were supplemented significantly improved weight, peripheral blood parameters, and histologic parameters of rats with AA in a dose-dependent manner. Furthermore, transmission electron microscopy analysis showed that the number of mitochondria in the liver, spleen, kidney, and brain was increased after supplementation. Nutrient administration also improved mitochondrial DNA content, adenosine triphosphate content, and membrane potential but inhibited oxidative stress, thus repairing the mitochondrial dysfunction of the rats with AA. Taken together, nutrition supplements may contribute to the improvement of mitochondrial function and play an important role in the recuperation of rats with AA. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. [Health promotion in gas industrial workers].

    PubMed

    Volodina, E P; Tiagnenko, V A; Novikov, I V

    2004-01-01

    Health promotion for workers of the limited liability company "Astrakhangazprom", delivered at their workplaces (without interrupting work) using Russian-made nutrient complexes (including omega-3) enriched with vitamins, macro- and microelements, yielded a positive therapeutic effect: it improved health status, normalized and improved laboratory and instrumental findings, and reduced cases of sickness-related temporary disability. The duration of a health promotion course was 2 months.

  7. Improving the developability profile of pyrrolidine progesterone receptor partial agonists

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kallander, Lara S.; Washburn, David G.; Hoang, Tram H.

    2010-09-17

    The previously reported pyrrolidine class of progesterone receptor partial agonists demonstrated excellent potency but suffered from serious liabilities including hERG blockade and high volume of distribution in the rat. The basic pyrrolidine amine was intentionally converted to a sulfonamide, carbamate, or amide to address these liabilities. The evaluation of the degree of partial agonism for these non-basic pyrrolidine derivatives and the demonstration of their efficacy in an in vivo model of endometriosis are disclosed herein.

  8. The developmental association between eating disorders symptoms and symptoms of depression and anxiety in juvenile twin girls.

    PubMed

    Silberg, Judy L; Bulik, Cynthia M

    2005-12-01

    We investigated the role of genetic and environmental factors in the developmental association among symptoms of eating disorders, depression, and anxiety syndromes in 8-13-year-old and 14-17-year-old twin girls. Multivariate genetic models were fitted to child-reported longitudinal symptom data gathered from clinical interview on 408 MZ and 198 DZ female twin pairs from the Virginia Twin Study of Adolescent Behavioural Development (VTSABD). Model-fitting revealed distinct etiological patterns underlying the association among symptoms of eating disorders, depression, overanxious disorder (OAD), and separation anxiety disorder (SAD) during the course of development: 1) a common genetic factor influencing liability to all symptoms - of early and later OAD, depression, SAD, and eating symptoms; 2) a distinct genetic factor specifically indexing liability to early eating disorders symptoms; 3) a shared environmental factor specifically influencing early depression and early eating disorders symptoms; and 4) a common environmental factor affecting liability to symptoms of later eating disorders and both early and later separation anxiety. These results suggest a pervasive genetic effect that influences liability to symptoms of over-anxiety, separation anxiety, depression, and eating disorder throughout development, a shared environmental influence on later adolescent eating problems and persistent separation anxiety, genetic influences specific to early eating disorders symptoms, and a shared environmental factor influencing symptoms of early eating and depression.

  9. A mixture model-based approach to the clustering of microarray expression data.

    PubMed

    McLachlan, G J; Bean, R W; Peel, D

    2002-03-01

    This paper introduces the software EMMIX-GENE that has been developed for the specific purpose of a model-based approach to the clustering of microarray expression data, in particular, of tissue samples on a very large number of genes. The latter is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. A feasible approach is provided by first selecting a subset of the genes relevant for the clustering of the tissue samples by fitting mixtures of t distributions to rank the genes in order of increasing size of the likelihood ratio statistic for the test of one versus two components in the mixture model. The imposition of a threshold on the likelihood ratio statistic used in conjunction with a threshold on the size of a cluster allows the selection of a relevant set of genes. However, even this reduced set of genes will usually be too large for a normal mixture model to be fitted directly to the tissues, and so the use of mixtures of factor analyzers is exploited to reduce effectively the dimension of the feature space of genes. The usefulness of the EMMIX-GENE approach for the clustering of tissue samples is demonstrated on two well-known data sets on colon and leukaemia tissues. For both data sets, relevant subsets of the genes are able to be selected that reveal interesting clusterings of the tissues that are either consistent with the external classification of the tissues or with background and biological knowledge of these sets. EMMIX-GENE is available at http://www.maths.uq.edu.au/~gjm/emmix-gene/
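
    A sketch of the gene-selection step in the spirit of EMMIX-GENE, using normal mixtures from scikit-learn as a stand-in for the mixtures of t distributions that the package actually fits; the thresholding and factor-analyzer stages are omitted, and the data are simulated.

      # Rank genes by the likelihood-ratio statistic for one versus two mixture
      # components across the tissue samples (normal mixtures used for illustration).
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(3)
      n_tissues, n_genes = 60, 200
      X = rng.normal(size=(n_tissues, n_genes))
      X[:30, :10] += 2.5          # make the first 10 genes bimodal across two hidden classes

      def lr_statistic(y):
          y = y.reshape(-1, 1)
          ll1 = GaussianMixture(1, random_state=0).fit(y).score(y) * len(y)
          ll2 = GaussianMixture(2, n_init=3, random_state=0).fit(y).score(y) * len(y)
          return 2.0 * (ll2 - ll1)

      lrt = np.array([lr_statistic(X[:, j]) for j in range(n_genes)])
      top = np.argsort(lrt)[::-1][:10]
      print("genes ranked most 'mixture-like':", sorted(top.tolist()))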

  10. THE LIABILITY FORMS OF THE MEDICAL PERSONNEL.

    PubMed

    Bărcan, Cristian

    2015-01-01

    Current legislation, namely Law no. 95/2006 on healthcare reform in the medical malpractice domain, stipulates that medical staff can be held accountable in the following forms: disciplinary liability, administrative liability, civil liability and criminal liability. Each form of legal liability has its own features, found mainly in the procedural rules. However, the differences between the various legal forms of liability lie not only in the procedural rules but also in their effects and consequences. It is necessary to know what the procedure is for disciplinary, administrative, civil, or criminal liability. In addition to the differentiation determined by the consequences that may arise from the different forms of legal liability, it is important to know which authorities are competent to investigate a case and what solutions the various public institutions can adopt regarding the medical staff. Each type of legal liability falls under a specialized authority. The Disciplinary Committee of the College of Physicians, for example, may not intervene in cases that come before the committee for monitoring and professional competence in malpractice cases. Neither of these committees can intervene directly in the legal assessment of civil or criminal cases, just as criminal investigation authorities cannot intervene in strictly civil cases. Knowing which institutions are competent is therefore imperative.

  11. Motivation, emotion regulation, and the latent structure of psychopathology: An integrative and convergent historical perspective.

    PubMed

    Beauchaine, Theodore P; Zisner, Aimee

    2017-09-01

    Motivational models of psychopathology have long been advanced by psychophysiologists, and have provided key insights into neurobiological mechanisms of a wide range of psychiatric disorders. These accounts emphasize individual differences in activity and reactivity of bottom-up, subcortical neural systems of approach and avoidance in affecting behavior. Largely independent literatures emphasize the roles of top-down, cortical deficits in emotion regulation and executive function in conferring vulnerability to psychopathology. To date however, few models effectively integrate functions performed by bottom-up emotion generation system with those performed by top-down emotion regulation systems in accounting for alternative expressions of psychopathology. In this article, we present such a model, and describe how it accommodates the well replicated bifactor structure of psychopathology. We describe how excessive approach motivation maps directly into externalizing liability, how excessive passive avoidance motivation maps directly into internalizing liability, and how emotion dysregulation and executive function map onto general liability. This approach is consistent with the Research Domain Criteria initiative, which assumes that a limited number of brain systems interact to confer vulnerability to many if not most forms of psychopathology. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. 46 CFR 298.38 - Partnership agreements and limited liability company agreements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Partnership agreements and limited liability company... liability company agreements. Partnership and limited liability company agreements must be in form and...) Duration of the entity; (b) Adequate partnership or limited liability company funding requirements and...

  13. 29 CFR 4219.11 - Withdrawal liability upon mass withdrawal.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Withdrawal liability upon mass withdrawal. 4219.11 Section... Redetermination of Withdrawal Liability Upon Mass Withdrawal § 4219.11 Withdrawal liability upon mass withdrawal. (a) Initial withdrawal liability. The plan sponsor of a multiemployer plan that experiences a mass...

  14. 29 CFR 4219.11 - Withdrawal liability upon mass withdrawal.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 9 2011-07-01 2011-07-01 false Withdrawal liability upon mass withdrawal. 4219.11 Section... Redetermination of Withdrawal Liability Upon Mass Withdrawal § 4219.11 Withdrawal liability upon mass withdrawal. (a) Initial withdrawal liability. The plan sponsor of a multiemployer plan that experiences a mass...

  15. Environmental liability and redevelopment of old industrial land.

    PubMed

    Sigman, Hilary

    2010-01-01

    Many communities are concerned about the reuse of potentially contaminated land (brownfields) and believe that environmental liability is a hindrance to redevelopment. However, with land price adjustments, liability might not impede the reuse of this land. This article studies state liability rules-specifically, strict liability and joint and several liability-that affect the level and distribution of expected costs of private cleanup. It explores the effects of this variation on industrial land prices and vacancy rates and on reported brownfields in a panel of cities across the United States. In the estimated equations, joint and several liability reduces land prices and increases vacancy rates in central cities. The results suggest that liability is at least partly capitalized but does still deter redevelopment.

  16. Sonographic and electrodiagnostic features of hereditary neuropathy with liability to pressure palsies.

    PubMed

    Ginanneschi, Federica; Filippou, Georgios; Giannini, Fabio; Carluccio, Maria A; Adinolfi, Antonella; Frediani, Bruno; Dotti, Maria T; Rossi, Alessandro

    2012-12-01

    In hereditary neuropathy with liability to pressure palsies (HNPP), the increase in distal motor latencies (DMLs) is often out of proportion to the slowing of conduction velocities, but the pathophysiological mechanism is still unclear. We used a combined electrophysiological and ultrasonographic (US) approach to provide insight into this issue. Twelve HNPP subjects underwent extensive electrophysiological studies and US measurements of the cross-sectional area (CSA) of several peripheral nerves. US nerve enlargement was only observed in the carpal tunnel, Guyon's canal, the elbow and the fibular head. We did not observe US abnormalities at sites where nerve entrapment is uncommon. An increase in DMLs was observed regardless of US nerve enlargement. The increased nerve CSA only in common sites of entrapment likely reflected the well-documented nerve vulnerability to mechanical stress in HNPP. No morphometric changes were seen in the distal nerve segments where compression/entrapment is unlikely, despite the fact that the DMLs were increased. These data suggest that factors other than mechanical stress are responsible for the distal slowing of action potential propagation. We speculate that a mixture of mechanical insults and an axon-initiated process in the distal nerves underlies the distal slowing and/or conduction failure in HNPP. © 2012 Peripheral Nerve Society.

  17. Mixture distributions of wind speed in the UAE

    NASA Astrophysics Data System (ADS)

    Shin, J.; Ouarda, T.; Lee, T. S.

    2013-12-01

    Wind speed probability distribution is commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it is unable to properly model wind speed regimes when wind speed distribution presents bimodal and kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without investigation of wind speed distribution. Due to these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula. Mixture distributional characteristics of wind speed were detected from some of these studies. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of wind energy potential in Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used in the current study. The Weibull and Kappa distributions were employed as representatives of the conventional non-mixture distributions. 10 mixture distributions are used and constructed by mixing four probability distributions such as Normal, Gamma, Weibull and Extreme value type-one (EV-1) distributions. Three parameter estimation methods such as Expectation Maximization algorithm, Least Squares method and Meta-Heuristic Maximum Likelihood (MHML) method were employed to estimate the parameters of the mixture distributions. In order to compare the goodness-of-fit of tested distributions and parameter estimation methods for sample wind data, the adjusted coefficient of determination, Bayesian Information Criterion (BIC) and Chi-squared statistics were computed. Results indicate that MHML presents the best performance of parameter estimation for the used mixture distributions. In most of the employed 7 stations, mixture distributions give the best fit. When the wind speed regime shows mixture distributional characteristics, most of these regimes present the kurtotic statistical characteristic. Particularly, applications of mixture distributions for these stations show a significant improvement in explaining the whole wind speed regime. In addition, the Weibull-Weibull mixture distribution presents the best fit for the wind speed data in the UAE.
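
    A sketch of fitting a two-component Weibull mixture to wind speeds, one of the mixture families discussed above. For simplicity it uses direct numerical maximum likelihood rather than the EM, least-squares or meta-heuristic ML methods of the study, and synthetic data stand in for the UAE station records.

      import numpy as np
      from scipy.stats import weibull_min
      from scipy.optimize import minimize

      # Synthetic bimodal wind-speed sample (two Weibull regimes)
      v = np.concatenate([weibull_min.rvs(2.0, scale=3.0, size=3000, random_state=1),
                          weibull_min.rvs(3.5, scale=8.0, size=2000, random_state=2)])

      def neg_log_lik(params):
          w = 1.0 / (1.0 + np.exp(-params[0]))      # mixing weight constrained to (0, 1)
          k1, s1, k2, s2 = np.exp(params[1:])       # shapes and scales constrained to > 0
          pdf = (w * weibull_min.pdf(v, k1, scale=s1)
                 + (1 - w) * weibull_min.pdf(v, k2, scale=s2))
          return -np.sum(np.log(pdf + 1e-300))

      x0 = [0.0, np.log(2.0), np.log(3.0), np.log(3.0), np.log(7.0)]
      res = minimize(neg_log_lik, x0, method="Nelder-Mead",
                     options={"maxiter": 10000, "maxfev": 10000})
      w = 1.0 / (1.0 + np.exp(-res.x[0]))
      print("weight:", round(w, 3), "shapes/scales:", np.round(np.exp(res.x[1:]), 3))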

  18. Pairwise mixture model for unmixing partial volume effect in multi-voxel MR spectroscopy of brain tumour patients

    NASA Astrophysics Data System (ADS)

    Olliverre, Nathan; Asad, Muhammad; Yang, Guang; Howe, Franklyn; Slabaugh, Gregory

    2017-03-01

    Multi-Voxel Magnetic Resonance Spectroscopy (MV-MRS) provides an important and insightful technique for examining the chemical composition of brain tissue, making it an attractive medical imaging modality for the examination of brain tumours. MRS, however, is affected by the Partial Volume Effect (PVE), in which signals from multiple tissue types are found within a single voxel, hindering interpretation of the data. The PVE results from the low spatial resolution of MV-MRS images, which is limited by the signal-to-noise ratio (SNR). To counteract the PVE, this paper proposes a novel Pairwise Mixture Model (PMM) that extends a recently reported Signal Mixture Model (SMM) for representing the MV-MRS signal as normal, low-grade or high-grade tissue types. Inspired by the Conditional Random Field (CRF) and its continuous variant, the PMM incorporates the surrounding voxel neighbourhood into an optimisation problem whose solution provides an estimate of a set of coefficients. The estimated coefficients represent the amount of each tissue type (normal, low or high grade) found within a voxel. These coefficients can then be visualised as a nosological rendering using a coloured grid representing the MV-MRS image overlaid on top of a structural image, such as a Magnetic Resonance Image (MRI). Experimental results show an accuracy of 92.69% in classifying patient tumours as either low or high grade compared against the histopathology for each patient. Compared to the 91.96% achieved by the SMM, the proposed PMM demonstrates the importance of incorporating spatial coherence into the estimation, as well as its potential for clinical use.
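
    A much-simplified sketch of the per-voxel unmixing step only: given assumed reference spectra for normal, low-grade and high-grade tissue, solve a non-negative least-squares problem for the tissue coefficients of one voxel and normalize them to fractions. The pairwise neighbourhood coupling that defines the PMM is omitted here.

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(5)
      n_points = 64
      A = np.abs(rng.normal(size=(n_points, 3)))                       # reference spectra (assumed)
      true_frac = np.array([0.2, 0.5, 0.3])                            # simulated ground truth
      voxel = A @ true_frac + rng.normal(scale=0.01, size=n_points)    # observed voxel spectrum

      coeffs, _ = nnls(A, voxel)           # non-negative tissue coefficients
      fractions = coeffs / coeffs.sum()    # normalize so they sum to one
      print("estimated tissue fractions (normal, low, high):", np.round(fractions, 3))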

  19. Evaluation of Thermodynamic Models for Predicting Phase Equilibria of CO2 + Impurity Binary Mixture

    NASA Astrophysics Data System (ADS)

    Shin, Byeong Soo; Rho, Won Gu; You, Seong-Sik; Kang, Jeong Won; Lee, Chul Soo

    2018-03-01

    For the design and operation of CO2 capture and storage (CCS) processes, equation of state (EoS) models are used for phase equilibrium calculations. Reliability of an EoS model plays a crucial role, and many variations of EoS models have been reported and continue to be published. The prediction of phase equilibria for CO2 mixtures containing SO2, N2, NO, H2, O2, CH4, H2S, Ar, and H2O is important for CO2 transportation because the captured gas normally contains small amounts of impurities even though it is purified in advance. For the design of pipelines in deep sea or arctic conditions, flow assurance and safety are considered priority issues, and highly reliable calculations are required. In this work, predictive Soave-Redlich-Kwong, cubic plus association, Groupe Européen de Recherches Gazières (GERG-2008), perturbed-chain statistical associating fluid theory, and non-random lattice fluids hydrogen bond EoS models were compared regarding performance in calculating phase equilibria of CO2-impurity binary mixtures and with the collected literature data. No single EoS could cover the entire range of systems considered in this study. Weaknesses and strong points of each EoS model were analyzed, and recommendations are given as guidelines for safe design and operation of CCS processes.

  20. High and low-risk specialties experience with the U.S. medical malpractice system

    PubMed Central

    2013-01-01

    Background “High-liability risk specialties” tend to be the focus of medical malpractice system research and debate, but concerns and fears are not limited to this group. The objective of this study was to examine whether “high-liability risk” medical specialties have a different experience with the malpractice system than “low-liability risk” specialties. Methods We reviewed claims data from the Physician Insurers Association of America’s Data Sharing Project between January 1985 and December 2008. We used linear regression, controlling for year, to determine how liability risk affected outcomes of interest. Results In high-liability risk specialties, 33% of claims result in indemnity payments compared to 28% for low-liability risk specialties (p < 0.001). The average indemnity payment for high-liability risk specialties was $315,314 compared to $267,146 for low-liability risk specialties (p = 0.25). Although only a small percentage of claims go to trial, low-liability risk specialties have significantly more claims that are ultimately dropped, withdrawn or dismissed, while high-liability risk specialties have significantly more claims that result in plaintiff settlement (p < 0.001). Conclusions Malpractice risk exists for all specialties. Variability in indemnity costs is found in both high- and low-liability risk specialties. Differences in the reasons for which claims are initiated for high- and low-liability risk specialties likely necessitate different risk management solutions. PMID:24192524

  1. Electron Transport Coefficients and Effective Ionization Coefficients in SF6-O2 and SF6-Air Mixtures Using Boltzmann Analysis

    NASA Astrophysics Data System (ADS)

    Wei, Linsheng; Xu, Min; Yuan, Dingkun; Zhang, Yafang; Hu, Zhaoji; Tan, Zhihong

    2014-10-01

    The electron drift velocity, electron energy distribution function (EEDF), density-normalized effective ionization coefficient and density-normalized longitudinal diffusion velocity are calculated in SF6-O2 and SF6-Air mixtures. The experimental results from a pulsed Townsend discharge are plotted for comparison with the numerical results. The reduced field strength varies from 40 Td to 500 Td (1 Townsend = 10⁻¹⁷ V·cm²) and the SF6 concentration ranges from 10% to 100%. A Boltzmann equation associated with the two-term spherical harmonic expansion approximation is utilized to obtain the swarm parameters under steady-state Townsend conditions. Results show that the accuracy of the Boltzmann solution with a two-term expansion in calculating the electron drift velocity, electron energy distribution function, and density-normalized effective ionization coefficient is acceptable. The effective ionization coefficient presents a distinct relationship with the SF6 content in the mixtures. Moreover, the E/Ncr values in SF6-Air mixtures are higher than those in SF6-O2 mixtures and the calculated value E/Ncr in SF6-O2 and SF6-Air mixtures is lower than the measured value in SF6-N2. Parametric studies conducted on these parameters using the Boltzmann analysis offer substantial insight into the plasma physics, as well as a basis to explore the ozone generation process.

  2. Spherically symmetric Einstein-aether perfect fluid models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coley, Alan A.; Latta, Joey; Leon, Genly

    We investigate spherically symmetric cosmological models in Einstein-aether theory with a tilted (non-comoving) perfect fluid source. We use a 1+3 frame formalism and adopt the comoving aether gauge to derive the evolution equations, which form a well-posed system of first order partial differential equations in two variables. We then introduce normalized variables. The formalism is particularly well-suited for numerical computations and the study of the qualitative properties of the models, which are also solutions of Horava gravity. We study the local stability of the equilibrium points of the resulting dynamical system corresponding to physically realistic inhomogeneous cosmological models and astrophysical objects with values for the parameters which are consistent with current constraints. In particular, we consider dust models in (β−) normalized variables and derive a reduced (closed) evolution system and we obtain the general evolution equations for the spatially homogeneous Kantowski-Sachs models using appropriate bounded normalized variables. We then analyse these models, with special emphasis on the future asymptotic behaviour for different values of the parameters. Finally, we investigate static models for a mixture of a (necessarily non-tilted) perfect fluid with a barotropic equation of state and a scalar field.

  3. Determining inert content in coal dust/rock dust mixture

    DOEpatents

    Sapko, Michael J.; Ward, Jr., Jack A.

    1989-01-01

    A method and apparatus for determining the inert content of a coal dust and rock dust mixture uses a transparent window pressed against the mixture. An infrared light beam is directed through the window such that a portion of the infrared light beam is reflected from the mixture. The concentration of the reflected light is detected and a signal indicative of the reflected light is generated. A normalized value for the generated signal is determined according to the relationship φ = (log i_c − log i_c0) / (log i_c100 − log i_c0), where i_c0 is the measured signal at 0% rock dust, i_c100 is the measured signal at 100% rock dust, and i_c is the measured signal of the mixture. This normalized value is then correlated to a predetermined relationship of φ to rock dust percentage to determine the rock dust content of the mixture. The rock dust content is displayed where the percentage is between 30 and 100%, and an out-of-range indication is displayed where the rock dust percentage is less than 30%. Preferably, the rock dust percentage (RD%) is calculated from the predetermined relationship RD% = 100 + 30 log φ. Where the dust mixture initially includes moisture, it is dried before measuring by use of 8 to 12 mesh molecular sieves, which are shaken with the dust mixture and subsequently screened from the dust mixture.
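
    A small sketch of the normalization and calibration relations quoted above. The calibration readings and the choice of base-10 logarithms are assumptions for illustration; readings that map below 30% rock dust are flagged as out of range, as in the patent.

      import math

      def rock_dust_percent(i_c, i_c0, i_c100):
          """Normalize the reflectance signal and convert it to rock dust %."""
          phi = (math.log10(i_c) - math.log10(i_c0)) / (math.log10(i_c100) - math.log10(i_c0))
          if phi <= 0:
              return None                          # at or below the 0% calibration point
          rd = 100.0 + 30.0 * math.log10(phi)      # RD% = 100 + 30 log(phi)
          return rd if rd >= 30.0 else None        # below 30% is reported as out of range

      i_c0, i_c100 = 5.0, 500.0                    # assumed calibration readings (illustrative)
      for signal in (5.2, 20.0, 350.0):
          rd = rock_dust_percent(signal, i_c0, i_c100)
          print(signal, "->", "out of range (<30%)" if rd is None else f"{rd:.1f}% rock dust")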

  4. Association between rising professional liability insurance premiums and primary cesarean delivery rates.

    PubMed

    Murthy, Karna; Grobman, William A; Lee, Todd A; Holl, Jane L

    2007-12-01

    To estimate the association between changes in Illinois professional liability premiums for obstetrician-gynecologists and singleton primary cesarean delivery rates. Data from the National Center for Health Statistics were used to identify all singleton births between 37 weeks and 44 weeks of gestation occurring in Illinois from 1998 through 2003. Primary cesarean delivery rates for women delivered between 37 weeks and 44 weeks of gestation per 1,000 gravid women eligible to have a primary cesarean delivery were calculated for each Illinois county. The annual medical professional liability premium for each county in Illinois was represented by the reported professional liability insurance rate charges (adjusted to 2004 dollars) from the ISMIE Mutual Insurance Company. Separate analyses were conducted for nulliparous and multiparous women. The independent association between county-level primary cesarean delivery rates and the previous year's insurance premiums was evaluated using linear regression models. During the study period, 817,521 women were eligible for inclusion in the analysis. The county-level mean primary cesarean delivery rate increased from 126 to 163 per 1,000 (P<.001) eligible women, whereas the mean annual medical professional liability insurance premiums also rose significantly (from $60,766 in 1997 to $83,167 in 2002, P<.001). Multivariable analyses demonstrated that for each annual $10,000 insurance premium increase, the primary cesarean delivery rate increased by 15.7 per 1,000 for nulliparous women. This association also was evident for multiparous women, who had an increase in cesarean deliveries of 4.7 per 1,000 for every $10,000 increase. Higher rates of primary cesarean delivery are associated with increased medical professional liability premiums for obstetrician-gynecologists in Illinois. II.

  5. 76 FR 13656 - Notice of Submission of Proposed Information Collection to OMB Requirement for Contractors to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-14

    ... insurance provide evidence that worker's compensation and general liability, automobile liability insurance...

  6. Liability of professional and volunteer mental health practitioners in the wake of disasters: a framework for further considerations.

    PubMed

    Abdel-Monem, Tarik; Bulling, Denise

    2005-01-01

    Qualified immunity from civil liability exists for acts of disaster mental health (DMH) practitioners responding to disasters or acts of terrorism. This article reviews current legal regimens dictating civil liability for potentially wrongful acts of DMH professionals and volunteers responding to disasters. Criteria are proposed to inform determinations of civil liability for DMH workers in disaster response, given current legal parameters and established tort law in relevant areas. Specific considerations are examined that potentially implicate direct liability of DMH professionals and volunteers, and vicarious liability of DMH supervisors for actions of volunteer subordinates. The relevance of pre-event DMH planning and operationalization of the plan post-event is linked to considerations of liability. This article concludes with recommendations to minimize liability exposure for DMH workers in response efforts.

  7. Modelling household finances: A Bayesian approach to a multivariate two-part model

    PubMed Central

    Brown, Sarah; Ghosh, Pulak; Su, Li; Taylor, Karl

    2016-01-01

    We contribute to the empirical literature on household finances by introducing a Bayesian multivariate two-part model, which has been developed to further our understanding of household finances. Our flexible approach allows for the potential interdependence between the holding of assets and liabilities at the household level and also encompasses a two-part process to allow for differences in the influences on asset or liability holding and on the respective amounts held. Furthermore, the framework is dynamic in order to allow for persistence in household finances over time. Our findings endorse the joint modelling approach and provide evidence supporting the importance of dynamics. In addition, we find that certain independent variables exert different influences on the binary and continuous parts of the model thereby highlighting the flexibility of our framework and revealing a detailed picture of the nature of household finances. PMID:27212801

  8. Genetic and environmental influences on last-year major depression in adulthood: a highly heritable stable liability but strong environmental effects on 1-year prevalence.

    PubMed

    Kendler, K S; Gardner, C O

    2017-07-01

    This study seeks to clarify the contribution of temporally stable and occasion-specific genetic and environmental influences on risk for major depression (MD). Our sample was 2153 members of female-female twin pairs from the Virginia Twin Registry. We examined four personal interview waves conducted over an 8-year period with MD in the last year defined by DSM-IV criteria. We fitted a structural equation model to the data using classic Mx. The model included genetic and environmental risk factors for a latent, stable vulnerability to MD and for episodes in each of the four waves. The best-fit model was simple and included genetic and unique environmental influences on the latent liability to MD and unique wave-specific environmental effects. The path from latent liability to MD in the last year was constant over time, moderate in magnitude (+0.65) and weaker than the impact of occasion-specific environmental effects (+0.76). Heritability of the latent stable liability to MD was much higher (78%) than that estimated for last-year MD (32%). Of the total unique environmental influences on MD, 13% reflected enduring consequences of earlier environmental insults, 17% diagnostic error and 70% wave-specific short-lived environmental stressors. Both genetic influences on MD and MD heritability are stable over middle adulthood. However, the largest influence on last-year MD is short-lived environmental effects. As predicted by genetic theory, the heritability of MD is increased substantially by measurement at multiple time points largely through the reduction of the effects of measurement error and short-term environmental risk factors.

  9. Sworn testimony of the model evidence: Gaussian Mixture Importance (GAME) sampling

    NASA Astrophysics Data System (ADS)

    Volpi, Elena; Schoups, Gerrit; Firmani, Giovanni; Vrugt, Jasper A.

    2017-07-01

    What is the "best" model? The answer to this question lies in part in the eyes of the beholder; nevertheless, a good model must blend rigorous theory with redeeming qualities such as parsimony and quality of fit. Model selection is used to make inferences, via weighted averaging, from a set of K candidate models, M_k, k = 1, …, K, and help identify which model is most supported by the observed data, Ỹ = (ỹ_1, …, ỹ_n). Here, we introduce a new and robust estimator of the model evidence, p(Ỹ | M_k), which acts as normalizing constant in the denominator of Bayes' theorem and provides a single quantitative measure of relative support for each hypothesis that integrates model accuracy, uncertainty, and complexity. However, p(Ỹ | M_k) is analytically intractable for most practical modeling problems. Our method, coined GAussian Mixture importancE (GAME) sampling, uses bridge sampling of a mixture distribution fitted to samples of the posterior model parameter distribution derived from MCMC simulation. We benchmark the accuracy and reliability of GAME sampling by application to a diverse set of multivariate target distributions (up to 100 dimensions) with known values of p(Ỹ | M_k) and to hypothesis testing using numerical modeling of the rainfall-runoff transformation of the Leaf River watershed in Mississippi, USA. These case studies demonstrate that GAME sampling provides robust and unbiased estimates of the evidence at a relatively small computational cost, outperforming commonly used estimators. The GAME sampler is implemented in the MATLAB package of DREAM and simplifies considerably scientific inquiry through hypothesis testing and model selection.
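
    A sketch of the core idea on a conjugate toy problem: fit a Gaussian mixture to posterior samples and use it as an importance distribution for the evidence. Plain importance sampling is used here instead of the paper's bridge sampling, so this is only a simplified analogue; the true evidence is known in closed form for comparison.

      import numpy as np
      from scipy.stats import norm, multivariate_normal
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(6)
      y = rng.normal(loc=0.7, size=20)        # observed data, model: y_i ~ N(theta, 1), theta ~ N(0, 1)
      n = len(y)

      def log_joint(theta):                   # log p(y | theta) + log p(theta)
          return norm.logpdf(y[:, None], loc=theta, scale=1.0).sum(axis=0) + norm.logpdf(theta)

      # Posterior is N(sum(y)/(n+1), 1/(n+1)); these draws stand in for MCMC output
      post = rng.normal(y.sum() / (n + 1), np.sqrt(1.0 / (n + 1)), size=5000)
      gm = GaussianMixture(n_components=2, random_state=0).fit(post.reshape(-1, 1))

      draws = gm.sample(20000)[0].ravel()
      log_w = log_joint(draws) - gm.score_samples(draws.reshape(-1, 1))
      log_evidence = np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()

      true_log_evidence = multivariate_normal.logpdf(y, mean=np.zeros(n),
                                                     cov=np.eye(n) + np.ones((n, n)))
      print("IS estimate:", round(log_evidence, 3), "exact:", round(true_log_evidence, 3))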

  10. Predictive teratology: teratogenic risk-hazard identification partnered in the discovery process.

    PubMed

    Augustine-Rauch, K A

    2008-11-01

    Unexpected teratogenicity is ranked as one of the most prevalent causes of toxicity-related attrition of drug candidates. Without proactive assessment, the liability tends to be identified relatively late in drug development, following significant investment in compound and engagement in preclinical and clinical studies. When unexpected teratogenicity occurs in preclinical development, three principal questions arise: Can clinical trials that include women of childbearing populations be initiated? Will all compounds in this pharmacological class produce the same liability? Could this effect be related to the chemical structure resulting in undesirable off-target adverse effects? The first question is typically addressed at the time of the unexpected finding and involves considering the nature of the teratogenicity, whether or not maternal toxicity could have had a role in onset, human exposure margins and therapeutic indication. The latter two questions can be addressed proactively, earlier in the discovery process as drug target profiling and lead compound optimization is taking place. Such proactive approaches include thorough assessment of the literature for identification of potential liabilities and follow-up work that can be conducted on the level of target expression and functional characterization using molecular biology and developmental model systems. Developmental model systems can also be applied in the form of in vitro teratogenicity screens, and show potential for effective hazard identification or issue resolution on the level of characterizing teratogenic mechanism. This review discusses approaches that can be applied for proactive assessment of compounds for teratogenic liability.

  11. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions, Addendum

    NASA Technical Reports Server (NTRS)

    Peters, B. C., Jr.; Walker, H. F.

    1975-01-01

    New results and insights concerning a previously published iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions were discussed. It was shown that the procedure converges locally to the consistent maximum likelihood estimate as long as a specified parameter is bounded between two limits. Bound values were given to yield optimal local convergence.
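
    For reference, a minimal example of the kind of EM-type iteration whose convergence the report analyzes: a two-component univariate normal mixture fitted to simulated data. This is an illustrative sketch, not the report's procedure or data.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(7)
      x = np.concatenate([rng.normal(-2.0, 1.0, 600), rng.normal(3.0, 1.5, 400)])

      pi, mu, sigma = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
      for _ in range(300):
          # E-step: responsibilities of each component for each observation
          dens = pi * norm.pdf(x[:, None], mu, sigma)
          resp = dens / dens.sum(axis=1, keepdims=True)
          # M-step: weighted updates of weights, means and standard deviations
          nk = resp.sum(axis=0)
          pi = nk / len(x)
          mu = (resp * x[:, None]).sum(axis=0) / nk
          sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

      print("weights:", np.round(pi, 3), "means:", np.round(mu, 3), "sds:", np.round(sigma, 3))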

  12. Chemical mixtures in untreated water from public-supply wells in the U.S. — Occurrence, composition, and potential toxicity

    USGS Publications Warehouse

    Toccalino, Patricia L.; Norman, Julia E.; Scott, Jonathon C.

    2012-01-01

    Chemical mixtures are prevalent in groundwater used for public water supply, but little is known about their potential health effects. As part of a large-scale ambient groundwater study, we evaluated chemical mixtures across multiple chemical classes, and included more chemical contaminants than in previous studies of mixtures in public-supply wells. We (1) assessed the occurrence of chemical mixtures in untreated source-water samples from public-supply wells, (2) determined the composition of the most frequently occurring mixtures, and (3) characterized the potential toxicity of mixtures using a new screening approach. The U.S. Geological Survey collected one untreated water sample from each of 383 public wells distributed across 35 states, and analyzed the samples for as many as 91 chemical contaminants. Concentrations of mixture components were compared to individual human-health benchmarks; the potential toxicity of mixtures was characterized by addition of benchmark-normalized component concentrations. Most samples (84%) contained mixtures of two or more contaminants, each at concentrations greater than one-tenth of individual benchmarks. The chemical mixtures that most frequently occurred and had the greatest potential toxicity primarily were composed of trace elements (including arsenic, strontium, or uranium), radon, or nitrate. Herbicides, disinfection by-products, and solvents were the most common organic contaminants in mixtures. The sum of benchmark-normalized concentrations was greater than 1 for 58% of samples, suggesting that there could be potential for mixtures toxicity in more than half of the public-well samples. Our findings can be used to help set priorities for groundwater monitoring and suggest future research directions for drinking-water treatment studies and for toxicity assessments of chemical mixtures in water resources.
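
    As a rough illustration of the screening calculation described above, the sketch below sums benchmark-normalized concentration ratios for one sample; the concentrations and benchmarks are invented illustration values, not USGS data.

      # Normalize each contaminant's concentration by its health benchmark, flag
      # components above one-tenth of their benchmark, and sum the ratios; a total
      # above 1 suggests potential mixture toxicity.
      sample = {"arsenic": 4.0, "uranium": 12.0, "nitrate": 6000.0, "radon": 150.0}     # illustrative
      benchmark = {"arsenic": 10.0, "uranium": 30.0, "nitrate": 10000.0, "radon": 300.0}  # illustrative

      ratios = {c: sample[c] / benchmark[c] for c in sample}
      total = sum(ratios.values())
      mixture = [c for c, r in ratios.items() if r > 0.1]

      print("benchmark-normalized ratios:", {c: round(r, 2) for c, r in ratios.items()})
      print("mixture components (>0.1 of benchmark):", mixture)
      print("summed ratio:", round(total, 2),
            "-> potential mixture concern" if total > 1 else "-> below screening level")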

  13. Comparison of NIR chemical imaging with conventional NIR, Raman and ATR-IR spectroscopy for quantification of furosemide crystal polymorphs in ternary powder mixtures.

    PubMed

    Schönbichler, S A; Bittner, L K H; Weiss, A K H; Griesser, U J; Pallua, J D; Huck, C W

    2013-08-01

    The aim of this study was to evaluate the ability of near-infrared chemical imaging (NIR-CI), near-infrared (NIR), Raman and attenuated-total-reflectance infrared (ATR-IR) spectroscopy to quantify three polymorphic forms (I, II, III) of furosemide in ternary powder mixtures. For this purpose, partial least-squares (PLS) regression models were developed, and different data preprocessing algorithms such as normalization, standard normal variate (SNV), multiplicative scatter correction (MSC) and 1st to 3rd derivatives were applied to reduce the influence of systematic disturbances. The performance of the methods was evaluated by comparison of the standard error of cross-validation (SECV), R(2), and the ratio performance deviation (RPD). Limits of detection (LOD) and limits of quantification (LOQ) of all methods were determined. For NIR-CI, a SECVcorr-spec and a SECVsingle-pixel corrected were calculated to assess the loss of accuracy by taking advantage of the spatial information. NIR-CI showed a SECVcorr-spec (SECVsingle-pixel corrected) of 2.82% (3.71%), 3.49% (4.65%), and 4.10% (5.06%) for form I, II, III. NIR had a SECV of 2.98%, 3.62%, and 2.75%, and Raman reached 3.25%, 3.08%, and 3.18%. The SECV of the ATR-IR models were 7.46%, 7.18%, and 12.08%. This study proves that NIR-CI, NIR, and Raman are well suited to quantify forms I-III of furosemide in ternary mixtures. Because of the pressure-dependent conversion of form II to form I, ATR-IR was found to be less appropriate for an accurate quantification of the mixtures. In this study, the capability of NIR-CI for the quantification of polymorphic ternary mixtures was compared with conventional spectroscopic techniques for the first time. For this purpose, a new way of spectra selection was chosen, and two kinds of SECVs were calculated to achieve a better comparability of NIR-CI to NIR, Raman, and ATR-IR. Copyright © 2013 Elsevier B.V. All rights reserved.
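
    A sketch of the chemometric workflow described above: standard normal variate (SNV) pretreatment followed by a PLS regression predicting the fraction of one polymorph, with a cross-validated SECV. Synthetic "spectra" stand in for the NIR/Raman measurements, and the number of latent variables is an assumption.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import LeaveOneOut, cross_val_predict

      rng = np.random.default_rng(8)
      n_mix, n_wavelengths = 40, 150
      y = rng.uniform(0, 100, size=n_mix)                       # % of form I in each mixture
      pure_I, pure_II = rng.normal(size=n_wavelengths), rng.normal(size=n_wavelengths)
      X = (np.outer(y / 100, pure_I) + np.outer(1 - y / 100, pure_II)
           + rng.normal(scale=0.05, size=(n_mix, n_wavelengths)))

      def snv(spectra):
          """Standard normal variate: centre and scale each spectrum individually."""
          return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

      pls = PLSRegression(n_components=3)
      y_cv = cross_val_predict(pls, snv(X), y, cv=LeaveOneOut()).ravel()
      secv = np.sqrt(np.mean((y - y_cv) ** 2))
      print(f"SECV: {secv:.2f} % (form I)")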

  14. Segmentation and intensity estimation of microarray images using a gamma-t mixture model.

    PubMed

    Baek, Jangsun; Son, Young Sook; McLachlan, Geoffrey J

    2007-02-15

    We present a new approach to the analysis of images for complementary DNA microarray experiments. The image segmentation and intensity estimation are performed simultaneously by adopting a two-component mixture model. One component of this mixture corresponds to the distribution of the background intensity, while the other corresponds to the distribution of the foreground intensity. The intensity measurement is a bivariate vector consisting of red and green intensities. The background intensity component is modeled by the bivariate gamma distribution, whose marginal densities for the red and green intensities are independent three-parameter gamma distributions with different parameters. The foreground intensity component is taken to be the bivariate t distribution, with the constraint that the mean of the foreground is greater than that of the background for each of the two colors. The degrees of freedom of this t distribution are inferred from the data but they could be specified in advance to reduce the computation time. Also, the covariance matrix is not restricted to being diagonal and so it allows for nonzero correlation between R and G foreground intensities. This gamma-t mixture model is fitted by maximum likelihood via the EM algorithm. A final step is executed whereby nonparametric (kernel) smoothing is undertaken of the posterior probabilities of component membership. The main advantages of this approach are: (1) it enjoys the well-known strengths of a mixture model, namely flexibility and adaptability to the data; (2) it considers the segmentation and intensity simultaneously and not separately as in commonly used existing software, and it also works with the red and green intensities in a bivariate framework as opposed to their separate estimation via univariate methods; (3) the use of the three-parameter gamma distribution for the background red and green intensities provides a much better fit than the normal (log normal) or t distributions; (4) the use of the bivariate t distribution for the foreground intensity provides a model that is less sensitive to extreme observations; (5) as a consequence of the aforementioned properties, it allows segmentation to be undertaken for a wide range of spot shapes, including doughnut, sickle shape and artifacts. We apply our method for gridding, segmentation and estimation to cDNA microarray real images and artificial data. Our method provides better segmentation results in spot shapes as well as intensity estimation than Spot and spotSegmentation R language softwares. It detected blank spots as well as bright artifact for the real data, and estimated spot intensities with high-accuracy for the synthetic data. The algorithms were implemented in Matlab. The Matlab codes implementing both the gridding and segmentation/estimation are available upon request. Supplementary material is available at Bioinformatics online.

  15. The College Professor's Professional Liability

    ERIC Educational Resources Information Center

    Griggs, Walter S.; Rubin, Harvey W.

    1977-01-01

    The growing number of professional liability suits against professors warrants a close examination of the need for and provisions of available insurance coverage. The evolution of tort liability, the question of negligence, and the professional liability policy are discussed. (LBH)

  16. Solving multistage stochastic programming models of portfolio selection with outstanding liabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edirisinghe, C.

    1994-12-31

    Models for portfolio selection in the presence of an outstanding liability have received significant attention, for example, models for pricing options. The problem may be described briefly as follows: given a set of risky securities (and a riskless security such as a bond), and given a set of cash flows, i.e., outstanding liability, to be met at some future date, determine an initial portfolio and a dynamic trading strategy for the underlying securities such that the initial cost of the portfolio is within a prescribed wealth level and the expected cash surpluses arising from trading are maximized. While the trading strategy should be self-financing, there may also be other restrictions such as leverage and short-sale constraints. Usually the treatment is limited to binomial evolution of uncertainty (of stock price), with possible extensions for developing computational bounds for multinomial generalizations. Posed as stochastic programming models of decision making, we investigate alternative efficient solution procedures under continuous evolution of uncertainty, for discrete time economies. We point out an important moment problem arising in the portfolio selection problem, the solution (or bounds) of which provides the basis for developing efficient computational algorithms. While the underlying stochastic program may be computationally tedious even for a modest number of trading opportunities (i.e., time periods), the derived algorithms may be used to solve problems whose sizes are beyond those considered within stochastic optimization.

  17. Cancer vulnerabilities unveiled by genomic loss

    PubMed Central

    Nijhawan, Deepak; Zack, Travis I.; Ren, Yin; Strickland, Matthew R.; Lamothe, Rebecca; Schumacher, Steven E.; Tsherniak, Aviad; Besche, Henrike C.; Rosenbluh, Joseph; Shehata, Shyemaa; Cowley, Glenn S.; Weir, Barbara A.; Goldberg, Alfred L.; Mesirov, Jill P.; Root, David E.; Bhatia, Sangeeta N.; Beroukhim, Rameen; Hahn, William C.

    2012-01-01

    Summary Due to genome instability, most cancers exhibit loss of regions containing tumor suppressor genes and collateral loss of other genes. To identify cancer-specific vulnerabilities that are the result of copy-number losses, we performed integrated analyses of genome-wide copy-number and RNAi profiles and identified 56 genes for which gene suppression specifically inhibited the proliferation of cells harboring partial copy-number loss of that gene. These CYCLOPS (Copy-number alterations Yielding Cancer Liabilities Owing to Partial losS) genes are enriched for spliceosome, proteasome and ribosome components. One CYCLOPS gene, PSMC2, encodes an essential member of the 19S proteasome. Normal cells express excess PSMC2, which resides in a complex with PSMC1, PSMD2, and PSMD5 and acts as a reservoir protecting cells from PSMC2 suppression. Cells harboring partial PSMC2 copy-number loss lack this complex and die after PSMC2 suppression. These observations define a distinct class of cancer-specific liabilities resulting from genome instability. PMID:22901813

  18. Production of Normal Mammalian Organ Culture Using a Medium Containing Mem-Alpha, Leibovitz L 15, Glucose Galactose Fructose

    NASA Technical Reports Server (NTRS)

    Goodwin, Thomas J. (Inventor); Wolf, David A. (Inventor); Spaulding, Glenn F. (Inventor); Prewett, Tacey L. (Inventor)

    1999-01-01

    Normal mammalian tissue and the culturing process have been developed for three groups: organ, structural and blood tissue. The cells are grown in vitro under microgravity culture conditions and form three-dimensional cell aggregates with normal cell function. The microgravity culture conditions may be true microgravity or simulated microgravity created in a horizontal rotating wall culture vessel. The medium used for culturing the cells, especially a mixture of epithelial and mesenchymal cells, contains a mixture of Mem-alpha and Leibovitz L15 supplemented with glucose, galactose and fructose.

  19. Experimental evidence for killing the resistant cells and raising the efficacy and decreasing the toxicity of cytostatics and irradiation by mixtures of the agents of the passive antitumor defense system in the case of various tumor and normal cell lines in vitro.

    PubMed

    Kulcsár, Gyula

    2009-02-01

    Despite the substantial decline of the immune system in AIDS, only a few kinds of tumors increase in incidence. This shows that the immune system has no absolute role in the prevention of tumors. Therefore, the fact that tumors do not develop in the majority of the population during their lifetime indicates the existence of other defense system(s). According to our hypothesis, the defense is made by certain substances of the circulatory system. Earlier, on the basis of this hypothesis, we experimentally selected 16 substances of the circulatory system and demonstrated that the mixture of them (called active mixture) had a cytotoxic effect (inducing apoptosis) in vitro and in vivo on different tumor cell lines, but not on normal cells and animals. In this paper, we provide evidence that different cytostatic drugs or irradiation in combination with the active mixture killed significantly more cancer cells, compared with either treatments alone. The active mixture decreased, to a certain extent, the toxicity of cytostatics and irradiation on normal cells, but the most important result was that the active mixture destroyed the multidrug-resistant cells. Our results provide the possibility to improve the efficacy and reduce the side-effects of chemotherapy and radiation therapy and to prevent the relapse by killing the resistant cells.

  20. Analyzing repeated measures semi-continuous data, with application to an alcohol dependence study.

    PubMed

    Liu, Lei; Strawderman, Robert L; Johnson, Bankole A; O'Quigley, John M

    2016-02-01

    Two-part random effects models (Olsen and Schafer [1]; Tooze et al. [2]) have been applied to repeated measures of semi-continuous data, characterized by a mixture of a substantial proportion of zero values and a skewed distribution of positive values. In the original formulation of this model, the natural logarithm of the positive values is assumed to follow a normal distribution with a constant variance parameter. In this article, we review and consider three extensions of this model, allowing the positive values to follow (a) a generalized gamma distribution, (b) a log-skew-normal distribution, and (c) a normal distribution after the Box-Cox transformation. We allow for the possibility of heteroscedasticity. Maximum likelihood estimation is shown to be conveniently implemented in SAS Proc NLMIXED. The performance of the methods is compared through applications to daily drinking records in a secondary data analysis from a randomized controlled trial of topiramate for alcohol dependence treatment. We find that all three models provide a significantly better fit than the log-normal model, and there exists strong evidence for heteroscedasticity. We also compare the three models by the likelihood ratio tests for non-nested hypotheses (Vuong [3]). The results suggest that the generalized gamma distribution provides the best fit, though no statistically significant differences are found in pairwise model comparisons. © The Author(s) 2012.
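
    A simplified, cross-sectional sketch of the two-part idea: estimate the proportion of zeros, then compare log-normal and generalized gamma fits to the positive amounts. The article's random effects, heteroscedasticity and repeated-measures structure are omitted, and scipy's gengamma parameterization stands in for the article's generalized gamma; data are simulated.

      import numpy as np
      from scipy.stats import lognorm, gengamma

      rng = np.random.default_rng(9)
      n = 1000
      # Simulated daily drinking amounts: ~40% zero days, skewed positive amounts otherwise
      drinks = np.where(rng.random(n) < 0.4, 0.0, rng.gamma(shape=2.0, scale=3.0, size=n))

      positive = drinks[drinks > 0]
      p_zero = np.mean(drinks == 0)                       # part 1: probability of a zero day

      ln_params = lognorm.fit(positive, floc=0)           # part 2, option (a): log-normal
      gg_params = gengamma.fit(positive, floc=0)          # part 2, option (b): generalized gamma
      ll_ln = np.sum(lognorm.logpdf(positive, *ln_params))
      ll_gg = np.sum(gengamma.logpdf(positive, *gg_params))

      print(f"P(zero) = {p_zero:.2f}")
      print(f"log-likelihood  log-normal: {ll_ln:.1f}   generalized gamma: {ll_gg:.1f}")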

  1. Deterministic annealing for density estimation by multivariate normal mixtures

    NASA Astrophysics Data System (ADS)

    Kloppenburg, Martin; Tavan, Paul

    1997-03-01

    An approach to maximum-likelihood density estimation by mixtures of multivariate normal distributions for large high-dimensional data sets is presented. Conventionally that problem is tackled by notoriously unstable expectation-maximization (EM) algorithms. We remove these instabilities by the introduction of soft constraints, enabling deterministic annealing. Our developments are motivated by the proof that algorithmically stable fuzzy clustering methods that are derived from statistical physics analogs are special cases of EM procedures.
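
    A one-dimensional sketch of deterministic-annealing EM in the spirit of the abstract (synthetic data, hypothetical starting values and annealing schedule): the responsibilities are tempered by a parameter T that is lowered toward 1, which smooths the early assignments and reduces the sensitivity of EM to its starting point.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.7, 200)])

# Initial parameters for a two-component normal mixture (hypothetical starting values).
w = np.array([0.5, 0.5])                 # mixing weights
mu = np.percentile(x, [25, 75]).copy()   # means
sd = np.array([x.std(), x.std()])        # standard deviations

for T in [4.0, 2.0, 1.0]:                # annealing schedule; T = 1 is ordinary EM
    for _ in range(25):
        # E-step with tempered responsibilities: r_ik proportional to (w_k N(x_i; mu_k, sd_k))^(1/T).
        dens = w * norm.pdf(x[:, None], mu, sd)
        r = dens ** (1.0 / T)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: standard weighted updates of weights, means, and standard deviations.
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print("weights:", np.round(w, 3), "means:", np.round(mu, 3), "sds:", np.round(sd, 3))
```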

  2. Characterization and quantification of grape variety by means of shikimic acid concentration and protein fingerprint in still white wines.

    PubMed

    Chabreyrie, David; Chauvet, Serge; Guyon, François; Salagoïty, Marie-Hélène; Antinelli, Jean-François; Medina, Bernard

    2008-08-27

    Protein profiles, obtained by high-performance capillary electrophoresis (HPCE) on white wines previously dialyzed, combined with shikimic acid concentration and multivariate analysis, were used for the determination of grape variety composition of a still white wine. Six varieties were studied through monovarietal wines elaborated in the laboratory: Chardonnay (24 samples), Chenin (24), Petit Manseng (7), Sauvignon (37), Semillon (24), and Ugni Blanc (9). Homemade mixtures were elaborated from authentic monovarietal wines according to a Plackett-Burman sampling plan. After protein peak area normalization, a matrix was elaborated containing protein results of wines (mixtures and monovarietal). Partial least-squares processing was applied to this matrix allowing the elaboration of a model that provided a varietal quantification precision of around 20% for most of the grape varieties studied. The model was applied to commercial samples from various geographical origins, providing encouraging results for control purposes.
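
    A rough sketch of the chemometric workflow described above on synthetic data: normalized protein peak areas are regressed on varietal proportions with partial least squares. scikit-learn's PLSRegression stands in for the original software, and all profiles and blend proportions below are hypothetical.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
n_wines, n_peaks, n_varieties = 60, 25, 3

# Hypothetical 'pure' protein profiles for each grape variety.
pure_profiles = rng.gamma(shape=2.0, scale=1.0, size=(n_varieties, n_peaks))

# Blends: varietal proportions drawn from a Dirichlet, profiles mixed linearly plus noise.
proportions = rng.dirichlet(np.ones(n_varieties), size=n_wines)
X = proportions @ pure_profiles + rng.normal(0, 0.05, size=(n_wines, n_peaks))
X /= X.sum(axis=1, keepdims=True)          # peak-area normalization, as in the abstract

pls = PLSRegression(n_components=5)
pls.fit(X[:40], proportions[:40])           # calibrate on 'known' mixtures
pred = pls.predict(X[40:])                  # predict varietal composition of new wines

err = np.abs(pred - proportions[40:]).mean()
print(f"mean absolute error on varietal proportions: {err:.3f}")
```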

  3. Modeling the use of a binary mixture as a control scheme for two-phase thermal systems

    NASA Technical Reports Server (NTRS)

    Benner, S. M.; Costello, Frederick A.

    1990-01-01

    Two-phase thermal loops using mechanical pumps, capillary pumps, or a combination of the two have been chosen as the main heat transfer systems for the space station. For these systems to operate optimally, the flow rate in the loop should be controlled in response to the vapor/liquid ratio leaving the evaporator. By substituting a mixture of two non-azeotropic fluids in place of the single fluid normally used in these systems, it may be possible to monitor the temperature of the exiting vapor and determine the vapor/liquid ratio. The flow rate would then be adjusted to maximize the load capability with minimum energy input. A FLUINT model was developed to study the system dynamics of a hybrid capillary pumped loop using this type of control and was found to be stable under all the test conditions.

  4. 75 FR 16645 - Increase in the Primary Nuclear Liability Insurance Premium

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-02

    ... Primary Nuclear Liability Insurance Premium AGENCY: Nuclear Regulatory Commission. ACTION: Final rule... impractical. The NRC is amending its regulations to increase the primary premium for liability insurance... protection requirements and indemnity agreements to increase the primary nuclear liability insurance layer...

  5. A mixture theory model of fluid and solute transport in the microvasculature of normal and malignant tissues. I. Theory.

    PubMed

    Schuff, M M; Gore, J P; Nauman, E A

    2013-05-01

    In order to better understand the mechanisms governing transport of drugs, nanoparticle-based treatments, and therapeutic biomolecules, and the role of the various physiological parameters, a number of mathematical models have previously been proposed. The limitations of the existing transport models indicate the need for a comprehensive model that includes transport in the vessel lumen, the vessel wall, and the interstitial space and considers the effects of the solute concentration on fluid flow. In this study, a general model to describe the transient distribution of fluid and multiple solutes at the microvascular level was developed using mixture theory. The model captures the experimentally observed dependence of the hydraulic permeability coefficient of the capillary wall on the concentration of solutes present in the capillary wall and the surrounding tissue. Additionally, the model demonstrates that transport phenomena across the capillary wall and in the interstitium are related to the solute concentration as well as the hydrostatic pressure. The model is used in a companion paper to examine fluid and solute transport for the simplified case of an axisymmetric geometry with no solid deformation or interconversion of mass.

  6. Minimizing liability risks under the ACMG recommendations for reporting incidental findings in clinical exome and genome sequencing.

    PubMed

    Evans, Barbara J

    2013-12-01

    Recent recommendations by the American College of Medical Genetics and Genomics (ACMG) for reporting incidental findings present novel ethical and legal issues. This article expresses no views on the ethical aspects of these recommendations and focuses strictly on liability risks and how to minimize them. The recommendations place labs and clinicians in a new liability environment that exposes them to intentional tort lawsuits as well to traditional suits for negligence. Intentional tort suits are especially troubling because of their potential to inflict ruinous personal financial losses on individual clinicians and laboratory personnel. This article surveys this new liability landscape and describes analytical approaches for minimizing tort liabilities. To a considerable degree, liability risks can be controlled by structuring activities in ways that make future lawsuits nonviable before the suits ever arise. Proactive liability analysis is an effective tool for minimizing tort liabilities in connection with the testing and reporting activities that the ACMG recommends.

  7. Minimizing liability risks under the ACMG recommendations for reporting incidental findings in clinical exome and genome sequencing

    PubMed Central

    Evans, Barbara J.

    2014-01-01

    Recent recommendations by the American College of Medical Genetics and Genomics (ACMG) for reporting incidental findings present novel ethical and legal issues. This article expresses no views on the ethical aspects of these recommendations and focuses strictly on liability risks and how to minimize them. The recommendations place labs and clinicians in a new liability environment that exposes them to intentional tort lawsuits as well to traditional suits for negligence. Intentional tort suits are especially troubling because of their potential to inflict ruinous personal financial losses on individual clinicians and laboratory personnel. This article surveys this new liability landscape and describes analytical approaches for minimizing tort liabilities. To a considerable degree, liability risks can be controlled by structuring activities in ways that make future lawsuits nonviable before the suits ever arise. Proactive liability analysis is an effective tool for minimizing tort liabilities in connection with the testing and reporting activities that the ACMG recommends. PMID:24030435

  8. Combustion of Gaseous Mixtures

    NASA Technical Reports Server (NTRS)

    Duchene, R

    1932-01-01

    This report not only presents matters of practical importance in the classification of engine fuels, for which other means have proved inadequate, but also makes a few suggestions. It confirms the results of Withrow and Boyd which localize the explosive wave in the last portions of the mixture burned. This being the case, it may be assumed that the greater the normal combustion, the less the energy developed in the explosive form. In order to combat the detonation, it is therefore necessary to try to render the normal combustion swift and complete, as produced in carbureted mixtures containing benzene (benzol), in which the flame propagation, beginning at the spark, yields a progressive and pronounced darkening on the photographic film.

  9. 26 CFR 1.704-2 - Allocations attributable to nonrecourse liabilities.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... disparity. (4) Special rule for year of revaluation. (e) Requirements to be satisfied. (f) Minimum gain... encumbers, a disposition of that property will generate gain that at least equals that excess (“partnership.... (3) Definition of nonrecourse liability. Nonrecourse liability means a nonrecourse liability as...

  10. 14 CFR 291.22 - Aircraft accident liability insurance requirement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Aircraft accident liability insurance... for All-Cargo Air Transportation § 291.22 Aircraft accident liability insurance requirement. No air... and maintains in effect aircraft accident liability coverage that meets the requirements of part 205...

  11. 46 CFR 5.69 - Evidence of criminal liability.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... INVESTIGATION REGULATIONS-PERSONNEL ACTION Statement of Policy and Interpretation § 5.69 Evidence of criminal liability. Evidence of criminal liability discovered during an investigation or hearing conducted pursuant... 46 Shipping 1 2010-10-01 2010-10-01 false Evidence of criminal liability. 5.69 Section 5.69...

  12. 14 CFR 1260.61 - Allocation of risk/liability.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Allocation of risk/liability. 1260.61 Section 1260.61 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION GRANTS AND COOPERATIVE AGREEMENTS General Special Conditions § 1260.61 Allocation of risk/liability. Allocation of Risk/Liability...

  13. Theoretical Calculation of the Electron Transport Parameters and Energy Distribution Function for CF3I with noble gases mixtures using Monte Carlo simulation program

    NASA Astrophysics Data System (ADS)

    Jawad, Enas A.

    2018-05-01

    In this paper, a Monte Carlo simulation program has been used to calculate the electron energy distribution function (EEDF) and the electron transport parameters for gas mixtures of trifluoroiodomethane (CF3I), an environmentally friendly gas, with the noble gases argon, helium, krypton, neon, and xenon. The electron transport parameters are assessed in the range of E/N (E is the electric field and N is the gas number density of background gas molecules) between 100 and 2000 Td (1 Townsend = 10⁻¹⁷ V cm²) at room temperature. These parameters are the electron mean energy (ε), the density-normalized longitudinal diffusion coefficient (NDL), and the density-normalized mobility (μN). The impact of CF3I in the noble gas mixtures is strongly apparent in the values of the electron mean energy, the density-normalized longitudinal diffusion coefficient, and the density-normalized mobility. The results of the calculation agree well with experimental results.

  14. Modelling Spatial Dependence Structures Between Climate Variables by Combining Mixture Models with Copula Models

    NASA Astrophysics Data System (ADS)

    Khan, F.; Pilz, J.; Spöck, G.

    2017-12-01

    Spatio-temporal dependence structures play a pivotal role in understanding the meteorological characteristics of a basin or sub-basin. They further affect the hydrological conditions and will consequently give misleading results if not taken into account properly. In this study we modeled the spatial dependence structure between climate variables, including maximum temperature, minimum temperature, and precipitation, in the monsoon-dominated region of Pakistan. Six meteorological stations were considered for temperature and four for precipitation. For modelling the dependence structure between temperature and precipitation at multiple sites, we utilized C-Vine, D-Vine, and Student t-copula models. Under the copula models, multivariate mixtures of normal distributions were used as marginals for temperature and gamma distributions for precipitation. The C-Vine, D-Vine, and Student t-copula models were compared through observed and simulated spatial dependence structures to choose an appropriate model for the climate data. The results show that all copula models performed well; however, there are subtle differences in their performance. The copula models captured the patterns of spatial dependence between climate variables at multiple meteorological sites, although the t-copula showed poor performance in reproducing the magnitude of the dependence structure. Important statistics of the observed data were closely approximated, except for the maximum values of temperature and the minimum values of minimum temperature. Probability density functions of the simulated data closely follow those of the observed data for all variables. C- and D-Vines are the better tools for modelling the dependence between variables; however, Student t-copulas compete closely for precipitation. Keywords: Copula model, C-Vine, D-Vine, spatial dependence structure, monsoon-dominated region of Pakistan, mixture models, EM algorithm.
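
    A simplified sketch of the first steps of such a copula analysis on synthetic station data: mixture-normal marginals are fitted, observations are mapped to uniforms through the fitted marginal CDFs, and a copula dependence parameter is estimated from the normal scores. The Gaussian copula here is a deliberately simpler stand-in for the vine and Student t-copulas used in the study, and all temperatures are invented.

```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)

# Hypothetical daily maximum temperatures at two nearby stations (correlated, bimodal).
season = rng.random(1500) < 0.5
base = np.where(season, rng.normal(15, 3, 1500), rng.normal(32, 4, 1500))
t1 = base + rng.normal(0, 1.5, 1500)
t2 = base + rng.normal(0, 1.5, 1500)

def mixture_cdf(x, gm):
    """CDF of a fitted 1-D Gaussian mixture (used as the marginal model)."""
    w = gm.weights_
    mu = gm.means_.ravel()
    sd = np.sqrt(gm.covariances_.ravel())
    return np.sum(w * norm.cdf((x[:, None] - mu) / sd), axis=1)

# Step 1: mixture-normal marginals, as in the abstract.
gm1 = GaussianMixture(n_components=2, random_state=0).fit(t1[:, None])
gm2 = GaussianMixture(n_components=2, random_state=0).fit(t2[:, None])

# Step 2: probability integral transform to uniforms, then to normal scores.
u1, u2 = mixture_cdf(t1, gm1), mixture_cdf(t2, gm2)
z1 = norm.ppf(np.clip(u1, 1e-6, 1 - 1e-6))
z2 = norm.ppf(np.clip(u2, 1e-6, 1 - 1e-6))

# Step 3: the Gaussian-copula dependence parameter is the correlation of the normal scores.
rho = np.corrcoef(z1, z2)[0, 1]
print(f"estimated copula correlation between stations: {rho:.3f}")
```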

  15. On the cause of the non-Gaussian distribution of residuals in geomagnetism

    NASA Astrophysics Data System (ADS)

    Hulot, G.; Khokhlov, A.

    2017-12-01

    To describe errors in the data, Gaussian distributions naturally come to mind. In many practical instances, indeed, Gaussian distributions are appropriate. In the broad field of geomagnetism, however, it has repeatedly been noted that residuals between data and models often display much sharper distributions, sometimes better described by a Laplace distribution. In the present study, we make the case that such non-Gaussian behaviors are very likely the result of what is known as a mixture of distributions in the statistical literature. Mixtures arise as soon as the data do not follow a common distribution or are not properly normalized, the resulting global distribution being a mix of the various distributions followed by subsets of the data, or even individual data points. We provide examples of the way such mixtures can lead to distributions that are much sharper than Gaussian distributions and discuss the reasons why such mixtures are likely the cause of the non-Gaussian distributions observed in geomagnetism. We also show that when properly selecting sub-datasets based on geophysical criteria, statistical mixture can sometimes be avoided and much more Gaussian behavior recovered. We conclude with some general recommendations and point out that although statistical mixture always tends to sharpen the resulting distribution, it does not necessarily lead to a Laplacian distribution. This needs to be taken into account when dealing with such non-Gaussian distributions.
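
    A small illustrative simulation of the point made above, with arbitrary parameters: pooling residuals from subsets with different error scales produces a sharper-peaked, heavier-tailed distribution than a single Gaussian of the same overall spread, as the excess kurtosis shows.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(4)

# Residuals pooled from data subsets with different error scales (a scale mixture):
# e.g. 80% well-behaved observations and 20% from a noisier subset.
mixed = np.concatenate([rng.normal(0, 1.0, 8000), rng.normal(0, 4.0, 2000)])
gauss = rng.normal(0, mixed.std(), 10000)   # a single Gaussian with the same overall spread

# Excess kurtosis: 0 for a Gaussian, positive for sharper-peaked / heavier-tailed mixtures.
print(f"excess kurtosis, mixture : {kurtosis(mixed):.2f}")
print(f"excess kurtosis, Gaussian: {kurtosis(gauss):.2f}")
```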

  16. Estimation and confidence intervals for empirical mixing distributions

    USGS Publications Warehouse

    Link, W.A.; Sauer, J.R.

    1995-01-01

    Questions regarding collections of parameter estimates can frequently be expressed in terms of an empirical mixing distribution (EMD). This report discusses empirical Bayes estimation of an EMD, with emphasis on the construction of interval estimates. Estimation of the EMD is accomplished by substitution of estimates of prior parameters in the posterior mean of the EMD. This procedure is examined in a parametric model (the normal-normal mixture) and in a semi-parametric model. In both cases, the empirical Bayes bootstrap of Laird and Louis (1987, Journal of the American Statistical Association 82, 739-757) is used to assess the variability of the estimated EMD arising from the estimation of prior parameters. The proposed methods are applied to a meta-analysis of population trend estimates for groups of birds.
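
    A minimal sketch of empirical Bayes shrinkage under the normal-normal model mentioned above, using simulated trend estimates; the prior parameters are estimated by the method of moments rather than the bootstrap machinery of the paper, and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical population trend estimates for 50 bird species, each observed with
# known sampling standard error s_i around its true trend theta_i.
k = 50
theta = rng.normal(0.5, 1.0, k)        # true trends (unknown in practice)
s = rng.uniform(0.3, 1.2, k)           # sampling standard errors
est = rng.normal(theta, s)             # observed trend estimates

# Method-of-moments estimates of the prior (mixing) distribution N(mu, tau^2).
mu_hat = est.mean()
tau2_hat = max(est.var(ddof=1) - np.mean(s**2), 0.0)

# Posterior mean of each theta_i: shrink the estimate toward mu_hat,
# more strongly when the sampling error is large relative to tau.
shrink = tau2_hat / (tau2_hat + s**2)
post_mean = shrink * est + (1 - shrink) * mu_hat

print(f"prior mean {mu_hat:.2f}, prior sd {np.sqrt(tau2_hat):.2f}")
print(f"raw MSE {np.mean((est - theta)**2):.3f}  shrunken MSE {np.mean((post_mean - theta)**2):.3f}")
```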

  17. Mixed Model Association with Family-Biased Case-Control Ascertainment.

    PubMed

    Hayeck, Tristan J; Loh, Po-Ru; Pollack, Samuela; Gusev, Alexander; Patterson, Nick; Zaitlen, Noah A; Price, Alkes L

    2017-01-05

    Mixed models have become the tool of choice for genetic association studies; however, standard mixed model methods may be poorly calibrated or underpowered under family sampling bias and/or case-control ascertainment. Previously, we introduced a liability threshold-based mixed model association statistic (LTMLM) to address case-control ascertainment in unrelated samples. Here, we consider family-biased case-control ascertainment, where case and control subjects are ascertained non-randomly with respect to family relatedness. Previous work has shown that this type of ascertainment can severely bias heritability estimates; we show here that it also impacts mixed model association statistics. We introduce a family-based association statistic (LT-Fam) that is robust to this problem. Similar to LTMLM, LT-Fam is computed from posterior mean liabilities (PML) under a liability threshold model; however, LT-Fam uses published narrow-sense heritability estimates to avoid the problem of biased heritability estimation, enabling correct calibration. In simulations with family-biased case-control ascertainment, LT-Fam was correctly calibrated (average χ2 = 1.00-1.02 for null SNPs), whereas the Armitage trend test (ATT), standard mixed model association (MLM), and case-control retrospective association test (CARAT) were mis-calibrated (e.g., average χ2 = 0.50-1.22 for MLM, 0.89-2.65 for CARAT). LT-Fam also attained higher power than other methods in some settings. In 1,259 type 2 diabetes-affected case subjects and 5,765 control subjects from the CARe cohort, downsampled to induce family-biased ascertainment, LT-Fam was correctly calibrated whereas ATT, MLM, and CARAT were again mis-calibrated. Our results highlight the importance of modeling family sampling bias in case-control datasets with related samples. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
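
    The posterior mean liability (PML) referred to above reduces, in the simplest case of unrelated individuals with no covariates, to the mean of a truncated standard normal on either side of the liability threshold. A small sketch of that building block (without the relatedness or heritability adjustments that LTMLM and LT-Fam add):

```python
import numpy as np
from scipy.stats import norm

def posterior_mean_liability(prevalence):
    """Mean liability of cases and controls under a standard-normal liability
    threshold model with the given disease prevalence (no covariates, no relatedness)."""
    t = norm.ppf(1 - prevalence)                    # liability threshold
    pml_case = norm.pdf(t) / (1 - norm.cdf(t))      # E[l | l > t]
    pml_control = -norm.pdf(t) / norm.cdf(t)        # E[l | l <= t]
    return t, pml_case, pml_control

for prev in [0.01, 0.08, 0.25]:
    t, case, ctrl = posterior_mean_liability(prev)
    print(f"prevalence {prev:>5.2f}: threshold {t:5.2f}, PML case {case:5.2f}, PML control {ctrl:6.3f}")
```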

  18. Generalized weighted likelihood density estimators with application to finite mixture of exponential family distributions

    PubMed Central

    Zhan, Tingting; Chevoneva, Inna; Iglewicz, Boris

    2010-01-01

    The family of weighted likelihood estimators largely overlaps with minimum divergence estimators. They are robust to data contamination compared to the MLE. We define the class of generalized weighted likelihood estimators (GWLE), provide its influence function, and discuss the efficiency requirements. We introduce a new truncated cubic-inverse weight, which is both first- and second-order efficient and more robust than previously reported weights. We also discuss new ways of selecting the smoothing bandwidth and weighted starting values for the iterative algorithm. The advantage of the truncated cubic-inverse weight is illustrated in a simulation study of a three-component normal mixture model with large overlaps and heavy contamination. A real data example is also provided. PMID:20835375

  19. 12 CFR 965.2 - Authorized liabilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Authorized liabilities. 965.2 Section 965.2 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK LIABILITIES SOURCE OF FUNDS § 965.2 Authorized liabilities. As a source of funds for business operations, each Bank is authorized to...

  20. 24 CFR 203.422 - Right and liability under Mutual Mortgage Insurance Fund.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Mortgage Insurance Fund and Distributive Shares § 203.422 Right and liability under Mutual Mortgage... to any liability arising under the mutuality of the Mutual Mortgage Insurance Fund. ... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Right and liability under Mutual...

  1. 24 CFR 203.422 - Right and liability under Mutual Mortgage Insurance Fund.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Mortgage Insurance Fund and Distributive Shares § 203.422 Right and liability under Mutual Mortgage... to any liability arising under the mutuality of the Mutual Mortgage Insurance Fund. ... 24 Housing and Urban Development 2 2011-04-01 2011-04-01 false Right and liability under Mutual...

  2. 12 CFR 704.8 - Asset and liability management.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... CORPORATE CREDIT UNIONS § 704.8 Asset and liability management. (a) Policies. A corporate credit union must...) The purpose and objectives of the corporate credit union's asset and liability activities; (2) The... used as a basis of estimation. (b) Asset and liability management committee (ALCO). A corporate credit...

  3. 12 CFR 704.8 - Asset and liability management.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CORPORATE CREDIT UNIONS § 704.8 Asset and liability management. (a) Policies. A corporate credit union must...) The purpose and objectives of the corporate credit union's asset and liability activities; (2) The... used as a basis of estimation. (b) Asset and liability management committee (ALCO). A corporate credit...

  4. 14 CFR 1274.916 - Liability and risk of loss.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Liability and risk of loss. 1274.916... AGREEMENTS WITH COMMERCIAL FIRMS Other Provisions and Special Conditions § 1274.916 Liability and risk of..., or indemnification of, developers of experimental aerospace vehicles. Liability and Risk of Loss July...

  5. 12 CFR 229.21 - Civil liability.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 3 2014-01-01 2014-01-01 false Civil liability. 229.21 Section 229.21 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM... Disclosure of Funds Availability Policies § 229.21 Civil liability. (a) Civil liability. A bank that fails to...

  6. 12 CFR 229.21 - Civil liability.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 3 2011-01-01 2011-01-01 false Civil liability. 229.21 Section 229.21 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM... Availability Policies § 229.21 Civil liability. (a) Civil liability. A bank that fails to comply with any...

  7. 12 CFR 229.21 - Civil liability.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 3 2012-01-01 2012-01-01 false Civil liability. 229.21 Section 229.21 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM... Availability Policies § 229.21 Civil liability. (a) Civil liability. A bank that fails to comply with any...

  8. 12 CFR 229.21 - Civil liability.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Civil liability. 229.21 Section 229.21 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM... Availability Policies § 229.21 Civil liability. (a) Civil liability. A bank that fails to comply with any...

  9. 12 CFR 229.21 - Civil liability.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 3 2013-01-01 2013-01-01 false Civil liability. 229.21 Section 229.21 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM... Disclosure of Funds Availability Policies § 229.21 Civil liability. (a) Civil liability. A bank that fails to...

  10. CEC-normalized clay-water sorption isotherm

    NASA Astrophysics Data System (ADS)

    Woodruff, W. F.; Revil, A.

    2011-11-01

    A normalized clay-water isotherm model based on BET theory and describing the sorption and desorption of the bound water in clays, sand-clay mixtures, and shales is presented. Clay-water sorption isotherms (sorption and desorption) of clayey materials are normalized by their cation exchange capacity (CEC), accounting for a correction factor that depends on the type of counterion sorbed on the mineral surface in the so-called Stern layer. With such normalizations, all the data collapse into two master curves, one for sorption and one for desorption, independent of the clay mineralogy, crystallographic considerations, and bound cation type, thereby neglecting the true heterogeneity of water sorption/desorption in smectite. The two master curves show the general hysteretic behavior of the capillary pressure curve at low relative humidity (below 70%). The model is validated against several data sets obtained from the literature comprising a broad range of clay types and clay mineralogies. The CEC values, derived by inverting the sorption/desorption curves using a Markov chain Monte Carlo approach, are consistent with the CEC associated with the clay mineralogy.

  11. Incorporation of β-glucans in meat emulsions through an optimal mixture modeling systems.

    PubMed

    Vasquez Mejia, Sandra M; de Francisco, Alicia; Manique Barreto, Pedro L; Damian, César; Zibetti, Andre Wüst; Mahecha, Hector Suárez; Bohrer, Benjamin M

    2018-09-01

    The effects of β-glucans (βG) in beef emulsions with carrageenan and starch were evaluated using an optimal mixture modeling system. The best mathematical models to describe the cooking loss, color, and textural profile analysis (TPA) were selected and optimized. Cubic models best described the cooking loss, color, and TPA parameters, with the exception of springiness. Emulsions with greater levels of βG and starch had less cooking loss (<1%), intermediate L* (>54 and <62), and greater hardness, cohesiveness, and springiness values. Subsequently, during the optimization phase, the use of carrageenan was eliminated. The optimized emulsion contained 3.13 ± 0.11% βG, which could cover the recommended daily intake of βG. However, the hardness of the optimized emulsion was greater (60,224 ± 1025 N) than expected. The optimized emulsion had a homogeneous structure and normal thermal behavior by DSC and allowed for the manufacture of products with high amounts of βG and desired functional attributes. Copyright © 2018 Elsevier Ltd. All rights reserved.
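
    A minimal sketch of fitting a Scheffé-type mixture (blending) model by least squares on hypothetical three-ingredient data; the special-cubic form below is a simplification of the full cubic models selected in the study, and all proportions, responses, and coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical ingredient proportions (beta-glucan, carrageenan, starch) on the simplex.
props = rng.dirichlet(np.ones(3), size=40)
x1, x2, x3 = props.T

# Hypothetical response (e.g. cooking loss) generated from an unknown blending surface.
y = 4*x1 + 9*x2 + 6*x3 - 5*x1*x2 + 3*x1*x3 + 12*x1*x2*x3 + rng.normal(0, 0.2, 40)

# Scheffe special-cubic mixture model: linear, binary-blending, and ternary-blending terms
# (no intercept, because the proportions sum to one).
X = np.column_stack([x1, x2, x3, x1*x2, x1*x3, x2*x3, x1*x2*x3])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted blending coefficients:", np.round(coef, 2))

# Predict the response for a candidate formulation, e.g. 30% / 10% / 60%.
cand = np.array([0.3, 0.1, 0.6])
terms = np.array([cand[0], cand[1], cand[2], cand[0]*cand[1], cand[0]*cand[2],
                  cand[1]*cand[2], cand[0]*cand[1]*cand[2]])
print(f"predicted response at 0.30/0.10/0.60: {terms @ coef:.2f}")
```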

  12. The effects of temperature on nitrous oxide and oxygen mixture homogeneity and stability.

    PubMed

    Litwin, Patrick D

    2010-10-15

    For many long-standing practices, the rationale is often lost as time passes. This is the case for the storage and handling of equimolar 50%/50% volume/volume (v/v) mixtures of nitrous oxide and oxygen. A review of the existing literature was undertaken to examine the developmental history of nitrous oxide and oxygen mixtures for anesthesia and analgesia, to ascertain whether sufficient bibliographic data exist to support the position that the contents of a filled cylinder of a 50%/50% v/v mixture of nitrous oxide and oxygen remain in a homogeneous single gas phase under normal conditions of handling and storage, and to determine whether the standard instructions given for handling before use are justified. After ranking and removing duplicates, a total of fifteen articles were identified by the various search strategies and formed the basis of this review. Several studies confirmed that a 50%/50% v/v mixture of nitrous oxide and oxygen is in a homogeneous single gas phase in a filled cylinder under normal conditions of handling and storage. The effect of temperature on the phase of the nitrous oxide in this mixture was further examined by several authors. These studies demonstrated that although condensation and phase separation can be induced by cooling the cylinder, allowing the cylinder to rewarm to room temperature for at least 48 hours, preferably in a horizontal orientation, and inverting it three times before use ensures that the cylinder consistently delivers the proper proportions of the component gases as a homogeneous mixture. The contents of a filled cylinder of a 50%/50% v/v mixture of nitrous oxide and oxygen are therefore in a homogeneous single gas phase under normal conditions of handling and storage, and the standard instructions given for handling before use are justified by previously conducted studies.

  13. [Organisational responsibility versus individual responsibility: safety culture? About the relationship between patient safety and medical malpractice law].

    PubMed

    Hart, Dieter

    2009-01-01

    The contribution is concerned with the correlations between risk information, patient safety, responsibility and liability, in particular in terms of liability law. These correlations have an impact on safety culture in healthcare, which can be evaluated positively if--in addition to good quality of medical care--as many sources of error as possible can be identified, analysed, and minimised or eliminated by corresponding measures (safety or risk management). Liability influences the conduct of individuals and enterprises; safety is (probably) also a function of liability; this should also apply to safety culture. The standard of safety culture does not only depend on individual liability for damages, but first of all on strict enterprise liability (system responsibility) and its preventive effects. Patient safety through quality and risk management is therefore also an organisational programme of considerable relevance in terms of liability law.

  14. 26 CFR 50.5 - Liability for the tax.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 17 2011-04-01 2011-04-01 false Liability for the tax. 50.5 Section 50.5... TAXES (CONTINUED) REGULATIONS RELATING TO THE TAX IMPOSED WITH RESPECT TO CERTAIN HYDRAULIC MINING § 50.5 Liability for the tax. Liability for tax attaches to any person engaged at any time during the...

  15. 26 CFR 301.7122-1 - Compromises.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... internal revenue laws prior to reference of a case involving such a liability to the Department of Justice... existence or amount of the correct tax liability under the law. Doubt as to liability does not exist where... of the full liability would undermine public confidence that the tax laws are being administered in a...

  16. 26 CFR 301.7122-1 - Compromises.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... internal revenue laws prior to reference of a case involving such a liability to the Department of Justice... existence or amount of the correct tax liability under the law. Doubt as to liability does not exist where... of the full liability would undermine public confidence that the tax laws are being administered in a...

  17. 25 CFR 141.57 - Procedures to cancel liability on bond.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Procedures to cancel liability on bond. 141.57 Section 141.57 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES BUSINESS... Procedures to cancel liability on bond. (a) Any surety who wishes to be relieved from liability arising on a...

  18. Governing Board and Administrator Liability. ERIC/Higher Education Research Report No. 9.

    ERIC Educational Resources Information Center

    Hendrickson, Robert M.; Mangum, Ronald Scott

    Matters of legal liability that are of concern to institutions of higher education are discussed in some detail in language for the layman. Among the subjects discussed are: the development of charitable corporations, and immunity prerogatives; the traditional bases of legal liability; liability for the new torts, including violation of…

  19. 76 FR 80410 - Pendency of Request for Approval of Special Withdrawal Liability Rules; the Cultural Institutions...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-23

    ... Liability Rules; the Cultural Institutions Pension Plan AGENCY: Pension Benefit Guaranty Corporation. ACTION... approval of a plan amendment providing for special withdrawal liability rules. Under Sec. 4203(f) of the... Liability Rules, a multiemployer pension plan may, with PBGC approval, be amended to provide for special...

  20. 37 CFR 10.78 - Limiting liability to client.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Limiting liability to client... Office Code of Professional Responsibility § 10.78 Limiting liability to client. A practitioner shall not attempt to exonerate himself or herself from, or limit his or her liability to, a client for his or her...

  1. 26 CFR 1.404(g)-1 - Deduction of employer liability payments.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 5 2012-04-01 2011-04-01 true Deduction of employer liability payments. 1.404(g)-1 Section 1.404(g)-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY.... § 1.404(g)-1 Deduction of employer liability payments. (a) General rule. Employer liability payments...

  2. 48 CFR 47.207-7 - Liability and insurance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... liability for injury to persons or damage to property other than the freight being transported; (2) The contractor's liability for loss of and/or damage to the freight being transported; and (3) The amount of... damage to the freight being transported is not specified, the usual measure of liability as prescribed in...

  3. 48 CFR 47.207-7 - Liability and insurance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... liability for injury to persons or damage to property other than the freight being transported; (2) The contractor's liability for loss of and/or damage to the freight being transported; and (3) The amount of... damage to the freight being transported is not specified, the usual measure of liability as prescribed in...

  4. 48 CFR 47.207-7 - Liability and insurance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... liability for injury to persons or damage to property other than the freight being transported; (2) The contractor's liability for loss of and/or damage to the freight being transported; and (3) The amount of... damage to the freight being transported is not specified, the usual measure of liability as prescribed in...

  5. 48 CFR 47.207-7 - Liability and insurance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... liability for injury to persons or damage to property other than the freight being transported; (2) The contractor's liability for loss of and/or damage to the freight being transported; and (3) The amount of... damage to the freight being transported is not specified, the usual measure of liability as prescribed in...

  6. 48 CFR 47.207-7 - Liability and insurance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... liability for injury to persons or damage to property other than the freight being transported; (2) The contractor's liability for loss of and/or damage to the freight being transported; and (3) The amount of... damage to the freight being transported is not specified, the usual measure of liability as prescribed in...

  7. 26 CFR 1.338-5 - Adjusted grossed-up basis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... target. AGUB is the amount for which new target is deemed to have purchased all of its assets in the... (iii) The liabilities of new target. (2) Time and amount of AGUB—(i) Original determination. AGUB is.... (e) Liabilities of new target—(1) In general. The liabilities of new target are the liabilities of...

  8. 26 CFR 50.5 - Liability for the tax.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 17 2010-04-01 2010-04-01 false Liability for the tax. 50.5 Section 50.5... TAXES (CONTINUED) REGULATIONS RELATING TO THE TAX IMPOSED WITH RESPECT TO CERTAIN HYDRAULIC MINING § 50.5 Liability for the tax. Liability for tax attaches to any person engaged at any time during the...

  9. Teacher Liability in School-Shop Accidents.

    ERIC Educational Resources Information Center

    Kegin, Denis J.

    The intent of the book is to stimulate interest in the problem of shop-teacher liability and to identify certain needs which have not been adequately met by existing laws and statutes. Chapter 1, The Significance of Teacher Liability, discusses basic legal considerations, the environment of the school shop, and the possibility of liability.…

  10. 40 CFR 113.4 - Size classes and associated liability limits for fixed onshore oil storage facilities, 1,000...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 21 2010-07-01 2010-07-01 false Size classes and associated liability... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS LIABILITY LIMITS FOR... privity and knowledge of the owner or operator, the following limits of liability are established for...

  11. 14 CFR 1266.104 - Cross-waiver of liability for launch agreements for science or space exploration activities...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... agreements for science or space exploration activities unrelated to the International Space Station. 1266.104... LIABILITY § 1266.104 Cross-waiver of liability for launch agreements for science or space exploration... cross-waiver of liability between the parties to agreements for NASA's science or space exploration...

  12. 14 CFR 1266.104 - Cross-waiver of liability for launch agreements for science or space exploration activities...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... agreements for science or space exploration activities unrelated to the International Space Station. 1266.104... LIABILITY § 1266.104 Cross-waiver of liability for launch agreements for science or space exploration... cross-waiver of liability between the parties to agreements for NASA's science or space exploration...

  13. 14 CFR 1266.104 - Cross-waiver of liability for launch agreements for science or space exploration activities...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... agreements for science or space exploration activities unrelated to the International Space Station. 1266.104... LIABILITY § 1266.104 Cross-waiver of liability for launch agreements for science or space exploration... cross-waiver of liability between the parties to agreements for NASA's science or space exploration...

  14. 29 CFR 790.4 - Liability of employer; effect of contract, custom, or practice.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., this section relieves the employer from certain liabilities or punishments to which he might otherwise... by such employee is relieved from liability or punishment therefor if, and only if, such activities... an employer of liability or punishment only with respect to activities of the kind described, which...

  15. 29 CFR 790.4 - Liability of employer; effect of contract, custom, or practice.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., this section relieves the employer from certain liabilities or punishments to which he might otherwise... by such employee is relieved from liability or punishment therefor if, and only if, such activities... an employer of liability or punishment only with respect to activities of the kind described, which...

  16. 29 CFR 790.4 - Liability of employer; effect of contract, custom, or practice.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., this section relieves the employer from certain liabilities or punishments to which he might otherwise... by such employee is relieved from liability or punishment therefor if, and only if, such activities... an employer of liability or punishment only with respect to activities of the kind described, which...

  17. 29 CFR 790.4 - Liability of employer; effect of contract, custom, or practice.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., this section relieves the employer from certain liabilities or punishments to which he might otherwise... by such employee is relieved from liability or punishment therefor if, and only if, such activities... an employer of liability or punishment only with respect to activities of the kind described, which...

  18. Understanding the covariation of tics, attention-deficit/hyperactivity, and obsessive-compulsive symptoms: A population-based adult twin study.

    PubMed

    Pinto, Rebecca; Monzani, Benedetta; Leckman, James F; Rück, Christian; Serlachius, Eva; Lichtenstein, Paul; Mataix-Cols, David

    2016-10-01

    Chronic tic disorders (TD), attention-deficit/hyperactivity-disorder (ADHD), and obsessive-compulsive disorder (OCD) frequently co-occur in clinical and epidemiological samples. Family studies have found evidence of shared familial transmission between TD and OCD, whereas the familial association between these disorders and ADHD is less clear. This study aimed to investigate to what extent liability of tics, attention-deficit/hyperactivity, and obsessive-compulsive symptoms is caused by shared or distinct genetic or environmental influences, in a large population-representative sample of Swedish adult twins (n = 21,911). Tics, attention-deficit/hyperactivity, and obsessive-compulsive symptoms showed modest, but significant covariation. Model fitting suggested a latent liability factor underlying the three phenotypes. This common factor was relatively heritable, and explained significantly less of the variance of attention-deficit/hyperactivity symptom liability. The majority of genetic variance was specific rather than shared. The greatest proportion of total variance in liability of tics, attention-deficit/hyperactivity, and obsessive-compulsive symptoms was attributed to specific non-shared environmental influences. Our findings suggest that the co-occurrence of tics and obsessive-compulsive symptoms, and to a lesser extent attention-deficit/hyperactivity symptoms, can be partly explained by shared etiological influences. However, these phenotypes do not appear to be alternative expressions of the same underlying genetic liability. Further research examining sub-dimensions of these phenotypes may serve to further clarify the association between these disorders and identify more genetically homogenous symptom subtypes. © 2016 Wiley Periodicals, Inc.

  19. An experimental study of catechol-o-methyltransferase Val158Met moderation of delta-9-tetrahydrocannabinol-induced effects on psychosis and cognition.

    PubMed

    Henquet, Cécile; Rosa, Araceli; Krabbendam, Lydia; Papiol, Sergi; Fananás, Lourdes; Drukker, Marjan; Ramaekers, Johannes G; van Os, Jim

    2006-12-01

    Observational studies have suggested that psychometric psychosis liability and a functional polymorphism in the catechol-O-methyltransferase (COMT Val(158)Met) gene moderate the psychosis-inducing effect of cannabis. To replicate and extend this finding, a double-blind, placebo-controlled cross-over design was used in which patients with a psychotic disorder (n=30), relatives of patients with a psychotic disorder (n=12), and healthy controls (n=32) were exposed to Delta-9-tetrahydrocannabinol (Delta-9-THC, the principal component of cannabis) or placebo, followed by cognitive assessment and assessment of current psychotic experiences. Previous expression of psychometric psychosis liability was also assessed. Models of current psychotic experiences and cognition were examined with multilevel random regression analyses to assess (i) main effects of genotype and condition, (ii) interactions between condition and genotype, and (iii) three-way interactions between condition, genotype, and psychometric psychosis liability. Carriers of the Val allele were most sensitive to Delta-9-THC-induced psychotic experiences, but this was conditional on prior evidence of psychometric psychosis liability. Delta-9-THC impacted negatively on cognitive measures. Carriers of the Val allele were also more sensitive to Delta-9-THC-induced memory and attention impairments compared to carriers of the Met allele. Experimental effects of Delta-9-THC on cognition and psychosis are moderated by COMT Val(158)Met genotype, but the effects may in part be conditional on the additional presence of pre-existing psychosis liability. The association between cannabis and psychosis may represent higher order gene-environment and gene-gene interactions.

  20. Liability concerns and shared use of school recreational facilities in underserved communities.

    PubMed

    Spengler, John O; Connaughton, Daniel P; Maddock, Jason E

    2011-10-01

    In underserved communities, schools can provide the physical structure and facilities for informal and formal recreation as well as after-school, weekend, and summer programming. The importance of community access to schools is acknowledged by authoritative groups; however, fear of liability is believed to be a key barrier to community access. The purpose of this study was to investigate perceptions of liability risk and associated issues among school administrators in underserved communities. A national survey of school administrators in underserved communities (n=360, response rate of 21%) was conducted in 2009 and analyzed in 2010. Liability perceptions in the context of community access were assessed through descriptive statistics. The majority of respondents (82.2%) indicated concern for liability should someone be injured on school property after hours while participating in a recreational activity. Among those that did not allow community access, 91% were somewhat to very concerned about liability and 86% believed that stronger legislation was needed to better protect schools from liability for after-hours recreational use. Among those who claimed familiarity with a state law that offered them limited liability protection, nearly three fourths were nevertheless concerned about liability. Liability concerns are prevalent among this group of school administrators, particularly if they had been involved in prior litigation, and even if they indicated they were aware of laws that provide liability protection where use occurs after hours. Reducing these concerns will be important if schools are to become locations for recreational programs that promote physical activity outside of regular school hours. Copyright © 2011 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  1. Opportunities for the replacement of animals in the study of nausea and vomiting

    PubMed Central

    Holmes, AM; Rudd, JA; Tattersall, FD; Aziz, Q; Andrews, PLR

    2009-01-01

    Nausea and vomiting are among the most common symptoms encountered in medicine as either symptoms of disease or side effects of treatments. Developing novel anti-emetics and identifying emetic liability in novel chemical entities rely on models that can recreate the complexity of these multi-system reflexes. Animal models (especially the ferret and dog) are the current gold standard; however, the selection of appropriate models is still a matter of debate, especially when studying the subjective human sensation of nausea. Furthermore, these studies are associated with animal suffering. Here, following a recent workshop held to review the utility of animal models in nausea and vomiting research, we discuss the limitations of some of the current models in the context of basic research, anti-emetic development and emetic liability detection. We provide suggestions for how these limitations may be overcome using non-animal alternatives, including greater use of human volunteers, in silico and in vitro techniques and lower organisms. PMID:19371333

  2. Genetic and environmental influences on the familial transmission of externalizing disorders in adoptive and twin offspring.

    PubMed

    Hicks, Brian M; Foster, Katherine T; Iacono, William G; McGue, Matt

    2013-10-01

    Twin-family studies have shown that parent-child resemblance on substance use disorders and antisocial behavior can be accounted for by the transmission of a general liability to a spectrum of externalizing disorders. Most studies, however, include only biological parents and offspring, which confound genetic and environmental transmission effects. To examine the familial transmission of externalizing disorders among both adoptive (genetically unrelated) and biological relatives to better distinguish genetic and environmental mechanisms of transmission. Family study design wherein each family included the mother, father, and 2 offspring, including monozygotic twin, dizygotic twin, nontwin biological, and adoptive offspring. Structural equation modeling was used to estimate familial transmission effects and their genetic and environmental influences. Participants were recruited from the community and assessed at a university laboratory. A total of 1590 families with biological offspring and 409 families with adoptive offspring. Offspring participants were young adults (mean age, 26.2 years). Symptom counts of conduct disorder, adult antisocial behavior, and alcohol, nicotine, and drug dependence. There was a medium effect for the transmission of the general externalizing liability for biological parents (r = 0.27-0.30) but not for adoptive parents (r = 0.03-0.07). In contrast, adoptive siblings exhibited significant similarity on the general externalizing liability (r = 0.21). Biometric analyses revealed that the general externalizing liability was highly heritable (a2 = 0.61) but also exhibited significant shared environmental influences (c2 = 0.20). Parent-child resemblance for substance use disorders and antisocial behavior is primarily due to the genetic transmission of a general liability to a spectrum of externalizing disorders. Including adoptive siblings revealed a greater role of shared environmental influences on the general externalizing liability than previously detected in twin studies and indicates that sibling rather than parent-child similarity indexes important environmental risk factors for externalizing disorders.

  3. Discrim: a computer program using an interactive approach to dissect a mixture of normal or lognormal distributions

    USGS Publications Warehouse

    Bridges, N.J.; McCammon, R.B.

    1980-01-01

    DISCRIM is an interactive computer graphics program that dissects mixtures of normal or lognormal distributions. The program was written in an effort to obtain a more satisfactory solution to the dissection problem than that offered by a graphical or numerical approach alone. It combines graphic and analytic techniques using a Tektronix terminal in a time-share computing environment. The main program and subroutines were written in the FORTRAN language. © 1980.

  4. The application of Gaussian mixture models for signal quantification in MALDI-TOF mass spectrometry of peptides.

    PubMed

    Spainhour, John Christian G; Janech, Michael G; Schwacke, John H; Velez, Juan Carlos Q; Ramakrishnan, Viswanathan

    2014-01-01

    Matrix assisted laser desorption/ionization time-of-flight (MALDI-TOF) coupled with stable isotope standards (SIS) has been used to quantify native peptides. This MALDI-TOF peptide quantification approach has difficulties quantifying samples containing peptides with ion currents in overlapping spectra. In these overlapping spectra the currents sum together, which modifies the peak heights and makes normal SIS estimation problematic. An approach using Gaussian mixtures based on known physical constants to model the isotopic cluster of a known compound is proposed here. The characteristics of this approach are examined for single and overlapping compounds. The approach is compared to two commonly used SIS quantification methods for single compounds, namely the peak intensity method and the Riemann sum area under the curve (AUC) method. For studying the characteristics of the Gaussian mixture method, Angiotensin II, Angiotensin-2-10, and Angiotensin-1-9 and their associated SIS peptides were used. The findings suggest that the Gaussian mixture method has characteristics similar to the two comparison methods for estimating the quantity of isolated isotopic clusters of single compounds. All three methods were tested using MALDI-TOF mass spectra collected for peptides of the renin-angiotensin system. The Gaussian mixture method accurately estimated the native to labeled ratio of several isolated angiotensin peptides (5.2% error in ratio estimation) with similar estimation errors to those calculated using the peak intensity and Riemann sum AUC methods (5.9% and 7.7%, respectively). For overlapping angiotensin peptides (where the other two methods are not applicable), the estimation error of the Gaussian mixture was 6.8%, which is within the acceptable range. In summary, for single compounds the Gaussian mixture method is equivalent or marginally superior to the existing methods of peptide quantification and is capable of quantifying overlapping (convolved) peptides within the acceptable margin of error.
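
    A simplified sketch of the idea on a synthetic spectrum: each compound is reduced to a single Gaussian peak (the paper models the full isotopic cluster as a Gaussian mixture), the two overlapping peaks are fitted jointly with scipy's curve_fit, and the native-to-labeled ratio is recovered from the fitted peak areas. All m/z values and intensities are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(mz, a1, c1, s1, a2, c2, s2):
    """Sum of two Gaussian peaks (a simplification of the per-isotope mixture model)."""
    g1 = a1 * np.exp(-0.5 * ((mz - c1) / s1) ** 2)
    g2 = a2 * np.exp(-0.5 * ((mz - c2) / s2) ** 2)
    return g1 + g2

rng = np.random.default_rng(7)
mz = np.linspace(1040, 1060, 600)

# Synthetic overlapping native / isotope-labeled peaks plus noise (hypothetical values).
true = two_gaussians(mz, 800, 1046.5, 0.6, 400, 1049.5, 0.6)
signal = true + rng.normal(0, 10, mz.size)

p0 = [600, 1046, 0.5, 300, 1050, 0.5]                    # rough initial guesses
popt, _ = curve_fit(two_gaussians, mz, signal, p0=p0)

# Peak area of a Gaussian is amplitude * sigma * sqrt(2*pi); the ratio of areas
# estimates the native-to-labeled ratio.
a1, _, s1, a2, _, s2 = popt
area_ratio = (a1 * abs(s1)) / (a2 * abs(s2))
print(f"estimated native-to-labeled ratio: {area_ratio:.2f} (true 2.00)")
```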

  5. Using a multinomial tree model for detecting mixtures in perceptual detection

    PubMed Central

    Chechile, Richard A.

    2014-01-01

    In the area of memory research there have been two rival approaches for memory measurement—signal detection theory (SDT) and multinomial processing trees (MPT). Both approaches provide measures for the quality of the memory representation, and both approaches provide for corrections for response bias. In recent years there has been a strong case advanced for the MPT approach because of the finding of stochastic mixtures on both target-present and target-absent tests. In this paper a case is made that perceptual detection, like memory recognition, involves a mixture of processes that are readily represented as a MPT model. The Chechile (2004) 6P memory measurement model is modified in order to apply to the case of perceptual detection. This new MPT model is called the Perceptual Detection (PD) model. The properties of the PD model are developed, and the model is applied to some existing data of a radiologist examining CT scans. The PD model brings out novel features that were absent from a standard SDT analysis. Also the topic of optimal parameter estimation on an individual-observer basis is explored with Monte Carlo simulations. These simulations reveal that the mean of the Bayesian posterior distribution is a more accurate estimator than the corresponding maximum likelihood estimator (MLE). Monte Carlo simulations also indicate that model estimates based on only the data from an individual observer can be improved upon (in the sense of being more accurate) by an adjustment that takes into account the parameter estimate based on the data pooled across all the observers. The adjustment of the estimate for an individual is discussed as an analogous statistical effect to the improvement over the individual MLE demonstrated by the James–Stein shrinkage estimator in the case of the multiple-group normal model. PMID:25018741
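
    The individual-observer point about posterior means versus maximum likelihood can be illustrated with a much simpler stand-in than the PD model: a beta-binomial estimate of a detection probability. The simulation below (arbitrary settings) shows the posterior mean under a uniform prior having lower mean squared error than the raw MLE when the number of trials per observer is small.

```python
import numpy as np

rng = np.random.default_rng(8)

n_trials = 20                                   # target-present trials per simulated observer
n_sims = 20000
theta = rng.uniform(0.2, 0.95, n_sims)          # true detection probabilities
hits = rng.binomial(n_trials, theta)

# Maximum likelihood estimate versus posterior mean under a uniform Beta(1, 1) prior.
mle = hits / n_trials
post_mean = (hits + 1) / (n_trials + 2)

mse_mle = np.mean((mle - theta) ** 2)
mse_pm = np.mean((post_mean - theta) ** 2)
print(f"MSE of MLE: {mse_mle:.4f}   MSE of posterior mean: {mse_pm:.4f}")
```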

  6. Coleman Revisited: School Segregation, Peers, and Frog Ponds

    ERIC Educational Resources Information Center

    Goldsmith, Pat Rubio

    2011-01-01

    Students from minority segregated schools tend to achieve and attain less than similar students from White segregated schools. This study examines whether peer effects can explain this relationship using normative models and frog-pond models. Normative models (where peers become alike) suggest that minority schoolmates are a liability. Frog-pond…

  7. 31 CFR 50.90 - Cap on annual liability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Cap on annual liability. 50.90 Section 50.90 Money and Finance: Treasury Office of the Secretary of the Treasury TERRORISM RISK INSURANCE PROGRAM Cap on Annual Liability § 50.90 Cap on annual liability. Pursuant to Section 103 of the Act, if...

  8. 29 CFR 4062.7 - Calculating interest on liability and refunds of overpayments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... termination date, on any unpaid portion of the liability. Such interest accrues at the rate set forth in... amount of liability under this part, the PBGC shall refund the excess amount, with interest at the rate... compounded daily. (c) Interest rate. The interest rate on liability under this part and refunds thereof is...

  9. 26 CFR 1.338-5 - Adjusted grossed-up basis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... target. AGUB is the amount for which new target is deemed to have purchased all of its assets in the... (iii) The liabilities of new target. (2) Time and amount of AGUB—(i) Original determination. AGUB is...) Liabilities of new target—(1) In general. The liabilities of new target are the liabilities of target as of...

  10. 26 CFR 1.338-5 - Adjusted grossed-up basis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... target. AGUB is the amount for which new target is deemed to have purchased all of its assets in the... (iii) The liabilities of new target. (2) Time and amount of AGUB—(i) Original determination. AGUB is...) Liabilities of new target—(1) In general. The liabilities of new target are the liabilities of target as of...

  11. 26 CFR 1.338-5 - Adjusted grossed-up basis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the amount for which new target is deemed to have purchased all of its assets in the deemed purchase... (iii) The liabilities of new target. (2) Time and amount of AGUB—(i) Original determination. AGUB is...) Liabilities of new target—(1) In general. The liabilities of new target are the liabilities of target as of...

  12. 12 CFR 360.8 - Method for determining deposit and other liability account balances at a failed insured...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... liability account balances at a failed insured depository institution. 360.8 Section 360.8 Banks and Banking... RECEIVERSHIP RULES § 360.8 Method for determining deposit and other liability account balances at a failed... FDIC will use to determine deposit and other liability account balances for insurance coverage and...

  13. 12 CFR 360.8 - Method for determining deposit and other liability account balances at a failed insured...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... liability account balances at a failed insured depository institution. 360.8 Section 360.8 Banks and Banking... RECEIVERSHIP RULES § 360.8 Method for determining deposit and other liability account balances at a failed... FDIC will use to determine deposit and other liability account balances for insurance coverage and...

  14. 75 FR 48994 - Records Schedules; Availability and Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-12

    ..., asbestos liability, civil rights-- employment, civil rights--housing/accommodations, civil rights-- welfare..., product liability, asbestos liability, civil rights-- employment, civil rights--housing/accommodations...

  15. Influence of the normalized ion flux on the constitution of alumina films deposited by plasma-assisted chemical vapor deposition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurapov, Denis; Reiss, Jennifer; Trinh, David H.

    2007-07-15

    Alumina thin films were deposited onto tempered hot working steel substrates from an AlCl{sub 3}-O{sub 2}-Ar-H{sub 2} gas mixture by plasma-assisted chemical vapor deposition. The normalized ion flux was varied during deposition through changes in precursor content while keeping the cathode voltage and the total pressure constant. As the precursor content in the total gas mixture was increased from 0.8% to 5.8%, the deposition rate increased 12-fold, while the normalized ion flux decreased by approximately 90%. The constitution, morphology, impurity incorporation, and the elastic properties of the alumina thin films were found to depend on the normalized ion flux. These changes in structure, composition, and properties induced by normalized ion flux may be understood by considering mechanisms related to surface and bulk diffusion.

  16. Evaluating cardiac risk: exposure response analysis in early clinical drug development.

    PubMed

    Grenier, Julie; Paglialunga, Sabina; Morimoto, Bruce H; Lester, Robert M

    2018-01-01

    The assessment of a drug's cardiac liability has undergone considerable metamorphosis by regulators since the International Council for Harmonization of Technical Requirements for Pharmaceuticals for Human Use (ICH) E14 guideline was introduced in 2005. Drug developers now have a choice in how proarrhythmia risk can be evaluated; the options include a dedicated thorough QT (TQT) study or exposure response (ER) modeling of intensive electrocardiogram (ECG) data captured in early clinical development. The ER modeling alternative was incorporated into a guidance document in 2015 as a primary analysis tool that can be applied in early phase dose escalation studies in place of a dedicated TQT trial. This review describes the current state of ER modeling of intensive ECG data collected during early clinical drug development, the requirements with regard to the use of a positive control, and the challenges and opportunities of this alternative approach to assessing QT liability.
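
    A minimal sketch of the kind of concentration-QTc exposure-response analysis described above, run on simulated data: regress baseline-corrected QTc change on plasma concentration and project the effect at an assumed Cmax against the roughly 10 ms threshold of regulatory concern. The data, the purely linear model and the Cmax value are illustrative assumptions, not a regulatory analysis.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        conc = rng.uniform(0.0, 500.0, 120)          # ng/mL, pooled samples across doses (simulated)
        delta_qtc = 1.0 + 0.012 * conc + rng.normal(0.0, 5.0, conc.size)  # ms (simulated)

        fit = stats.linregress(conc, delta_qtc)      # simple linear exposure-response model
        cmax = 450.0                                 # ng/mL, assumed observed Cmax
        pred_at_cmax = fit.intercept + fit.slope * cmax
        print(f"slope = {fit.slope:.4f} ms per ng/mL; "
              f"predicted dQTc at Cmax = {pred_at_cmax:.1f} ms (vs ~10 ms threshold)")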

  17. Relationship Between Speed of Sound in and Density of Normal and Diseased Rat Livers

    NASA Astrophysics Data System (ADS)

    Hachiya, Hiroyuki; Ohtsuki, Shigeo; Tanaka, Motonao

    1994-05-01

    Speed of sound is an important acoustic parameter for quantitative characterization of living tissues. In this paper, the relationship between the speed of sound in and the density of rat liver tissues is investigated. The speed of sound was measured by the nondeformable technique based on frequency-time analysis of a 3.5 MHz pulse response. The speed of sound in normal livers varied minimally between individuals and was not related to body weight or age. In liver tissues from animals administered CCl4, the speed of sound was lower than in normal tissues. The relationship between speed of sound and density in normal, fatty and cirrhotic livers is fitted well by the line estimated using the immiscible liquid model, which assumes a mixture of normal liver and fat tissues. For 3.5 MHz ultrasound, it is considered that the speed of sound in fresh liver with fatty degeneration is governed mainly by the fat content and is not strongly dependent on the degree of fibrosis.
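
    A minimal sketch of the immiscible liquid (Wood-type) mixture model referred to above: the mixture density is volume-weighted and the mixture compressibility is volume-weighted, from which the speed of sound follows. The pure-component values are rough illustrative numbers, not the paper's measurements.

        import numpy as np

        rho_liver, c_liver = 1060.0, 1580.0   # kg/m^3, m/s (assumed normal liver)
        rho_fat, c_fat = 950.0, 1450.0        # kg/m^3, m/s (assumed fat)

        def mixture(phi_fat):
            """Density and speed of sound for a fat volume fraction phi_fat."""
            rho = phi_fat * rho_fat + (1.0 - phi_fat) * rho_liver
            kappa = (phi_fat / (rho_fat * c_fat ** 2)
                     + (1.0 - phi_fat) / (rho_liver * c_liver ** 2))  # mixture compressibility
            return rho, 1.0 / np.sqrt(rho * kappa)

        for phi in (0.0, 0.2, 0.5, 1.0):
            rho, c = mixture(phi)
            print(f"fat fraction {phi:.1f}: density {rho:7.1f} kg/m^3, speed of sound {c:7.1f} m/s")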

  18. 241Am Ingrowth and Its Effect on Internal Dose

    DOE PAGES

    Konzen, Kevin

    2016-07-01

    Generally, plutonium has been manufactured to support commercial and military applications involving heat sources, weapons and reactor fuel. This work focuses on three typical plutonium mixtures, while observing the potential of 241Am ingrowth and its effect on internal dose. The term “ingrowth” is used to describe 241Am production due solely from the decay of 241Pu as part of a plutonium mixture, where it is initially absent or present in a smaller quantity. Dose calculation models do not account for 241Am ingrowth unless the 241Pu quantity is specified. This work suggested that 241Am ingrowth be considered in bioassay analysis when there is a potential of a 10% increase to the individual’s committed effective dose. It was determined that plutonium fuel mixtures, initially absent of 241Am, would likely exceed 10% for typical reactor grade fuel aged less than 30 years; however, heat source grade and aged weapons grade fuel would normally fall below this threshold. In conclusion, although this work addresses typical plutonium mixtures following separation, it may be extended to irradiated commercial uranium fuel and is expected to be a concern in the recycling of spent fuel.
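
    The ingrowth itself follows the standard two-member Bateman solution; the sketch below computes 241Am ingrowth from an initially pure 241Pu inventory using standard half-lives. The initial 241Pu content is an illustrative assumption.

        import numpy as np

        LN2 = np.log(2.0)
        T_PU241, T_AM241 = 14.29, 432.6            # half-lives in years
        lam_pu, lam_am = LN2 / T_PU241, LN2 / T_AM241

        def am241_atoms(n_pu0, t):
            """241Am atoms at time t (years) grown in from an initially pure 241Pu inventory."""
            return n_pu0 * lam_pu / (lam_am - lam_pu) * (np.exp(-lam_pu * t) - np.exp(-lam_am * t))

        n_pu0 = 1.0e20                             # initial 241Pu atoms (illustrative)
        for t in (1, 5, 10, 30, 70):
            frac = am241_atoms(n_pu0, t) / n_pu0
            print(f"after {t:3d} y: 241Am amounts to {frac:6.3%} of the initial 241Pu atoms")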

  20. 12 CFR 303.15 - Certain limited liability companies deemed incorporated under State law.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Certain limited liability companies deemed... liability companies deemed incorporated under State law. (a) For purposes of the definition of “State bank... liability company (LLC) under the law of any State is deemed to be “incorporated” under the law of the State...

  1. 26 CFR 1.934-1 - Limitation on reduction in income tax liability incurred to the Virgin Islands.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Islands will be computed as follows: (A) Add to the income tax liability incurred to the Virgin Islands...) Add to the income tax liability incurred to the Virgin Islands any credit against the tax allowed... 26 Internal Revenue 10 2010-04-01 2010-04-01 false Limitation on reduction in income tax liability...

  2. 26 CFR 1.934-1 - Limitation on reduction in income tax liability incurred to the Virgin Islands.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Islands will be computed as follows: (A) Add to the income tax liability incurred to the Virgin Islands...) Add to the income tax liability incurred to the Virgin Islands any credit against the tax allowed... 26 Internal Revenue 10 2013-04-01 2013-04-01 false Limitation on reduction in income tax liability...

  3. 26 CFR 1.934-1 - Limitation on reduction in income tax liability incurred to the Virgin Islands.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Islands will be computed as follows: (A) Add to the income tax liability incurred to the Virgin Islands...) Add to the income tax liability incurred to the Virgin Islands any credit against the tax allowed... 26 Internal Revenue 10 2011-04-01 2011-04-01 false Limitation on reduction in income tax liability...

  4. 26 CFR 1.934-1 - Limitation on reduction in income tax liability incurred to the Virgin Islands.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Islands will be computed as follows: (A) Add to the income tax liability incurred to the Virgin Islands...) Add to the income tax liability incurred to the Virgin Islands any credit against the tax allowed... 26 Internal Revenue 10 2012-04-01 2012-04-01 false Limitation on reduction in income tax liability...

  5. 26 CFR 1.934-1 - Limitation on reduction in income tax liability incurred to the Virgin Islands.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Islands will be computed as follows: (A) Add to the income tax liability incurred to the Virgin Islands...) Add to the income tax liability incurred to the Virgin Islands any credit against the tax allowed... 26 Internal Revenue 10 2014-04-01 2013-04-01 true Limitation on reduction in income tax liability...

  6. 29 CFR 4219.13 - Amount of liability for de minimis amounts.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 9 2011-07-01 2011-07-01 false Amount of liability for de minimis amounts. 4219.13 Section... Redetermination of Withdrawal Liability Upon Mass Withdrawal § 4219.13 Amount of liability for de minimis amounts. An employer that is liable for de minimis amounts shall be liable to the plan for the amount by which...

  7. 29 CFR 4219.13 - Amount of liability for de minimis amounts.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Amount of liability for de minimis amounts. 4219.13 Section... Redetermination of Withdrawal Liability Upon Mass Withdrawal § 4219.13 Amount of liability for de minimis amounts. An employer that is liable for de minimis amounts shall be liable to the plan for the amount by which...

  8. 14 CFR § 1266.104 - Cross-waiver of liability for launch agreements for science or space exploration activities...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... agreements for science or space exploration activities unrelated to the International Space Station. § 1266...-WAIVER OF LIABILITY § 1266.104 Cross-waiver of liability for launch agreements for science or space... implement a cross-waiver of liability between the parties to agreements for NASA's science or space...

  9. Perception of legal liability by registered nurses in Korea.

    PubMed

    Kim, Ki-Kyong; Kim, In-Sook; Lee, Won-Hee

    2007-08-01

    Liability to the nursing profession is imperative if nurses are to act as an autonomous body. Assessing and communicating effectively is a vital part of nursing for patient safety. This study was designed to identify the attitudes of Korean nurses toward liability in assessment and communication and to investigate the relationships among the variables (i.e., legal awareness, attitudes toward the doctor's duty to supervise nurses). The attitudes toward the doctor's duty reflect the status of nurses' dependency on the doctor's supervision. The study participants were 288 registered nurses in RN-BSN courses at two colleges in Korea. The level of legal awareness was measured using a 25-item Legal Awareness Questionnaire developed by the authors. The instrument measuring attitudes toward the doctor's duty to supervise nurses and nurses' liability was the Attitude toward Duty and Liability Questionnaire, which was modified by the authors. There was a significant correlation between attitude toward the doctor's duty and nurses' liability, but not between legal awareness and liability attitude. The results of this study suggest that the present educational content aimed at improving the liability attitudes of nurses should be refocused on attitude-oriented education and should include an understanding of the increased accountability that comes with greater autonomy in nursing practice.

  10. Malpractice Liability Risk and Use of Diagnostic Imaging Services: A Systematic Review of the Literature.

    PubMed

    Li, Suhui; Brantley, Erin

    2015-12-01

    A widespread concern among physicians is that fear of medical malpractice liability may affect their decisions for diagnostic imaging orders. The purpose of this article is to synthesize evidence regarding the defensive use of imaging services. A literature search was conducted using a number of databases. The review included peer-reviewed publications that studied the link between physician orders of imaging tests and malpractice liability pressure. We identified 13 peer-reviewed studies conducted in the United States. Five of the studies reported physician assessments of the role of defensive medicine in imaging-order decisions; five assessed the association between physicians' liability risk and imaging ordering, and three assessed the impact of liability risk on imaging ordering at the state level. Although the belief that medical liability risk could influence decisions is highly prevalent among physicians, findings are mixed regarding the impact of liability risk on imaging orders at both the state and physician level. Inconclusive evidence suggests that physician ordering of imaging tests is affected by malpractice liability risk. Further research is needed to disentangle defensive medicine from other reasons for inefficient use of imaging. Copyright © 2015 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  11. IQ and schizophrenia in a Swedish national sample: their causal relationship and the interaction of IQ with genetic risk.

    PubMed

    Kendler, Kenneth S; Ohlsson, Henrik; Sundquist, Jan; Sundquist, Kristina

    2015-03-01

    The authors sought to clarify the relationship between IQ and subsequent risk for schizophrenia. IQ was assessed at ages 18-20 in 1,204,983 Swedish males born between 1951 and 1975. Schizophrenia was assessed by hospital diagnosis through 2010. Cox proportional hazards models were used to investigate future risk for schizophrenia in individuals as a function of their IQ score, and then stratified models using pairs of relatives were used to adjust for familial clustering. Finally, regression models were used to examine the interaction between IQ and genetic liability on risk for schizophrenia. IQ had a monotonic relationship with schizophrenia risk across the IQ range, with a mean increase in risk of 3.8% per 1-point decrease in IQ; this association was stronger in the lower than the higher IQ range. Co-relative control analyses showed a similar association between IQ and schizophrenia in the general population and in cousin, half-sibling, and full-sibling pairs. A robust interaction was seen between genetic liability to schizophrenia and IQ in predicting schizophrenia risk. Genetic susceptibility for schizophrenia had a much stronger impact on risk of illness for those with low than with high intelligence. The IQ-genetic liability interaction arose largely from IQ differences between close relatives. IQ assessed in late adolescence is a robust risk factor for subsequent onset of schizophrenia. This association is not the result of a declining IQ associated with insidious onset. In this large, representative sample, we found no evidence for a link between genius and schizophrenia. Co-relative control analyses showed that the association between lower IQ and schizophrenia is not the result of shared familial risk factors and may be causal. The strongest effect was seen with IQ differences within families. High intelligence substantially attenuates the impact of genetic liability on the risk for schizophrenia.
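
    A minimal sketch, assuming the lifelines package is available, of a proportional hazards model with an IQ by genetic-liability interaction of the kind described above. The simulated cohort, effect sizes and follow-up scheme are illustrative assumptions, not the Swedish registry data.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(3)
        n = 5000
        iq = rng.normal(100.0, 15.0, n)
        liability = rng.normal(0.0, 1.0, n)                 # familial/genetic liability score
        # Simulated hazard: lower with IQ, higher with liability, liability matters more at low IQ.
        log_hazard = -0.03 * (iq - 100) + 0.8 * liability - 0.02 * (iq - 100) * liability
        onset = rng.exponential(np.exp(-log_hazard))        # latent onset times
        censor = rng.uniform(5.0, 15.0, n)

        df = pd.DataFrame({
            "iq": iq,
            "liability": liability,
            "iq_x_liability": (iq - 100) * liability,
            "duration": np.minimum(onset, censor),
            "event": (onset <= censor).astype(int),
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="duration", event_col="event")
        cph.print_summary(decimals=3)                       # interaction coefficient should be negative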

  12. Oxidative transformation of tunichromes - Model studies with 1,2-dehydro-N-acetyldopamine and N-acetylcysteine.

    PubMed

    Kuang, Qun F; Abebe, Adal; Evans, Jason; Sugumaran, Manickam

    2017-08-01

    Tunichromes are 1,2-dehydrodopa containing bioactive peptidyl derivatives found in blood cells of several tunicates. They have been implicated in metal sequestering, tunic formation, wound healing and defense reactions. Earlier studies conducted on these compounds indicate their extreme lability, high reactivity and easy oxidative polymerization. Their reactions are also complicated by the presence of multiple dehydrodopyl units. Since they have been invoked in crosslinking and covalent binding, to understand the reactivities of these novel compounds we have taken a simple model compound that possesses the tunichrome reactive group, viz. 1,2-dehydro-N-acetyldopamine (dehydro NADA), and examined its reaction with N-acetylcysteine in the presence of oxygen under both enzymatic and nonenzymatic conditions. Ultraviolet and visible spectral studies of reaction mixtures containing dehydro NADA and N-acetylcysteine in different molar ratios indicated the production of side chain and ring adducts of N-acetylcysteine to dehydro NADA. Liquid chromatography and mass spectral studies supported this contention and confirmed the production of several different products. Mass spectral analysis of these products shows the potential of dehydro NADA to form side chain adducts that can lead to polymeric products. This is the first report demonstrating the ability of dehydrodopyl units to form adducts and crosslinks with amino acid side chains. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Mechanisms underlying the lifetime co-occurrence of tobacco and cannabis use in adolescent and young adult twins

    PubMed Central

    Agrawal, Arpana; Silberg, Judy L.; Lynskey, Michael T.; Maes, Hermine H.; Eaves, Lindon J.

    2009-01-01

    Using twins assessed during adolescence (Virginia Twin Study of Adolescent Behavioral Development: 8–17 years) and followed up in early adulthood (Young Adult Follow-Up, 18–27 years), we tested 13 genetically informative models of co-occurrence, adapted for the inclusion of covariates. Models were fit, in Mx, to data at both assessments allowing for a comparison of the mechanisms that underlie the lifetime co-occurrence of cannabis and tobacco use in adolescence and early adulthood. Both cannabis and tobacco use were influenced by additive genetic (38–81%) and non-shared environmental factors with the possible role of non-shared environment in the adolescent assessment only. Causation models, where liability to use cannabis exerted a causal influence on the liability to use tobacco fit the adolescent data best, while the reverse causation model (tobacco causes cannabis) fit the early adult data best. Both causation models (cannabis to tobacco and tobacco to cannabis) and the correlated liabilities model fit data from the adolescent and young adult assessments well. Genetic correlations (0.59–0.74) were moderate. Therefore, the relationship between cannabis and tobacco use is fairly similar during adolescence and early adulthood with reciprocal influences across the two psychoactive substances. However, our study could not exclude the possibility that ‘gateways’ and ‘reverse gateways’, particularly within a genetic context, exist, such that predisposition to using one substance (cannabis or tobacco) modifies predisposition to using the other. Given the high addictive potential of nicotine and the ubiquitous nature of cannabis use, this is a public health concern worthy of considerable attention. PMID:20047801

  14. 7 CFR 1773.45 - Regulatory liabilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... § 1773.45 Regulatory liabilities. The CPA's workpapers must document whether all regulatory liabilities comply with the requirements of SFAS No. 71. For electric borrowers only, the CPA's workpapers must...

  15. 7 CFR 1773.45 - Regulatory liabilities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... § 1773.45 Regulatory liabilities. The CPA's workpapers must document whether all regulatory liabilities comply with the requirements of SFAS No. 71. For electric borrowers only, the CPA's workpapers must...

  16. 7 CFR 1773.45 - Regulatory liabilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... § 1773.45 Regulatory liabilities. The CPA's workpapers must document whether all regulatory liabilities comply with the requirements of SFAS No. 71. For electric borrowers only, the CPA's workpapers must...

  17. 7 CFR 1773.45 - Regulatory liabilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... § 1773.45 Regulatory liabilities. The CPA's workpapers must document whether all regulatory liabilities comply with the requirements of SFAS No. 71. For electric borrowers only, the CPA's workpapers must...

  18. 7 CFR 1773.45 - Regulatory liabilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... § 1773.45 Regulatory liabilities. The CPA's workpapers must document whether all regulatory liabilities comply with the requirements of SFAS No. 71. For electric borrowers only, the CPA's workpapers must...

  19. Co-pyrolysis kinetics of sewage sludge and bagasse using multiple normal distributed activation energy model (M-DAEM).

    PubMed

    Lin, Yan; Chen, Zhihao; Dai, Minquan; Fang, Shiwen; Liao, Yanfen; Yu, Zhaosheng; Ma, Xiaoqian

    2018-07-01

    In this study, the kinetic models of bagasse, sewage sludge and their mixture were established by the multiple normal distributed activation energy model. When blended with sewage sludge, the initial temperature declined from 437 K to 418 K. The pyrolytic species could be divided into five categories: analogous hemicelluloses I, hemicelluloses II, cellulose, lignin and bio-char. For these species, the average activation energies and their deviations fell within reasonable ranges of 166.4673-323.7261 kJ/mol and 0.1063-35.2973 kJ/mol, respectively, in agreement with the literature. The kinetic models matched the experimental data well, and the R2 values were greater than 99.999%. In the local sensitivity analysis, the distributed average activation energy had a stronger effect on robustness than the other kinetic parameters, and the content of each pyrolytic species determined which set of kinetic parameters was more important. Copyright © 2018 Elsevier Ltd. All rights reserved.
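
    A minimal sketch of a single normally distributed activation energy model of the kind combined in the multiple-DAEM above: the unreacted fraction is an integral, over a Gaussian distribution of activation energies, of first-order conversion kernels. The kinetic parameters are illustrative, not the fitted values reported for bagasse or sludge.

        import numpy as np
        from scipy.stats import norm

        R = 8.314                        # J/(mol K)
        A = 1.0e13                       # 1/s, pre-exponential factor (assumed)
        beta = 10.0 / 60.0               # K/s, heating rate of 10 K/min (assumed)
        E0, sigma_E = 200.0e3, 15.0e3    # J/mol, mean and spread of activation energy (assumed)

        T = np.linspace(400.0, 1000.0, 601)                       # temperature grid, K
        E = np.linspace(E0 - 5 * sigma_E, E0 + 5 * sigma_E, 201)  # activation energy grid
        dT, dE = T[1] - T[0], E[1] - E[0]

        # Cumulative temperature integral of exp(-E/RT) dT for every (E, T) pair.
        arrh = np.exp(-E[:, None] / (R * T[None, :]))
        inner = np.cumsum((arrh[:, 1:] + arrh[:, :-1]) * 0.5 * dT, axis=1)
        inner = np.hstack([np.zeros((E.size, 1)), inner])

        unreacted_given_E = np.exp(-(A / beta) * inner)           # first-order DAEM kernel
        weights = norm.pdf(E, E0, sigma_E)
        unreacted = (unreacted_given_E * weights[:, None]).sum(axis=0) * dE

        for Tq in (500, 600, 700, 800):
            i = int(np.argmin(np.abs(T - Tq)))
            print(f"T = {Tq} K: conversion = {1.0 - unreacted[i]:.3f}")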

  20. Quantitative genetic analysis of injury liability in infants and toddlers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  1. National Costs Of The Medical Liability System

    PubMed Central

    Mello, Michelle M.; Chandra, Amitabh; Gawande, Atul A.; Studdert, David M.

    2011-01-01

    Concerns about reducing the rate of growth of health expenditures have reignited interest in medical liability reforms and their potential to save money by reducing the practice of defensive medicine. It is not easy to estimate the costs of the medical liability system, however. This article identifies the various components of liability system costs, generates national estimates for each component, and discusses the level of evidence available to support the estimates. Overall annual medical liability system costs, including defensive medicine, are estimated to be $55.6 billion in 2008 dollars, or 2.4 percent of total health care spending. PMID:20820010

  2. 75 FR 76946 - Demurrage Liability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-10

    ... because the warehouseman--which otherwise has no incentive to agree to liability--can avoid liability... based on an unjust enrichment theory? The court rejected such an approach in Middle Atlantic, 353 F...

  3. Dynamic imaging of adaptive stress response pathway activation for prediction of drug induced liver injury.

    PubMed

    Wink, Steven; Hiemstra, Steven W; Huppelschoten, Suzanne; Klip, Janna E; van de Water, Bob

    2018-05-01

    Drug-induced liver injury (DILI) remains a concern during drug treatment and development. There is an urgent need for improved mechanistic understanding and prediction of DILI liabilities using in vitro approaches. We have established and characterized a panel of liver cell models containing mechanism-based fluorescent protein toxicity pathway reporters to quantitatively assess the dynamics of cellular stress response pathway activation at the single cell level using automated live cell imaging. We have systematically evaluated the application of four key adaptive stress pathway reporters for the prediction of DILI liability: SRXN1-GFP (oxidative stress), CHOP-GFP (ER stress/UPR response), p21 (p53-mediated DNA damage-related response) and ICAM1 (NF-κB-mediated inflammatory signaling). 118 FDA-labeled drugs at five human exposure-relevant concentrations were evaluated for reporter activation using live cell confocal imaging. Quantitative data analysis revealed activation of single or multiple reporters by most drugs in a concentration and time dependent manner. Hierarchical clustering of time course dynamics and refined single cell analysis allowed key events in DILI liability to be identified. Concentration response modeling was performed to calculate benchmark concentrations (BMCs). Extracted temporal dynamic parameters and BMCs were used to assess the predictive power of sub-lethal adaptive stress pathway activation. Although cellular adaptive responses were activated by non-DILI and severe-DILI compounds alike, dynamic behavior and lower BMCs of pathway activation were sufficiently distinct between these compound classes. The high-level detailed temporal- and concentration-dependent evaluation of the dynamics of adaptive stress pathway activation adds to the overall understanding and prediction of drug-induced liver liabilities.

  4. Sensitivity of chloride efflux vs. transepithelial measurements in mixed CF and normal airway epithelial cell populations.

    PubMed

    Illek, Beate; Lei, Dachuan; Fischer, Horst; Gruenert, Dieter C

    2010-01-01

    While Cl(-) efflux assays are relatively straightforward, their ability to assess the efficacy of phenotypic correction in cystic fibrosis (CF) tissue or cells may be limited. Accurate assessment of therapeutic efficacy, i.e., correlating wild type CF transmembrane conductance regulator (CFTR) levels with phenotypic correction in tissue or individual cells, requires a sensitive assay. Radioactive chloride ((36)Cl) efflux was compared to Ussing chamber analysis for measuring cAMP-dependent Cl(-) transport in mixtures of human normal (16HBE14o-) and cystic fibrosis (CF) (CFTE29o- or CFBE41o-, respectively) airway epithelial cells. Cell mixtures with decreasing amounts of 16HBE14o- cells were evaluated. Efflux and Ussing chamber studies on mixed populations of normal and CF airway epithelial cells showed that, as the number of CF cells within the population was progressively increased, the cAMP-dependent Cl(-) transport decreased. The (36)Cl efflux assay was effective for measuring Cl(-) transport when ≥ 25% of the cells were normal. If < 25% of the cells were phenotypically wild-type (wt), the (36)Cl efflux assay was no longer reliable. Polarized CFBE41o- cells, also homozygous for the ΔF508 mutation, were used in the Ussing chamber studies. Ussing analysis detected cAMP-dependent Cl(-) currents in mixtures with ≥1% wild-type cells, indicating that Ussing analysis is more sensitive than (36)Cl efflux analysis for detection of functional CFTR. Assessment of CFTR function by Ussing analysis is more sensitive than (36)Cl efflux analysis. Ussing analysis indicates that cell mixtures containing 10% 16HBE14o- cells showed 40-50% of normal cAMP-dependent Cl(-) transport, which dropped off exponentially as the wild-type fraction decreased from 10% to 1%. Copyright © 2010 S. Karger AG, Basel.

  5. The non-trusty clown attack on model-based speaker recognition systems

    NASA Astrophysics Data System (ADS)

    Farrokh Baroughi, Alireza; Craver, Scott

    2015-03-01

    Biometric detectors for speaker identification commonly employ a statistical model of a subject's voice, such as a Gaussian mixture model, that combines multiple means to improve detector performance. This allows a malicious insider to amend or append a component of a subject's statistical model so that a detector behaves normally except under a carefully engineered circumstance. This allows an attacker to force a misclassification of his or her voice only when desired, by smuggling data into a database far in advance of an attack. Note that the attack is possible if the attacker has access to the database, even for a limited time, to modify the victim's model. We exhibit such an attack on a speaker identification system, in which an attacker can force a misclassification by speaking in an unusual voice, replacing the least-weighted component of the victim's model with the most-weighted component of the attacker's model of the unusual voice. The attacker makes his or her voice unusual during the attack because his or her normal voice model may already be in the database; by attacking with an unusual voice, the attacker retains the option to be recognized as himself or herself when talking normally, or as the victim when talking in the unusual manner. By attaching an appropriately weighted vector to a victim's model, we can impersonate all users in our simulations, while avoiding unwanted false rejections.
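
    A minimal sketch of the component-replacement idea described above, using random stand-ins for real cepstral features: fit Gaussian mixture voice models for a victim and an attacker, overwrite the victim's least-weighted component with the attacker's most-weighted one, and score test frames directly from the mixture parameters. Feature dimensions, component counts and data are illustrative assumptions.

        import numpy as np
        from scipy.special import logsumexp
        from scipy.stats import multivariate_normal
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(4)
        dim, n_frames, n_comp = 6, 2000, 8
        victim_feats = rng.normal(0.0, 1.0, (n_frames, dim))
        attacker_feats = rng.normal(3.0, 1.0, (n_frames, dim))   # the "unusual" voice

        victim = GaussianMixture(n_components=n_comp, covariance_type="diag",
                                 random_state=0).fit(victim_feats)
        attacker = GaussianMixture(n_components=n_comp, covariance_type="diag",
                                   random_state=0).fit(attacker_feats)

        # Tamper with the stored victim model: swap in the attacker's strongest component.
        weights, means, covs = victim.weights_.copy(), victim.means_.copy(), victim.covariances_.copy()
        i_out = int(np.argmin(weights))                   # least-weighted victim component
        i_in = int(np.argmax(attacker.weights_))          # most-weighted attacker component
        means[i_out], covs[i_out] = attacker.means_[i_in], attacker.covariances_[i_in]
        weights[i_out] = attacker.weights_[i_in]
        weights /= weights.sum()                          # renormalize the mixture weights

        def avg_loglik(x, w, m, c):
            """Average per-frame log-likelihood of a diagonal-covariance GMM."""
            comp = np.stack([multivariate_normal.logpdf(x, m[k], np.diag(c[k]))
                             for k in range(len(w))], axis=1)
            return float(np.mean(logsumexp(comp + np.log(w), axis=1)))

        test = rng.normal(3.0, 1.0, (200, dim))           # attacker speaking in the unusual voice
        print("score on clean victim model   :",
              round(avg_loglik(test, victim.weights_, victim.means_, victim.covariances_), 2))
        print("score on tampered victim model:", round(avg_loglik(test, weights, means, covs), 2))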

  6. Error reduction and representation in stages (ERRIS) in hydrological modelling for ensemble streamflow forecasting

    NASA Astrophysics Data System (ADS)

    Li, Ming; Wang, Q. J.; Bennett, James C.; Robertson, David E.

    2016-09-01

    This study develops a new error modelling method for ensemble short-term and real-time streamflow forecasting, called error reduction and representation in stages (ERRIS). The novelty of ERRIS is that it does not rely on a single complex error model but runs a sequence of simple error models through four stages. At each stage, an error model attempts to incrementally improve over the previous stage. Stage 1 establishes parameters of a hydrological model and parameters of a transformation function for data normalization, Stage 2 applies a bias correction, Stage 3 applies autoregressive (AR) updating, and Stage 4 applies a Gaussian mixture distribution to represent model residuals. In a case study, we apply ERRIS for one-step-ahead forecasting at a range of catchments. The forecasts at the end of Stage 4 are shown to be much more accurate than at Stage 1 and to be highly reliable in representing forecast uncertainty. Specifically, the forecasts become more accurate by applying the AR updating at Stage 3, and more reliable in uncertainty spread by using a mixture of two Gaussian distributions to represent the residuals at Stage 4. ERRIS can be applied to any existing calibrated hydrological models, including those calibrated to deterministic (e.g. least-squares) objectives.
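
    A minimal sketch of the staged idea (not the authors' full ERRIS implementation): a transformation, a bias correction, an AR(1) update using the previous observed error, and a two-component Gaussian mixture fitted to the remaining residuals. The log transformation, the AR(1) form and the simulated data are illustrative assumptions.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(5)
        obs = np.exp(rng.normal(2.0, 0.6, 400))               # "observed" streamflow (simulated)
        raw = obs * np.exp(rng.normal(0.2, 0.4, 400))         # biased, noisy model output (simulated)

        z_obs, z_raw = np.log(obs), np.log(raw)               # Stage 1: transformation (log here)

        bias = np.mean(z_raw - z_obs)                         # Stage 2: bias correction
        z_bc = z_raw - bias

        err = z_obs - z_bc                                    # Stage 3: AR(1) updating
        rho = np.corrcoef(err[:-1], err[1:])[0, 1]
        z_upd = z_bc[1:] + rho * err[:-1]                     # one-step-ahead updated forecast

        resid = (z_obs[1:] - z_upd).reshape(-1, 1)            # Stage 4: mixture of two Gaussians
        gm = GaussianMixture(n_components=2, random_state=0).fit(resid)

        print(f"AR(1) coefficient: {rho:.3f}")
        print("residual mixture means  :", gm.means_.ravel().round(3))
        print("residual mixture weights:", gm.weights_.round(3))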

  7. Contaminant source identification using semi-supervised machine learning

    NASA Astrophysics Data System (ADS)

    Vesselinov, Velimir V.; Alexandrov, Boian S.; O'Malley, Daniel

    2018-05-01

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical types. Numerous different geochemical constituents and processes may need to be simulated in these models which further complicates the analyses. In this paper, we propose a new contaminant source identification approach that performs decomposition of the observation mixtures based on Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the unknown number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. NMFk is tested on synthetic and real-world site data. The NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios).
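
    A minimal sketch of the non-negative matrix factorization step at the core of NMFk: decompose observed geochemical mixtures into non-negative source signatures and mixing contributions. The synthetic sources and mixing matrix are illustrative; the actual method adds a custom semi-supervised clustering step to choose the number of sources, which is not reproduced here.

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(6)
        n_wells, n_constituents, n_sources = 30, 8, 3
        true_sources = rng.uniform(0.1, 5.0, (n_sources, n_constituents))   # source signatures
        mixing = rng.dirichlet(np.ones(n_sources), size=n_wells)            # mixing ratios per well
        observed = mixing @ true_sources + rng.normal(0.0, 0.01, (n_wells, n_constituents))
        observed = np.clip(observed, 0.0, None)        # concentrations must stay non-negative

        model = NMF(n_components=n_sources, init="nndsvda", max_iter=2000, random_state=0)
        W = model.fit_transform(observed)              # estimated mixing contributions
        H = model.components_                          # estimated source signatures
        print(f"reconstruction error: {model.reconstruction_err_:.4f}")
        print("estimated source signatures:")
        print(H.round(2))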

  10. Estimating wetland vegetation abundance from Landsat-8 operational land imager imagery: a comparison between linear spectral mixture analysis and multinomial logit modeling methods

    NASA Astrophysics Data System (ADS)

    Zhang, Min; Gong, Zhaoning; Zhao, Wenji; Pu, Ruiliang; Liu, Ke

    2016-01-01

    Mapping vegetation abundance by using remote sensing data is an efficient means for detecting changes of an eco-environment. With Landsat-8 operational land imager (OLI) imagery acquired on July 31, 2013, both linear spectral mixture analysis (LSMA) and multinomial logit model (MNLM) methods were applied to estimate and assess the vegetation abundance in the Wild Duck Lake Wetland in Beijing, China. To improve the mapping of vegetation abundance and increase the number of endmembers in the spectral mixture analysis, the normalized difference vegetation index (NDVI) was extracted from the OLI imagery and used along with the seven reflective bands of the OLI data for estimating vegetation abundance. Five endmembers were selected: terrestrial plants, aquatic plants, bare soil, high albedo, and low albedo. The vegetation abundance mapping results from the Landsat OLI data were finally evaluated utilizing WorldView-2 multispectral imagery. Similar spatial patterns of vegetation abundance were produced by both the fully constrained LSMA algorithm and the MNLM method: higher vegetation abundance levels were distributed in agricultural and riparian areas and lower levels in urban/built-up areas. The experimental results also indicate that the MNLM model outperformed the LSMA algorithm, with a smaller root mean square error (0.0152 versus 0.0252) and a higher coefficient of determination (0.7856 versus 0.7214), as the MNLM model could handle the nonlinear reflection phenomenon in mixed pixels better than the LSMA.
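
    A minimal sketch of fully constrained linear spectral unmixing of the kind used for the LSMA result above: for each pixel, find endmember abundances that are non-negative and sum to one, here enforced softly by augmenting the system before a non-negative least-squares solve. The endmember spectra and the test pixel are made-up numbers, not OLI data.

        import numpy as np
        from scipy.optimize import nnls

        # Rows: 7 OLI reflective bands + NDVI; columns: terrestrial plants, aquatic plants,
        # bare soil, high albedo, low albedo.  Values are invented for illustration only.
        endmembers = np.array([
            [0.05, 0.06, 0.12, 0.30, 0.02],
            [0.07, 0.08, 0.15, 0.35, 0.03],
            [0.06, 0.07, 0.20, 0.40, 0.03],
            [0.30, 0.20, 0.25, 0.45, 0.04],
            [0.35, 0.25, 0.30, 0.50, 0.05],
            [0.20, 0.18, 0.35, 0.55, 0.05],
            [0.15, 0.14, 0.30, 0.50, 0.04],
            [0.70, 0.55, 0.10, 0.05, 0.05],
        ])

        def fclsu(pixel, E, weight=1.0e3):
            """Fully constrained unmixing: non-negative abundances with a soft sum-to-one constraint."""
            E_aug = np.vstack([E, weight * np.ones(E.shape[1])])
            p_aug = np.append(pixel, weight)
            abundances, _ = nnls(E_aug, p_aug)
            return abundances / abundances.sum()       # tidy up the soft constraint

        pixel = 0.6 * endmembers[:, 0] + 0.3 * endmembers[:, 2] + 0.1 * endmembers[:, 4]
        print("estimated abundances:", fclsu(pixel, endmembers).round(3))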

  11. Admixture analysis of age at onset in first episode bipolar disorder.

    PubMed

    Nowrouzi, Behdin; McIntyre, Roger S; MacQueen, Glenda; Kennedy, Sidney H; Kennedy, James L; Ravindran, Arun; Yatham, Lakshmi; De Luca, Vincenzo

    2016-09-01

    Many studies have used admixture analysis to separate age-at-onset (AAO) subgroups in bipolar disorder, but none of them examined first episode patients. The purpose of this study was to investigate the influence of clinical variables on AAO in first episode bipolar patients. Admixture analysis was applied to identify the model best fitting the observed AAO distribution in a sample of 194 patients with a DSM-IV diagnosis of bipolar disorder, and a finite mixture model was applied to assess the effect of clinical covariates on AAO. Using the BIC method, the model that best fitted the observed distribution of AAO was a mixture of three normal distributions. We identified three AAO groups: early age-at-onset (EAO) (µ=18.0, σ=2.88), intermediate age-at-onset (IAO) (µ=28.7, σ=3.5), and late age-at-onset (LAO) (µ=47.3, σ=7.8), comprising 69%, 22%, and 9% of the sample, respectively. The distribution model for our first episode sample was significantly different from those reported in most other studies that applied admixture analysis. The main limitation is that our sample may have inadequate statistical power to detect clinical associations with the AAO subgroups. This study confirms that bipolar disorder can be classified into three groups based on AAO distribution. The data reported in our paper provide more insight into the diagnostic heterogeneity of bipolar disorder across the three AAO subgroups. Copyright © 2016 Elsevier B.V. All rights reserved.
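
    A minimal sketch of the model-selection step described above: fit Gaussian mixtures with one to four components to age-at-onset values and choose the number of components by BIC. The simulated onset ages only roughly mimic the reported early/intermediate/late groups and are not the study's data.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(7)
        aao = np.concatenate([
            rng.normal(18.0, 2.9, 134),     # early onset (~69% of 194)
            rng.normal(28.7, 3.5, 43),      # intermediate onset (~22%)
            rng.normal(47.3, 7.8, 17),      # late onset (~9%)
        ]).reshape(-1, 1)

        fits = {k: GaussianMixture(n_components=k, random_state=0).fit(aao) for k in range(1, 5)}
        bics = {k: m.bic(aao) for k, m in fits.items()}
        best_k = min(bics, key=bics.get)
        best = fits[best_k]

        print("BIC by number of components:", {k: round(v, 1) for k, v in bics.items()})
        print(f"best k = {best_k}; means = {best.means_.ravel().round(1)}; "
              f"weights = {best.weights_.round(2)}")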

  12. Association of Liability Concerns with Decisions to Order Echocardiography and Cardiac Stress Tests with Imaging.

    PubMed

    Kini, Vinay; Weiner, Rory B; McCarthy, Fenton H; Wiegers, Susan E; Kirkpatrick, James N

    2016-12-01

    Professional societies have made efforts to curb overuse of cardiac imaging and decrease practice variation by publishing appropriate use criteria. However, little is known about the impact of physician-level determinants such as liability concerns and risk aversion on decisions to order testing. A web-based survey was administered to cardiologists and general practice physicians affiliated with two academic institutions. The survey consisted of four clinical scenarios in which appropriate use criteria rated echocardiography or stress testing as "may be appropriate." Respondents' degree of liability concerns and risk aversion were measured using validated tools. The primary outcome variable was tendency to order imaging, calculated as the average likelihood to order an imaging test across the clinical scenarios (1 = very unlikely, 6 = very likely). Linear regression models were used to evaluate the association between tendency to order imaging and physician characteristics. From 420 physicians invited to participate, 108 complete responses were obtained (26% response rate, 54% cardiologists). There was no difference in tendency to order imaging between cardiologists and general practice physicians (3.46 [95% CI, 3.12-3.81] vs 3.15 [95% CI, 2.79-3.51], P = .22). On multivariate analysis, a higher degree of liability concerns was the only significant predictor of decisions to order imaging (mean difference in tendency to order imaging, 0.36; 95% CI, 0.09-0.62; P = .01). In clinical situations in which performance of cardiac imaging is rated as "may be appropriate" by appropriate use criteria, physicians with higher liability concerns ordered significantly more testing than physicians with lower concerns. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.

  13. Lung cancer treatment costs, including patient responsibility, by disease stage and treatment modality, 1992 to 2003.

    PubMed

    Cipriano, Lauren E; Romanus, Dorothy; Earle, Craig C; Neville, Bridget A; Halpern, Elkan F; Gazelle, G Scott; McMahon, Pamela M

    2011-01-01

    The objective of this analysis was to estimate costs for lung cancer care and evaluate trends in the share of treatment costs that are the responsibility of Medicare beneficiaries. The Surveillance, Epidemiology, and End Results (SEER)-Medicare data from 1991-2003 for 60,231 patients with lung cancer were used to estimate monthly and patient-liability costs for clinical phases of lung cancer (prediagnosis, staging, initial, continuing, and terminal), stratified by treatment, stage, and non-small- versus small-cell lung cancer. Lung cancer-attributable costs were estimated by subtracting each patient's own prediagnosis costs. Costs were estimated as the sum of Medicare reimbursements (payments from Medicare to the service provider), co-insurance reimbursements, and patient-liability costs (deductibles and "co-payments" that are the patient's responsibility). Costs and patient-liability costs were fit with regression models to compare trends by calendar year, adjusting for age at diagnosis. The monthly treatment costs for a 72-year-old patient, diagnosed with lung cancer in 2000, in the first 6 months ranged from $2687 (no active treatment) to $9360 (chemo-radiotherapy); costs varied by stage at diagnosis and histologic type. Patient liability represented up to 21.6% of care costs and increased over the period 1992-2003 for most stage and treatment categories, even when care costs decreased or remained unchanged. The greatest monthly patient liability was incurred by chemo-radiotherapy patients, which ranged from $1617 to $2004 per month across cancer stages. Costs for lung cancer care are substantial, and Medicare is paying a smaller proportion of the total cost over time. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  14. Heritable Variation, With Little or No Maternal Effect, Accounts for Recurrence Risk to Autism Spectrum Disorder in Sweden.

    PubMed

    Yip, Benjamin Hon Kei; Bai, Dan; Mahjani, Behrang; Klei, Lambertus; Pawitan, Yudi; Hultman, Christina M; Grice, Dorothy E; Roeder, Kathryn; Buxbaum, Joseph D; Devlin, Bernie; Reichenberg, Abraham; Sandin, Sven

    2018-04-01

    Autism spectrum disorder (ASD) has both genetic and environmental origins, including potentially maternal effects. Maternal effects describe the association of one or more maternal phenotypes with liability to ASD in progeny that are independent of maternally transmitted risk alleles. While maternal effects could play an important role, consistent with association to maternal traits such as immune status, no study has estimated maternal, additive genetic, and environmental effects in ASD. Using a population-based sample consisting of all children born in Sweden from 1998 to 2007 and their relatives, we fitted statistical models to family data to estimate the variance in ASD liability originating from maternal, additive genetic, and shared environmental effects. We calculated sibling and cousin family recurrence risk ratios as a direct measure of familial, genetic, and environmental risk factors and repeated the calculations on diagnostic subgroups, specifically autistic disorder (AD) and spectrum disorder (SD), which included Asperger's syndrome and/or pervasive developmental disorder not otherwise specified. The sample consisted of 776,212 children of whom 11,231 had a diagnosis of ASD: 4554 with AD, 6677 with SD. We found support for a large additive genetic contribution to liability; heritability was estimated at 84.8% (95% confidence interval [CI]: 73.1-87.3) for ASD, 79.6% (95% CI: 61.2-85.1) for AD, and 76.4% (95% CI: 63.0-82.5) for SD. There was modest, if any, contribution of maternal effects to liability for ASD, including subtypes AD and SD, and there was no support for shared environmental effects. These results show liability to ASD arises largely from additive genetic variation. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  15. Familial liability to psychosis is a risk factor for multimorbidity in people with psychotic disorders and their unaffected siblings.

    PubMed

    Islam, M A; Khan, M F H; Quee, P J; Snieder, H; van den Heuvel, E R; Bruggeman, R; Alizadeh, B Z

    2017-09-01

    Multimorbidity may impose an overwhelming burden on patients with psychosis and is affected by gender and age. Our aim is to study the independent role of familial liability to psychosis as a risk factor for multimorbidity. We performed the study within the framework of the Genetic Risk and Outcome of Psychosis (GROUP) project. Overall, we compared 1024 psychotic patients, 994 unaffected siblings and 566 controls on the prevalence of 125 lifetime diseases and 19 self-reported somatic complaints. Multimorbidity was defined as the presence of two or more complaints/diseases in the same individual. Generalized linear mixed models (GLMMs) were used to investigate the effects of gender, age (adolescent, young, older) and familial liability (patients, siblings, controls) and their interactions on multimorbidity. Familial liability had a significant effect on multimorbidity of either complaints or diseases. Patients had a higher prevalence of multimorbidity of complaints compared to siblings (OR 2.20, 95% CI 1.79-2.69, P<0.001) and to controls (3.05, 2.35-3.96, P<0.001). In physical health multimorbidity, patients (OR 1.36, 95% CI 1.05-1.75, P=0.018), but not siblings, had a significantly higher prevalence than controls. Similar findings were observed for multimorbidity of lifetime diseases, including psychiatric diseases. Significant results were observed for complaint and disease multimorbidity across gender and age groups. Multimorbidity is a common burden, significantly more prevalent in patients and their unaffected siblings. Familial liability to psychosis showed an independent effect on multimorbidity; gender and age are also important factors determining multimorbidity. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  16. Impact of co-administration of oxycodone and smoked cannabis on analgesia and abuse liability.

    PubMed

    Cooper, Ziva D; Bedi, Gillinder; Ramesh, Divya; Balter, Rebecca; Comer, Sandra D; Haney, Margaret

    2018-02-05

    Cannabinoids combined with opioids produce synergistic antinociceptive effects, decreasing the lowest effective antinociceptive opioid dose (i.e., opioid-sparing effects) in laboratory animals. Although pain patients report greater analgesia when cannabis is used with opioids, no placebo-controlled studies have assessed the direct effects of opioids combined with cannabis in humans or the impact of the combination on abuse liability. This double-blind, placebo-controlled, within-subject study determined if cannabis enhances the analgesic effects of low dose oxycodone using a validated experimental model of pain and its effects on abuse liability. Healthy cannabis smokers (N = 18) were administered oxycodone (0, 2.5, and 5.0 mg, PO) with smoked cannabis (0.0, 5.6% Δ 9 tetrahydrocannabinol [THC]) and analgesia was assessed using the Cold-Pressor Test (CPT). Participants immersed their hand in cold water (4 °C); times to report pain (pain threshold) and withdraw the hand from the water (pain tolerance) were recorded. Abuse-related effects were measured and effects of oxycodone on cannabis self-administration were determined. Alone, 5.0 mg oxycodone increased pain threshold and tolerance (p ≤ 0.05). Although active cannabis and 2.5 mg oxycodone alone failed to elicit analgesia, combined they increased pain threshold and tolerance (p ≤ 0.05). Oxycodone did not increase subjective ratings associated with cannabis abuse, nor did it increase cannabis self-administration. However, the combination of 2.5 mg oxycodone and active cannabis produced small, yet significant, increases in oxycodone abuse liability (p ≤ 0.05). Cannabis enhances the analgesic effects of sub-threshold oxycodone, suggesting synergy, without increases in cannabis's abuse liability. These findings support future research into the therapeutic use of opioid-cannabinoid combinations for pain.

  17. Psychometric modeling of abuse and dependence symptoms across six illicit substances indicates novel dimensions of misuse

    PubMed Central

    Clark, Shaunna L.; Gillespie, Nathan A.; Adkins, Daniel E.; Kendler, Kenneth S.; Neale, Michael C.

    2015-01-01

    Aims This study explored the factor structure of DSM III-R/IV symptoms for substance abuse and dependence across six illicit substance categories in a population-based sample of males. Method DSM III-R/IV drug abuse and dependence symptoms for cannabis, sedatives, stimulants, cocaine, opioids and hallucinogens from 4179 males born 1940-1970 from the population-based Virginia Adult Twin Study of Psychiatric and Substance Use Disorders were analyzed. Confirmatory factor analyses tested specific hypotheses regarding the latent structure of substance misuse for a comprehensive battery of 13 misuse symptoms measured across six illicit substance categories (78 items). Results Among the models fit, the latent structure of substance misuse was best represented by a combination of substance-specific factors and misuse symptom-specific factors. We found no support for a general liability factor to illicit substance misuse. Conclusions Results indicate that liability to misuse illicit substances is drug class specific, with little evidence for a general liability factor. Additionally, unique dimensions capturing propensity toward specific misuse symptoms (e.g., tolerance, withdrawal) across substances were identified. While this finding requires independent replication, the possibility of symptom-specific misuse factors, present in multiple substances, raises the prospect of genetic, neurobiological and behavioral predispositions toward distinct, narrowly defined features of drug abuse and dependence. PMID:26517709

  18. What Is Your Aquatics Liability IQ?

    ERIC Educational Resources Information Center

    Johnson, Ralph L.

    1984-01-01

    The author presents three court case studies and questions related to the cases, so that aquatic facility owners can test their liability perception. Recommendations are made in seven areas as defenses against aquatic liability. (JMK)

  19. Policy statement—Professional liability insurance and medicolegal education for pediatric residents and fellows.

    PubMed

    Gonzalez, Jose Luis

    2011-09-01

    The American Academy of Pediatrics believes that pediatric residents and fellows should be fully informed of the scope and limitations of their professional liability insurance coverage while in training. The academy states that residents and fellows should be educated by their training institutions on matters relating to medical liability and the importance of maintaining adequate and continuous professional liability insurance coverage throughout their careers in medicine.

  20. Genetic evaluation of mastitis liability and recovery through longitudinal analysis of transition probabilities

    PubMed Central

    2012-01-01

    Background: Many methods for the genetic analysis of mastitis use a cross-sectional approach, which omits information on, e.g., repeated mastitis cases during lactation, somatic cell count fluctuations, and recovery process. Acknowledging the dynamic behavior of mastitis during lactation and taking into account that there is more than one binary response variable to consider can enhance the genetic evaluation of mastitis. Methods: Genetic evaluation of mastitis was carried out by modeling the dynamic nature of somatic cell count (SCC) within the lactation. The SCC patterns were captured by modeling transition probabilities between assumed states of mastitis and non-mastitis. A widely dispersed SCC pattern generates high transition probabilities between states and vice versa. This method can model transitions to and from states of infection simultaneously, i.e. both the mastitis liability and the recovery process are considered. A multilevel discrete time survival model was applied to estimate breeding values on simulated data with different dataset sizes, mastitis frequencies, and genetic correlations. Results: Correlations between estimated and simulated breeding values showed that the estimated accuracies for mastitis liability were similar to those from previously tested methods that used data of confirmed mastitis cases, while our results were based on SCC as an indicator of mastitis. In addition, unlike the other methods, our method also generates breeding values for the recovery process. Conclusions: The developed method provides an effective tool for the genetic evaluation of mastitis when considering the whole disease course and will contribute to improving the genetic evaluation of udder health. PMID:22475575
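    The transition-probability idea described in this record can be illustrated with a small sketch: binarize an SCC series at a mastitis threshold and count transitions between the healthy and mastitis states. This is only a toy illustration of the liability (healthy to mastitis) and recovery (mastitis to healthy) transitions, not the multilevel discrete-time survival model used by the authors; the threshold and the data below are hypothetical.

    ```python
    import numpy as np

    def transition_probabilities(scc, threshold=200_000):
        """Estimate healthy->mastitis (liability) and mastitis->healthy (recovery)
        transition probabilities from a somatic cell count series (cells/mL)."""
        state = (np.asarray(scc) >= threshold).astype(int)  # 1 = assumed mastitis state
        prev, nxt = state[:-1], state[1:]
        n_healthy = np.sum(prev == 0)
        n_mastitis = np.sum(prev == 1)
        p_liability = np.sum((prev == 0) & (nxt == 1)) / max(n_healthy, 1)
        p_recovery = np.sum((prev == 1) & (nxt == 0)) / max(n_mastitis, 1)
        return p_liability, p_recovery

    # Hypothetical within-lactation SCC series for one cow
    scc_series = [90e3, 120e3, 450e3, 600e3, 180e3, 110e3, 95e3, 300e3, 140e3]
    print(transition_probabilities(scc_series))
    ```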

  1. Perspective on a Modified Developmental and Reproductive Toxicity Testing Strategy for Cancer Immunotherapy.

    PubMed

    Prell, Rodney A; Halpern, Wendy G; Rao, Gautham K

    2016-05-01

    The intent of cancer immunotherapy (CIT) is to generate and enhance T-cell responses against tumors. The tumor microenvironment establishes several inhibitory pathways that lead to suppression of the local immune response, which is permissive for tumor growth. The efficacy of different CITs, alone and in combination, stems from reinvigorating the tumor immune response via several mechanisms, including costimulatory agonists, checkpoint inhibitors, and vaccines. However, immune responses to other antigens (self and foreign) may also be enhanced, resulting in potentially undesired effects. In outbred mammalian pregnancies, the fetus expresses paternally derived alloantigens that are recognized as foreign by the maternal immune system. If unchecked or enhanced, maternal immunity to these alloantigens represents a developmental and reproductive risk and thus is a general liability for cancer immunotherapeutic molecules. We propose a tiered approach to confirm this mechanistic reproductive liability for CIT molecules. A rodent allopregnancy model is based on breeding 2 different strains of mice so that paternally derived alloantigens are expressed by the fetus. When tested with a cross-reactive biotherapeutic, small molecule drug, or surrogate molecule, this model should reveal on-target reproductive liabilities if the pathway is involved in maintaining pregnancy. Alternatively, allopregnancy models with genetically modified mice can be interrogated for exquisitely specific biotherapeutics with restricted species reactivity. The allopregnancy model represents a relatively straightforward approach to confirm an expected on-target reproductive risk for CIT molecules. For biotherapeutics, it could potentially replace more complex developmental and reproductive toxicity testing in nonhuman primates when a pregnancy hazard is confirmed or expected. © The Author(s) 2016.

  2. 10 CFR 611.101 - Application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., including vehicle simulations using industry standard model (need to add name and location of this open source model) to show projected fuel economy; (d) A detailed estimate of the total project costs together..., equity, and debt, and the liability of parties associated with the project; (f) Applicant's business plan...

  3. Principles of laboratory assessment of drug abuse liability and implications for clinical development

    PubMed Central

    Carter, Lawrence P.; Griffiths, Roland R.

    2009-01-01

    Abuse liability testing plays an important role in informing drug development, regulatory processes, and clinical practice. This paper describes the current “gold standard” methodologies that are used for laboratory assessments of abuse liability in non-human and human subjects. Particular emphasis is given to procedures such as non-human drug discrimination, self-administration, and physical dependence testing, and human dose effect abuse liability studies that are commonly used in regulatory submissions to governmental agencies. The potential benefits and risks associated with the inclusion of measures of abuse liability in industry-sponsored clinical trials is discussed. Lastly, it is noted that many factors contribute to patterns of drug abuse and dependence outside of the laboratory setting and positive or negative signals in abuse liability studies do not always translate to high or low levels of actual abuse or dependence. Well-designed patient and physician education, pharmacovigilance, and postmarketing surveillance can reduce the diversion and misuse of drugs with abuse liability and can effectively foster the protection and promotion of public health. PMID:19443137

  4. Subjective effects of slow-release bupropion versus caffeine as determined in a quasi-naturalistic setting.

    PubMed

    Zernig, Gerald; De Wit, Harriet; Telser, Stefan; Nienhusmeier, Matthias; Wakonigg, Gudrun; Sturm, Katja; Berger, Iris; Kemmler, Georg; Saria, Alois

    2004-04-01

    Bupropion (BUP), which in its slow-release formulation (Zyban) is used as a smoking-cessation drug, increases dopamine overflow in the nucleus accumbens and serves as a reinforcer in animal experiments, both suggesting that BUP may possess some abuse liability. The present study examined if BUP produced subjective effects indicative of abuse liability in a quasi-naturalistic setting, with caffeine (CAF) serving as a positive control. In a randomized double-blind crossover design, male smokers (n = 50) ingested two doses (interdosing interval, 6 h) of placebo (PLC), 178 mg CAF, or 150 mg slow-release BUP in their normal mid-week work environment. They completed questionnaires administered by telephone at regular intervals. CAF significantly increased ratings of 'pleasant effects' (p = 0.008) and 'high' (p = 0.03), whereas BUP produced a 'high' of only very moderate size (p = 0.02). In 3 subjects each, BUP or CAF produced ratings of 'pleasant effects' that were >9-fold higher than those for PLC. Finally, BUP increased the number of cigarettes smoked by 29% (i.e., from 24 to 31 per day; p = 0.004) only in those subjects who were unable to report any effect of either BUP or CAF. CAF had no effect on cigarette consumption. These findings suggest that BUP, like CAF, might be of some abuse liability in a small subgroup of smokers (i.e., 6% each of the present sample), and it may transiently increase, rather than decrease, smoking during early phases of treatment in continuing smokers. Copyright 2004 S. Karger AG, Basel

  5. Comparing Factor, Class, and Mixture Models of Cannabis Initiation and DSM Cannabis Use Disorder Criteria, Including Craving, in the Brisbane Longitudinal Twin Study

    PubMed Central

    Kubarych, Thomas S.; Kendler, Kenneth S.; Aggen, Steven H.; Estabrook, Ryne; Edwards, Alexis C.; Clark, Shaunna L.; Martin, Nicholas G.; Hickie, Ian B.; Neale, Michael C.; Gillespie, Nathan A.

    2014-01-01

    Accumulating evidence suggests that the Diagnostic and Statistical Manual of Mental Disorders (DSM) diagnostic criteria for cannabis abuse and dependence are best represented by a single underlying factor. However, it remains possible that models with additional factors, or latent class models or hybrid models, may better explain the data. Using structured interviews, 626 adult male and female twins provided complete data on symptoms of cannabis abuse and dependence, plus a craving criterion. We compared latent factor analysis, latent class analysis, and factor mixture modeling using normal theory marginal maximum likelihood for ordinal data. Our aim was to derive a parsimonious, best-fitting cannabis use disorder (CUD) phenotype based on DSM-IV criteria and determine whether DSM-5 craving loads onto a general factor. When compared with latent class and mixture models, factor models provided a better fit to the data. When conditioned on initiation and cannabis use, the association between criteria for abuse, dependence, withdrawal, and craving were best explained by two correlated latent factors for males and females: a general risk factor to CUD and a factor capturing the symptoms of social and occupational impairment as a consequence of frequent use. Secondary analyses revealed a modest increase in the prevalence of DSM-5 CUD compared with DSM-IV cannabis abuse or dependence. It is concluded that, in addition to a general factor with loadings on cannabis use and symptoms of abuse, dependence, withdrawal, and craving, a second clinically relevant factor defined by features of social and occupational impairment was also found for frequent cannabis use. PMID:24588857

  6. Beyond the standard of care: a new model to judge medical negligence.

    PubMed

    Brenner, Lawrence H; Brenner, Alison Tytell; Awerbuch, Eric J; Horwitz, Daniel

    2012-05-01

    The term "standard of care" has been used in law and medicine to determine whether medical care is negligent. However, the precise meaning of this concept is often unclear for both medical and legal professionals. Our purposes are to (1) examine the limitations of using standard of care as a measure of negligence, (2) propose the use of the legal concepts of justification and excuse in developing a new model of examining medical conduct, and (3) outline the framework of this model. We applied the principles of tort liability set forth in the clinical and legal literature to describe the difficulty in applying standard of care in medical negligence cases. Using the concepts of justification and excuse, we propose a judicial model that may promote fair and just jury verdicts in medical negligence cases. Contrary to conventional understanding, medical negligence is not simply nonconformity to norms. Two additional concepts of legal liability, ie, justification and excuse, must also be considered to properly judge medical conduct. Medical conduct is justified when the benefits outweigh the risks; the law sanctions the conduct and encourages future conduct under similar circumstances. Excuse, on the other hand, relieves a doctor of legal liability under specific circumstances even though his/her conduct was not justified. Standard of care is an inaccurate measure of medical negligence because it is premised on the faulty notion of conformity to norms. An alternative judicial model to determine medical negligence would (1) eliminate standard of care in medical malpractice law, (2) reframe the court instruction to jurors, and (3) establish an ongoing consensus committee on orthopaedic principles of negligence.

  7. Securitization product design for China's environmental pollution liability insurance.

    PubMed

    Pu, Chengyi; Addai, Bismark; Pan, Xiaojun; Bo, Pangtuo

    2017-02-01

    The catastrophic environmental accidents in China over the last three decades have triggered implementation of myriad policies by the government to help abate environmental pollution in the country. Consequently, research into environmental pollution liability insurance and how it can stimulate economic growth and the development of the financial market in China is worthwhile. This study attempts to design a financial derivative for China's environmental pollution liability insurance to offer strong financial support for significant compensation towards potential catastrophic environmental loss exposures, especially losses from the chemical industry. Assuming a risk-free interest rate of 4%, a market portfolio expected return of 12%, and a financial asset beta coefficient of 0.5, and using the capital asset pricing model (CAPM) and cash flow analysis, the principal-risk bond yields 9.4%, with single-period and two-period prices of 103.85 and 111.58, respectively; the principal partial-risk bond yields 10.09%, with single-period and two-period prices of 103.85 and 111.58, respectively; and the principal risk-free bond yields 8.94%, with single-period and two-period prices of 107.99 and 115.83, respectively. This loss exposure transfer framework transfers the catastrophic risks of environmental pollution from the traditional insurance and reinsurance markets to the capital market. This strengthens the underwriting capacity of environmental pollution liability insurance companies, mitigates the compensation risks of insurers and reinsurers, and provides a new channel to transfer the risks of environmental pollution.
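    The CAPM input assumptions quoted above can be sketched numerically. The snippet below computes the CAPM required return (4% risk-free rate, 12% market return, beta 0.5) and discounts a hypothetical coupon stream over one and two periods; the coupon rate of 9.4% and face value of 100 are illustrative choices, and the figures will not reproduce the reported bond prices because the paper's catastrophe-trigger cash-flow structure is not fully specified in the abstract.

    ```python
    def capm_required_return(risk_free, market_return, beta):
        """CAPM: E[r] = r_f + beta * (E[r_m] - r_f)."""
        return risk_free + beta * (market_return - risk_free)

    def bond_price(face, coupon_rate, discount_rate, periods):
        """Present value of annual coupons plus the face value repaid at maturity."""
        coupon = face * coupon_rate
        pv_coupons = sum(coupon / (1 + discount_rate) ** t for t in range(1, periods + 1))
        pv_face = face / (1 + discount_rate) ** periods
        return pv_coupons + pv_face

    r = capm_required_return(0.04, 0.12, 0.5)   # 0.08 under the abstract's assumptions
    print(r)
    print(bond_price(face=100, coupon_rate=0.094, discount_rate=r, periods=1))
    print(bond_price(face=100, coupon_rate=0.094, discount_rate=r, periods=2))
    ```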

  8. The evaluation of the abuse liability of drugs.

    PubMed

    Johanson, C E

    1990-01-01

    In order to place appropriate restrictions upon the availability of certain therapeutic agents to limit their abuse, it is important to assess abuse liability, an important aspect of drug safety evaluation. However, the negative consequences of restriction must also be considered. Drugs most likely to be tested are psychoactive compounds with therapeutic indications similar to known drugs of abuse. Methods include assays of pharmacological profile, drug discrimination procedures, self-administration procedures, and measures of drug-induced toxicity including evaluations of tolerance and physical dependence. Furthermore, the evaluation of toxicity using behavioural end-points is an important component of the assessment, and it is generally believed that the most valid procedure in this evaluation is the measurement of drug self-administration. However, even this method rarely predicts the extent of abuse of a specific drug. Although methods are available which appear to measure relative abuse liability, these procedures are not validated for all drug classes. Thus, additional strategies, including abuse liability studies in humans, modelled after those used with animals, must be used in order to make a more informed prediction. Although there is pressure to place restrictions on new drugs at the time of marketing, in light of the difficulty of predicting relative abuse potential, a better strategy might be to market a drug without restrictions, but require postmarketing surveillance in order to obtain more accurate information on which to base a final decision.

  9. 40 CFR 85.535 - Liability, recordkeeping, and end of year reporting.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., if we learn that your actions fall short of full compliance with applicable requirements we may... calendar year intermediate age conversions, outside useful life conversions, and the same conversion model...

  10. Liability for Off-Campus Injuries.

    ERIC Educational Resources Information Center

    Zirkel, Perry A.; Gluckman, Ivan B.

    1984-01-01

    Liability in cases involving students injured off school property generally hinges on whether districts fail to exercise due care in supervising students while on school premises. Typical activities that may occasion liability for negligence and possible defenses are listed. (MJL)

  11. 48 CFR 1852.228-80 - Insurance-Immunity From Tort Liability.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., Insurance—Liability to Third Persons, and the associated NFS clause 1852.228-81, Insurance—Partial Immunity... clause at NFS 1852.228-82 Insurance—Total Immunity From Tort Liability, will be included in the contract...

  12. 48 CFR 1852.228-80 - Insurance-Immunity From Tort Liability.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., Insurance—Liability to Third Persons, and the associated NFS clause 1852.228-81, Insurance—Partial Immunity... clause at NFS 1852.228-82 Insurance—Total Immunity From Tort Liability, will be included in the contract...

  13. 48 CFR 1852.228-80 - Insurance-Immunity From Tort Liability.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., Insurance—Liability to Third Persons, and the associated NFS clause 1852.228-81, Insurance—Partial Immunity... clause at NFS 1852.228-82 Insurance—Total Immunity From Tort Liability, will be included in the contract...

  14. 48 CFR 1852.228-80 - Insurance-Immunity From Tort Liability.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., Insurance—Liability to Third Persons, and the associated NFS clause 1852.228-81, Insurance—Partial Immunity... clause at NFS 1852.228-82 Insurance—Total Immunity From Tort Liability, will be included in the contract...

  15. 48 CFR 1852.228-80 - Insurance-Immunity From Tort Liability.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., Insurance—Liability to Third Persons, and the associated NFS clause 1852.228-81, Insurance—Partial Immunity... clause at NFS 1852.228-82 Insurance—Total Immunity From Tort Liability, will be included in the contract...

  16. Liability and Insurance for Suborbital Flights

    NASA Astrophysics Data System (ADS)

    Masson-Zwaan, T.

    2012-01-01

    This paper analyzes and compares liability and liability insurance in the fields of aviation and spaceflight in order to propose solutions for a liability regime and insurance options for suborbital flights. Suborbital flights can be said to take place in the grey zone between air and space, between air law and space law, as well as between aviation insurance and space insurance. In terms of liability, the paper discusses air law and space law provisions in the fields of second and third party liability for damage to passengers and 'innocent bystanders' respectively, touching upon international treaties, national law and EU law, and on insurance to cover those risks. Although the insurance market is currently not ready to provide tailor-made products for operators of suborbital flights, it is expected to adapt rapidly once such flights will become reality. A hybrid approach will provide the best solution in the medium term.

  17. Effective chemotherapy of heterogeneous and drug-resistant early colon cancers by intermittent dose schedules: a computer simulation study.

    PubMed

    Axelrod, David E; Vedula, Sudeepti; Obaniyi, James

    2017-05-01

    The effectiveness of cancer chemotherapy is limited by intra-tumor heterogeneity, the emergence of spontaneous and induced drug-resistant mutant subclones, and the maximum dose to which normal tissues can be exposed without adverse side effects. The goal of this project was to determine if intermittent schedules of the maximum dose that allows colon crypt maintenance could overcome these limitations, specifically by eliminating mixtures of drug-resistant mutants from heterogeneous early colon adenomas while maintaining colon crypt function. A computer model of cell dynamics in human colon crypts was calibrated with measurements of human biopsy specimens. The model allowed simulation of continuous and intermittent dose schedules of a cytotoxic chemotherapeutic drug, as well as the drug's effect on the elimination of mutant cells and the maintenance of crypt function. Colon crypts can tolerate a tenfold greater intermittent dose than constant dose. This allows elimination of a mixture of relatively drug-sensitive and drug-resistant mutant subclones from heterogeneous colon crypts. Mutants can be eliminated whether they arise spontaneously or are induced by the cytotoxic drug. An intermittent dose, at the maximum that allows colon crypt maintenance, can be effective in eliminating a heterogeneous mixture of mutant subclones before they fill the crypt and form an adenoma.

  18. [Use of the Six Sigma methodology for the preparation of parenteral nutrition mixtures].

    PubMed

    Silgado Bernal, M F; Basto Benítez, I; Ramírez García, G

    2014-04-01

    To apply the tools of the Six Sigma methodology to statistical control of the preparation of parenteral nutrition mixtures at the critical checkpoint of specific density. Between August 2010 and September 2013, specific density analysis was performed on 100% of the samples, and the data were divided into two groups, adults and neonates. The percentage of acceptance, the trend graphs, and the sigma level were determined. A normality analysis was carried out using the Shapiro-Wilk test, and the total percentage of mixtures within the specification limits was calculated. The specific density data between August 2010 and September 2013 pass the normality test (W = 0.94) and show improvement in sigma level over time, reaching 6/6 in adults and 3.8/6 in neonates. 100% of the mixtures comply with the specification limits for adults and neonates, remaining within the control limits throughout the process. The improvement plans, together with the Six Sigma methodology, allow the process to be controlled and ensure agreement between the medical prescription and the content of the mixture. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
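    As a rough illustration of the statistical control described here, the sketch below checks normality of specific-density measurements with the Shapiro-Wilk test and converts the fraction of mixtures within specification limits into a sigma level using the conventional 1.5-sigma long-term shift. The specification limits and the data are hypothetical; the paper does not report them.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    density = rng.normal(1.030, 0.002, size=200)   # hypothetical specific-density measurements
    lsl, usl = 1.024, 1.036                        # hypothetical specification limits

    w, p = stats.shapiro(density)                  # Shapiro-Wilk normality test
    within = np.mean((density >= lsl) & (density <= usl))
    # Sigma level under the common 1.5-sigma long-term shift convention
    sigma_level = stats.norm.ppf(within) + 1.5 if within < 1 else float("inf")

    print(f"W = {w:.2f}, p = {p:.3f}, within spec = {within:.3%}, sigma level ~ {sigma_level:.1f}")
    ```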

  19. Risky Business - Questions To Ask Your Liability Insurance Agent and Attorney.

    ERIC Educational Resources Information Center

    Strickland, James

    2000-01-01

    Discusses the unique vulnerabilities of the child care industry related to liability insurance. Presents questions for beginning liability- and coverage-related dialogue between the caregiver or center director and the attorney and insurance agent. (KB)

  20. Maximum Likelihood and Minimum Distance Applied to Univariate Mixture Distributions.

    ERIC Educational Resources Information Center

    Wang, Yuh-Yin Wu; Schafer, William D.

    This Monte-Carlo study compared modified Newton (NW), expectation-maximization algorithm (EM), and minimum Cramer-von Mises distance (MD), used to estimate parameters of univariate mixtures of two components. Data sets were fixed at size 160 and manipulated by mean separation, variance ratio, component proportion, and non-normality. Results…
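    For readers unfamiliar with the EM approach compared in this study, a minimal EM fit of a two-component univariate normal mixture looks like the sketch below (simulated data with n = 160 as in the study; this is not the authors' full simulation design, which also varied mean separation, variance ratio, component proportion, and non-normality).

    ```python
    import numpy as np

    def em_two_normal(x, n_iter=200):
        """EM for a two-component univariate Gaussian mixture."""
        x = np.asarray(x, float)
        # Crude initialization from the data quantiles
        pi, mu1, mu2 = 0.5, np.quantile(x, 0.25), np.quantile(x, 0.75)
        s1 = s2 = x.std()
        for _ in range(n_iter):
            # E-step: responsibilities of component 1 (constants cancel in the ratio)
            d1 = pi * np.exp(-0.5 * ((x - mu1) / s1) ** 2) / s1
            d2 = (1 - pi) * np.exp(-0.5 * ((x - mu2) / s2) ** 2) / s2
            r = d1 / (d1 + d2)
            # M-step: weighted parameter updates
            pi = r.mean()
            mu1 = np.average(x, weights=r)
            mu2 = np.average(x, weights=1 - r)
            s1 = np.sqrt(np.average((x - mu1) ** 2, weights=r))
            s2 = np.sqrt(np.average((x - mu2) ** 2, weights=1 - r))
        return pi, (mu1, s1), (mu2, s2)

    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(0, 1, 80), rng.normal(3, 1, 80)])
    print(em_two_normal(data))
    ```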

  1. Cyber crimes.

    PubMed

    Nuzback, Kara

    2014-07-01

    Since it began offering cyber liability coverage in December 2011, the Texas Medical Liability Trust has received more than 150 cyber liability claims, most of which involved breaches of electronic protected health information. TMLT's cyber liability insurance will protect practices financially should a breach occur. The insurance covers a breach notification to customers and business partners, expenses for legal counsel, information security and forensic data services, public relations support, call center and website support, credit monitoring, and identity theft restoration services.

  2. The Legal Doctrine on 'Limitation of Liability' in the Precedent Analysis on Plastic Surgery Medical Malpractice Lawsuits.

    PubMed

    Park, Bo Young; Pak, Ji-Hyun; Hong, Seung-Eun; Kang, So Ra

    2015-12-01

    This study intended to review the precedents on plastic surgery medical malpractice lawsuits in lower-court trials, classify the reasons of 'limitation of liability' by type, and suggest a standard in the acknowledgement of limitation of liability ratio. The 30 lower-court's rulings on the cases bearing the medical negligence of the defendants acknowledged the liability ratio of the defendants between 30% and 100%. Ten cases ruled that the defendants were wholly responsible for the negligence or malpractice, while 20 cases acknowledged the limitation of liability principle. In the determination of damage compensation amount, the court considered the cause of the victim side, which contributed in the occurrence of the damage. The court also believed that it is against the idea of fairness to have the assailant pay the whole compensation, even there is no victim-side cause such as previous illness or physical constitution of the patient, and applies the legal doctrine on limitation of liability, which is an independent damage compensation adjustment system. Most of the rulings also limited the ratio of responsibility to certain extent. When considering that the legal doctrine on limitation of liability which supports concrete validity for the fair sharing of damage, the tangible classification of causes of limitation of liability suggested in this study would be a useful tool in forecasting the ruling of a plastic surgery medical malpractice lawsuit.

  3. The Legal Doctrine on 'Limitation of Liability' in the Precedent Analysis on Plastic Surgery Medical Malpractice Lawsuits

    PubMed Central

    Kang, So Ra

    2015-01-01

    This study intended to review the precedents on plastic surgery medical malpractice lawsuits in lower-court trials, classify the reasons of 'limitation of liability' by type, and suggest a standard in the acknowledgement of limitation of liability ratio. The 30 lower-court's rulings on the cases bearing the medical negligence of the defendants acknowledged the liability ratio of the defendants between 30% and 100%. Ten cases ruled that the defendants were wholly responsible for the negligence or malpractice, while 20 cases acknowledged the limitation of liability principle. In the determination of damage compensation amount, the court considered the cause of the victim side, which contributed in the occurrence of the damage. The court also believed that it is against the idea of fairness to have the assailant pay the whole compensation, even there is no victim-side cause such as previous illness or physical constitution of the patient, and applies the legal doctrine on limitation of liability, which is an independent damage compensation adjustment system. Most of the rulings also limited the ratio of responsibility to certain extent. When considering that the legal doctrine on limitation of liability which supports concrete validity for the fair sharing of damage, the tangible classification of causes of limitation of liability suggested in this study would be a useful tool in forecasting the ruling of a plastic surgery medical malpractice lawsuit. PMID:26713045

  4. Poisson Mixture Regression Models for Heart Disease Prediction.

    PubMed

    Mufudza, Chipo; Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models are addressed here under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model, due to its lower Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease component-wise given the identified clusters. It is concluded that heart disease prediction can be done effectively by identifying the major risks component-wise using a Poisson mixture regression model.

  5. Poisson Mixture Regression Models for Heart Disease Prediction

    PubMed Central

    Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models are addressed here under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model, due to its lower Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease component-wise given the identified clusters. It is concluded that heart disease prediction can be done effectively by identifying the major risks component-wise using a Poisson mixture regression model. PMID:27999611
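    A stripped-down version of the standard (covariate-free) two-component Poisson mixture can be fitted with EM as sketched below; the concomitant-variable regression models discussed in the paper additionally let covariates drive the component means and mixing weights, which is not shown here. The data are simulated.

    ```python
    import numpy as np
    from scipy.stats import poisson

    def em_poisson_mixture(y, n_iter=200):
        """EM for a two-component Poisson mixture (no covariates)."""
        y = np.asarray(y)
        pi, lam1, lam2 = 0.5, np.quantile(y, 0.25) + 0.5, np.quantile(y, 0.75) + 0.5
        for _ in range(n_iter):
            d1 = pi * poisson.pmf(y, lam1)
            d2 = (1 - pi) * poisson.pmf(y, lam2)
            r = d1 / (d1 + d2)           # E-step: responsibility of the low-rate component
            pi = r.mean()                # M-step: update mixing weight and rates
            lam1 = np.average(y, weights=r)
            lam2 = np.average(y, weights=1 - r)
        return pi, lam1, lam2

    rng = np.random.default_rng(2)
    y = np.concatenate([rng.poisson(1.0, 300), rng.poisson(6.0, 100)])  # low- and high-risk groups
    print(em_poisson_mixture(y))
    ```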

  6. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  7. Early Menarche and Gestational Diabetes Mellitus at First Live Birth.

    PubMed

    Shen, Yun; Hu, Hui; D Taylor, Brandie; Kan, Haidong; Xu, Xiaohui

    2017-03-01

    To examine the association between early menarche and gestational diabetes mellitus (GDM). Data from the National Health and Nutrition Examination Survey 2007-2012 were used to investigate the association between age at menarche and the risk of GDM at first birth among 5914 women. A growth mixture model was used to detect distinctive menarche onset patterns based on self-reported age at menarche. Logistic regression models were then used to examine the associations between menarche initiation patterns and GDM after adjusting for sociodemographic factors, family history of diabetes mellitus, lifetime greatest Body Mass Index, smoking status, and physical activity level. Among the 5914 first-time mothers, 3.4% had self-reported GDM. We detected three groups with heterogeneous menarche onset patterns, the Early, Normal, and Late Menarche Groups. The regression model shows that compared to the Normal Menarche Group, the Early Menarche Group had 1.75 (95% CI 1.10, 2.79) times the odds of having GDM. No statistically significant difference was observed between the Normal and Late Menarche Groups. This study suggests that early menarche may be a risk factor for GDM. Future studies are warranted to examine and confirm this finding.

  8. 29 CFR 2590.732 - Special rules relating to group health plans.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... for accident (including accidental death and dismemberment); (ii) Disability income coverage; (iii) Liability insurance, including general liability insurance and automobile liability insurance; (iv) Coverage...) Automobile medical payment insurance; (vii) Credit-only insurance (for example, mortgage insurance); and...

  9. 29 CFR 2590.732 - Special rules relating to group health plans.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... for accident (including accidental death and dismemberment); (ii) Disability income coverage; (iii) Liability insurance, including general liability insurance and automobile liability insurance; (iv) Coverage...) Automobile medical payment insurance; (vii) Credit-only insurance (for example, mortgage insurance); and...

  10. 29 CFR 2590.732 - Special rules relating to group health plans.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... for accident (including accidental death and dismemberment); (ii) Disability income coverage; (iii) Liability insurance, including general liability insurance and automobile liability insurance; (iv) Coverage...) Automobile medical payment insurance; (vii) Credit-only insurance (for example, mortgage insurance); and...

  11. College and University Liability under Superfund.

    ERIC Educational Resources Information Center

    Manderfeld, Donald J.

    1988-01-01

    A discussion of the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 concerning responsibility for toxic waste disposal looks at college and university liability under the act, institutional defenses that could be raised under the act, and the settlement process. (MSE)

  12. New approach application of data transformation in mean centering of ratio spectra method

    NASA Astrophysics Data System (ADS)

    Issa, Mahmoud M.; Nejem, R.'afat M.; Van Staden, Raluca Ioana Stefan; Aboul-Enein, Hassan Y.

    2015-05-01

    Most mean centering of ratio spectra (MCR) methods are designed to be used with data sets whose values have a normal or nearly normal distribution. The errors associated with the values are also assumed to be independent and random. If the data are skewed, the results obtained may be doubtful. Most of the time, a normal distribution is assumed, and if a confidence interval includes a negative value, it is cut off at zero. However, it is possible to transform the data so that at least an approximately normal distribution is attained. Taking the logarithm of each data point is one frequently used transformation. As a result, the geometric mean is considered a better measure of central tendency than the arithmetic mean. The developed MCR method using the geometric mean has been successfully applied to the analysis of a ternary mixture of aspirin (ASP), atorvastatin (ATOR) and clopidogrel (CLOP) as a model. The results obtained were statistically compared with a reported HPLC method.
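    The log transformation described here amounts to replacing the arithmetic mean by the geometric mean, as in this small sketch (the values are illustrative, not taken from the paper):

    ```python
    import numpy as np

    ratios = np.array([0.82, 1.10, 0.95, 2.40, 1.05])    # illustrative ratio-spectra values

    arithmetic_mean = ratios.mean()
    geometric_mean = np.exp(np.mean(np.log(ratios)))      # mean on the log scale, back-transformed

    # The geometric mean is less influenced by the single skewed (large) value
    print(arithmetic_mean, geometric_mean)
    ```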

  13. Temporal and spatial patterns in vegetation and atmospheric properties from AVIRIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, D.A.; Green, R.O.; Adams, J.B.

    1997-12-01

    Little research has focused on the use of imaging spectrometry for change detection. In this paper, the authors apply Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data to the monitoring of seasonal changes in atmospheric water vapor, liquid water, and surface cover in the vicinity of the Jasper Ridge, CA, for three dates in 1992. Apparent surface reflectance was retrieved and water vapor and liquid water mapped by using a radiative-transfer-based inversion that accounts for spatially variable atmospheres. Spectral mixture analysis (SMA) was used to model reflectance data as mixtures of green vegetation (GV), nonphotosynthetic vegetation (NPV), soil, and shade. Temporal and spatial patterns in endmember fractions and liquid water were compared to the normalized difference vegetation index (NDVI). The reflectance retrieval algorithm was tested by using a temporally invariant target.
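    Spectral mixture analysis of the kind used here models each pixel spectrum as a non-negative combination of endmember spectra (green vegetation, non-photosynthetic vegetation, soil, shade). A minimal per-pixel unmixing sketch using non-negative least squares is shown below; the endmember and pixel spectra are synthetic placeholders, not AVIRIS data.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Columns are endmember spectra (GV, NPV, soil, shade) over a few hypothetical bands
    endmembers = np.array([
        [0.05, 0.30, 0.25, 0.02],
        [0.08, 0.35, 0.30, 0.02],
        [0.45, 0.40, 0.35, 0.02],
        [0.50, 0.45, 0.40, 0.02],
        [0.30, 0.50, 0.45, 0.02],
    ])

    # Synthetic pixel built as a known mixture of GV, soil, and shade
    pixel = 0.6 * endmembers[:, 0] + 0.3 * endmembers[:, 2] + 0.1 * endmembers[:, 3]

    fractions, residual = nnls(endmembers, pixel)   # non-negative endmember fractions
    fractions = fractions / fractions.sum()         # normalize to sum to one
    print(fractions, residual)
    ```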

  14. A hybrid expectation maximisation and MCMC sampling algorithm to implement Bayesian mixture model based genomic prediction and QTL mapping.

    PubMed

    Wang, Tingting; Chen, Yi-Ping Phoebe; Bowman, Phil J; Goddard, Michael E; Hayes, Ben J

    2016-09-21

    Bayesian mixture models in which the effects of SNPs are assumed to come from normal distributions with different variances are attractive for simultaneous genomic prediction and QTL mapping. These models are usually implemented with Markov chain Monte Carlo (MCMC) sampling, which requires long compute times with large genomic data sets. Here, we present an efficient approach (termed HyB_BR), which is a hybrid of an Expectation-Maximisation algorithm followed by a limited number of MCMC iterations without the requirement for burn-in. To test prediction accuracy from HyB_BR, dairy cattle and human disease trait data were used. In the dairy cattle data, there were four quantitative traits (milk volume, protein kg, fat% in milk and fertility) measured in 16,214 cattle from two breeds genotyped for 632,002 SNPs. Validation of genomic predictions was in a subset of cattle either from the reference set or in animals from a third breed that was not in the reference set. In all cases, HyB_BR gave almost identical accuracies to Bayesian mixture models implemented with full MCMC; however, computational time was reduced to as little as 1/17 of that required by full MCMC. The SNPs with high posterior probability of a non-zero effect were also very similar between full MCMC and HyB_BR, with several known genes affecting milk production in this category, as well as some novel genes. HyB_BR was also applied to seven human diseases with 4890 individuals genotyped for around 300 K SNPs in a case/control design, from the Wellcome Trust Case Control Consortium (WTCCC). In this data set, the results demonstrated again that HyB_BR performed as well as Bayesian mixture models with full MCMC for genomic predictions and genetic architecture inference while reducing the computational time from 45 h with full MCMC to 3 h with HyB_BR. The results for quantitative traits in cattle and disease in humans demonstrate that HyB_BR can perform equally well as Bayesian mixture models implemented with full MCMC in terms of prediction accuracy, but up to 17 times faster than the full MCMC implementations. The HyB_BR algorithm makes simultaneous genomic prediction, QTL mapping and inference of genetic architecture feasible in large genomic data sets.

  15. The effect of binary mixtures of zinc, copper, cadmium, and nickel on the growth of the freshwater diatom Navicula pelliculosa and comparison with mixture toxicity model predictions.

    PubMed

    Nagai, Takashi; De Schamphelaere, Karel A C

    2016-11-01

    The authors investigated the effect of binary mixtures of zinc (Zn), copper (Cu), cadmium (Cd), and nickel (Ni) on the growth of a freshwater diatom, Navicula pelliculosa. A 7 × 7 full factorial experimental design (49 combinations in total) was used to test each binary metal mixture. A 3-d fluorescence microplate toxicity assay was used to test each combination. Mixture effects were predicted by concentration addition and independent action models based on a single-metal concentration-response relationship between the relative growth rate and the calculated free metal ion activity. Although the concentration addition model predicted the observed mixture toxicity significantly better than the independent action model for the Zn-Cu mixture, the independent action model predicted the observed mixture toxicity significantly better than the concentration addition model for the Cd-Zn, Cd-Ni, and Cd-Cu mixtures. For the Zn-Ni and Cu-Ni mixtures, it was unclear which of the 2 models was better. Statistical analysis concerning antagonistic/synergistic interactions showed that the concentration addition model is generally conservative (with the Zn-Ni mixture being the sole exception), indicating that the concentration addition model would be useful as a method for a conservative first-tier screening-level risk analysis of metal mixtures. Environ Toxicol Chem 2016;35:2765-2773. © 2016 SETAC.
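    The two reference models compared in this study have simple closed forms once single-metal concentration-response curves are available. Assuming two-parameter log-logistic curves (an assumption for illustration; the paper relates growth rate to free metal ion activity), the independent action (IA) prediction multiplies survival fractions, while concentration addition (CA) solves sum(c_i / EC_x,i) = 1 for the common effect level x, as sketched below with hypothetical parameters.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    def effect(c, ec50, slope):
        """Fractional effect (0-1) from a two-parameter log-logistic curve."""
        return 1.0 / (1.0 + (ec50 / c) ** slope)

    def ecx(x, ec50, slope):
        """Concentration producing fractional effect x (inverse of `effect`)."""
        return ec50 * (x / (1.0 - x)) ** (1.0 / slope)

    def independent_action(conc, ec50, slope):
        surv = [1.0 - effect(c, e, s) for c, e, s in zip(conc, ec50, slope)]
        return 1.0 - np.prod(surv)

    def concentration_addition(conc, ec50, slope):
        # Find the effect level x at which the toxic units sum to 1
        f = lambda x: sum(c / ecx(x, e, s) for c, e, s in zip(conc, ec50, slope)) - 1.0
        return brentq(f, 1e-9, 1 - 1e-9)

    # Hypothetical single-metal parameters (free ion activities) for a binary mixture
    conc = [2.0, 1.0]; ec50 = [5.0, 3.0]; slope = [2.0, 1.5]
    print(independent_action(conc, ec50, slope), concentration_addition(conc, ec50, slope))
    ```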

  16. Mixture Rasch Models with Joint Maximum Likelihood Estimation

    ERIC Educational Resources Information Center

    Willse, John T.

    2011-01-01

    This research provides a demonstration of the utility of mixture Rasch models. Specifically, a model capable of estimating a mixture partial credit model using joint maximum likelihood is presented. Like the partial credit model, the mixture partial credit model has the beneficial feature of being appropriate for analysis of assessment data…

  17. 45 CFR 146.145 - Special rules relating to group health plans.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... circumstances. The following benefits are excepted in all circumstances— (i) Coverage only for accident..., including general liability insurance and automobile liability insurance; (iv) Coverage issued as a supplement to liability insurance; (v) Workers' compensation or similar coverage; (vi) Automobile medical...

  18. 45 CFR 146.145 - Special rules relating to group health plans.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... circumstances. The following benefits are excepted in all circumstances— (i) Coverage only for accident..., including general liability insurance and automobile liability insurance; (iv) Coverage issued as a supplement to liability insurance; (v) Workers' compensation or similar coverage; (vi) Automobile medical...

  19. 45 CFR 146.145 - Special rules relating to group health plans.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... circumstances. The following benefits are excepted in all circumstances— (i) Coverage only for accident..., including general liability insurance and automobile liability insurance; (iv) Coverage issued as a supplement to liability insurance; (v) Workers' compensation or similar coverage; (vi) Automobile medical...

  20. Defining a Road Safety Audits Program for Enhancing Safety and Reducing Tort Liability

    DOT National Transportation Integrated Search

    2000-07-01

    Table of Contents: (1) Introduction; (2) Review of Safety Issues; (3) Review of Legal Liability Issues; (4) Summary of Safety and Legal Liability Issues. Prepared in cooperation with Wyoming Univ., Laramie. Dept. of Civil and Architectural Engineerin...

  1. Liability for Student Workers.

    ERIC Educational Resources Information Center

    Tryon, Jonathan S.

    1994-01-01

    Examines liability issues for academic libraries' student workers. Discussion includes staff training; hiring practices; supervision; negligence; emergency procedures; the use of reasonable care; and knowledge of library rules. Specific nonlibrary liability cases are cited as examples of the importance of employee screening, training, and danger…

  2. 48 CFR 1426.7103 - The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund Minority Contractors Utilization... Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund Minority Contractors Utilization...

  3. 48 CFR 1426.7103 - The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund Minority Contractors Utilization... Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund Minority Contractors Utilization...

  4. 48 CFR 1426.7103 - The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund Minority Contractors Utilization... Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund Minority Contractors Utilization...

  5. 48 CFR 1426.7103 - The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund Minority Contractors Utilization... Environmental Response, Compensation, and Liability Act (CERCLA) (Superfund Minority Contractors Utilization...

  6. Negligent Liability of College Counselors

    ERIC Educational Resources Information Center

    Sampson, James P., Jr.

    1977-01-01

    Discusses college counselors' liability for negligence in the performance of their duties, citing the only two court decisions involving college counselor negligence. Summarizes the circumstances in which college counselors may or may not be held negligent, and identifies present and future trends in counselor liability. (JG)

  7. Data Supporting the Environmental Liability Reported on the FY 2000 Financial Statements

    DTIC Science & Technology

    2001-08-10

    consolidated financial statements. This audit supports our audit of the FY 2000 DoD Agency-Wide Financial Statements, specifically the line item for environmental liabilities. The Army, the Navy, and the General Accounting Office also issued audit reports related to the reliability, completeness, and supportability of environmental liabilities for FY 2000. Environmental liabilities included estimated amounts for future cleanup of contamination resulting from waste disposal methods, leaks, spills, and other past activity which have created a public health or

  8. Mathematical modeling of erythrocyte chimerism informs genetic intervention strategies for sickle cell disease.

    PubMed

    Altrock, Philipp M; Brendel, Christian; Renella, Raffaele; Orkin, Stuart H; Williams, David A; Michor, Franziska

    2016-09-01

    Recent advances in gene therapy and genome-engineering technologies offer the opportunity to correct sickle cell disease (SCD), a heritable disorder caused by a point mutation in the β-globin gene. The developmental switch from fetal γ-globin to adult β-globin is governed in part by the transcription factor (TF) BCL11A. This TF has been proposed as a therapeutic target for reactivation of γ-globin and concomitant reduction of β-sickle globin. In this and other approaches, genetic alteration of a portion of the hematopoietic stem cell (HSC) compartment leads to a mixture of sickling and corrected red blood cells (RBCs) in the periphery. To reverse the sickling phenotype, a certain proportion of corrected RBCs is necessary; the degree of HSC alteration required to achieve a desired fraction of corrected RBCs remains unknown. To address this issue, we developed a mathematical model describing aging and survival of sickle-susceptible and normal RBCs; the latter can have a selective survival advantage leading to their overrepresentation. We identified the level of bone marrow chimerism required for successful stem cell-based gene therapies in SCD. Our findings were further informed using an experimental mouse model, where we transplanted mixtures of Berkeley SCD and normal murine bone marrow cells to establish chimeric grafts in murine hosts. Our integrative theoretical and experimental approach identifies the target frequency of HSC alterations required for effective treatment of sickling syndromes in humans. Our work replaces episodic observations of such target frequencies with a mathematical modeling framework that covers a large and continuous spectrum of chimerism conditions. Am. J. Hematol. 91:931-937, 2016. © 2016 Wiley Periodicals, Inc.
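    One intuition behind the model, that corrected red cells are overrepresented in the periphery relative to marrow chimerism because they survive longer, can be captured by a simple steady-state approximation. This is a sketch only; the authors' model tracks RBC aging and survival in more detail, and the lifespans used below (roughly 120 days for normal RBCs, 15 days for sickle RBCs) are assumed round numbers, not values taken from the paper.

    ```python
    def peripheral_fraction(marrow_fraction, lifespan_corrected=120.0, lifespan_sickle=15.0):
        """Steady-state fraction of corrected RBCs in blood, assuming production is
        proportional to marrow chimerism and circulating numbers are proportional to
        production rate times mean lifespan."""
        corrected = marrow_fraction * lifespan_corrected
        sickle = (1.0 - marrow_fraction) * lifespan_sickle
        return corrected / (corrected + sickle)

    for f in (0.05, 0.10, 0.20):   # marrow chimerism levels
        print(f, round(peripheral_fraction(f), 2))
    ```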

  9. Effect of roll compaction on granule size distribution of microcrystalline cellulose–mannitol mixtures: computational intelligence modeling and parametric analysis

    PubMed Central

    Kazemi, Pezhman; Khalid, Mohammad Hassan; Pérez Gago, Ana; Kleinebudde, Peter; Jachowicz, Renata; Szlęk, Jakub; Mendyk, Aleksander

    2017-01-01

    Dry granulation using roll compaction is a typical unit operation for producing solid dosage forms in the pharmaceutical industry. Dry granulation is commonly used if the powder mixture is sensitive to heat and moisture and has poor flow properties. The output of roll compaction is compacted ribbons that exhibit different properties based on the adjusted process parameters. These ribbons are then milled into granules and finally compressed into tablets. The properties of the ribbons directly affect the granule size distribution (GSD) and the quality of final products; thus, it is imperative to study the effect of roll compaction process parameters on GSD. The understanding of how the roll compactor process parameters and material properties interact with each other will allow accurate control of the process, leading to the implementation of quality by design practices. Computational intelligence (CI) methods have a great potential for being used within the scope of quality by design approach. The main objective of this study was to show how the computational intelligence techniques can be useful to predict the GSD by using different process conditions of roll compaction and material properties. Different techniques such as multiple linear regression, artificial neural networks, random forest, Cubist and k-nearest neighbors algorithm assisted by sevenfold cross-validation were used to present generalized models for the prediction of GSD based on roll compaction process setting and material properties. The normalized root-mean-squared error and the coefficient of determination (R2) were used for model assessment. The best fit was obtained by Cubist model (normalized root-mean-squared error =3.22%, R2=0.95). Based on the results, it was confirmed that the material properties (true density) followed by compaction force have the most significant effect on GSD. PMID:28176905

  10. Effect of roll compaction on granule size distribution of microcrystalline cellulose-mannitol mixtures: computational intelligence modeling and parametric analysis.

    PubMed

    Kazemi, Pezhman; Khalid, Mohammad Hassan; Pérez Gago, Ana; Kleinebudde, Peter; Jachowicz, Renata; Szlęk, Jakub; Mendyk, Aleksander

    2017-01-01

    Dry granulation using roll compaction is a typical unit operation for producing solid dosage forms in the pharmaceutical industry. Dry granulation is commonly used if the powder mixture is sensitive to heat and moisture and has poor flow properties. The output of roll compaction is compacted ribbons that exhibit different properties based on the adjusted process parameters. These ribbons are then milled into granules and finally compressed into tablets. The properties of the ribbons directly affect the granule size distribution (GSD) and the quality of final products; thus, it is imperative to study the effect of roll compaction process parameters on GSD. The understanding of how the roll compactor process parameters and material properties interact with each other will allow accurate control of the process, leading to the implementation of quality by design practices. Computational intelligence (CI) methods have a great potential for being used within the scope of quality by design approach. The main objective of this study was to show how the computational intelligence techniques can be useful to predict the GSD by using different process conditions of roll compaction and material properties. Different techniques such as multiple linear regression, artificial neural networks, random forest, Cubist and k-nearest neighbors algorithm assisted by sevenfold cross-validation were used to present generalized models for the prediction of GSD based on roll compaction process setting and material properties. The normalized root-mean-squared error and the coefficient of determination (R²) were used for model assessment. The best fit was obtained by Cubist model (normalized root-mean-squared error = 3.22%, R² = 0.95). Based on the results, it was confirmed that the material properties (true density) followed by compaction force have the most significant effect on GSD.
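    One of the model classes evaluated here (random forest) together with the reported error metrics can be reproduced in outline with scikit-learn, as in the sketch below; the predictors, response, and data are synthetic placeholders standing in for the roll-compaction process settings and material properties, and only the sevenfold cross-validation setup follows the abstract.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import KFold, cross_val_predict
    from sklearn.metrics import mean_squared_error, r2_score

    rng = np.random.default_rng(0)
    # Placeholder predictors, e.g. compaction force, gap width, true density, ... (hypothetical)
    X = rng.normal(size=(120, 5))
    # Placeholder response, e.g. a granule size fraction; a synthetic nonlinear function of X
    y = 2.0 * X[:, 0] + X[:, 2] ** 2 + rng.normal(scale=0.3, size=120)

    model = RandomForestRegressor(n_estimators=300, random_state=0)
    pred = cross_val_predict(model, X, y, cv=KFold(n_splits=7, shuffle=True, random_state=0))

    nrmse = np.sqrt(mean_squared_error(y, pred)) / (y.max() - y.min()) * 100  # % of response range
    print(f"NRMSE = {nrmse:.2f}%, R2 = {r2_score(y, pred):.2f}")
    ```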

  11. Growth mixture modelling in families of the Framingham Heart Study

    PubMed Central

    2009-01-01

    Growth mixture modelling, a less explored method in genetic research, addresses unobserved heterogeneity in population samples. We applied this technique to longitudinal data of the Framingham Heart Study. We examined systolic blood pressure (BP) measures in 1060 males from 692 families and detected three subclasses, which varied significantly in their developmental trajectories over time. The first class consisted of 60 high-risk individuals with elevated BP early in life and a steep increase over time. The second group of 131 individuals displayed first normal BP, but showed a significant increase over time and reached high BP values late in their life time. The largest group of 869 individuals could be considered a normative group with normal BP on all exams. To identify genetic modulators for this phenotype, we tested 2,340 single-nucleotide polymorphisms on chromosome 8 for association with the class membership probabilities of our model. The probability of being in Class 1 was significantly associated with a very rare variant (rs1445404) present in only four individuals from four different families located in the coding region of the gene EYA (eyes absent homolog 1 in Drosophila) (p = 1.39 × 10⁻¹³). Mutations in EYA are known to cause branchio-oto-renal syndrome, as well as isolated renal malformations. Renal malformations could cause high BP early in life. This result awaits replication; however, it suggests that analyzing genetic data stratified for high-risk subgroups defined by a unique development over time could be useful for the detection of rare mutations in common multi-factorial diseases. PMID:20017979

  12. Signal Partitioning Algorithm for Highly Efficient Gaussian Mixture Modeling in Mass Spectrometry

    PubMed Central

    Polanski, Andrzej; Marczyk, Michal; Pietrowska, Monika; Widlak, Piotr; Polanska, Joanna

    2015-01-01

    Mixture modeling of mass spectra is an approach with many potential applications including peak detection and quantification, smoothing, de-noising, feature extraction and spectral signal compression. However, existing algorithms do not allow for automated analyses of whole spectra. Therefore, despite highlighting potential advantages of mixture modeling of mass spectra of peptide/protein mixtures and some preliminary results presented in several papers, the mixture modeling approach was so far not developed to the stage enabling systematic comparisons with existing software packages for proteomic mass spectra analyses. In this paper we present an efficient algorithm for Gaussian mixture modeling of proteomic mass spectra of different types (e.g., MALDI-ToF profiling, MALDI-IMS). The main idea is automated partitioning of protein mass spectral signal into fragments. The obtained fragments are separately decomposed into Gaussian mixture models. The parameters of the mixture models of fragments are then aggregated to form the mixture model of the whole spectrum. We compare the elaborated algorithm to existing algorithms for peak detection and we demonstrate improvements of peak detection efficiency obtained by using Gaussian mixture modeling. We also show applications of the elaborated algorithm to real proteomic datasets of low and high resolution. PMID:26230717
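    The core step, decomposing one spectral fragment into a Gaussian mixture, can be imitated with scikit-learn by resampling m/z values in proportion to intensity and selecting the number of components by BIC. This is a simplified stand-in for the authors' algorithm, which additionally handles automated fragment partitioning and aggregation of the fragment models; the spectrum below is synthetic.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(3)
    # Synthetic fragment: two peaks near m/z 1000 and 1012 on a low flat baseline
    mz = np.linspace(990, 1025, 700)
    intensity = (np.exp(-0.5 * ((mz - 1000) / 1.2) ** 2) * 800
                 + np.exp(-0.5 * ((mz - 1012) / 1.5) ** 2) * 500 + 5)

    # Treat the spectrum as a density: resample m/z values with probability ~ intensity
    sample = rng.choice(mz, size=20_000, p=intensity / intensity.sum()).reshape(-1, 1)

    # Pick the mixture order within the fragment by BIC
    fits = [GaussianMixture(n_components=k, random_state=0).fit(sample) for k in range(1, 5)]
    best = min(fits, key=lambda m: m.bic(sample))
    print(best.n_components, np.sort(best.means_.ravel()))
    ```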

  13. On the asymptotic improvement of supervised learning by utilizing additional unlabeled samples - Normal mixture density case

    NASA Technical Reports Server (NTRS)

    Shahshahani, Behzad M.; Landgrebe, David A.

    1992-01-01

    The effect of additional unlabeled samples in improving the supervised learning process is studied in this paper. Three learning processes, supervised, unsupervised, and combined supervised-unsupervised, are compared by studying the asymptotic behavior of the estimates obtained under each process. Upper and lower bounds on the asymptotic covariance matrices are derived. It is shown that under a normal mixture density assumption for the probability density function of the feature space, combined supervised-unsupervised learning is always superior to supervised learning in achieving better estimates. Experimental results are provided to verify the theoretical concepts.
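
    The following toy Python example illustrates the combined supervised-unsupervised idea under a two-component normal mixture: labeled samples keep their known class, while unlabeled samples enter through EM responsibilities. The data and initialization are simulated, and the estimator is only a sketch, not the paper's derivation.

    ```python
    # Toy sketch of combined supervised-unsupervised estimation under a normal
    # mixture assumption. All data here are simulated.
    import numpy as np

    rng = np.random.default_rng(2)
    x_lab = np.concatenate([rng.normal(0, 1, 20), rng.normal(3, 1, 20)])
    y_lab = np.array([0] * 20 + [1] * 20)
    x_unl = np.concatenate([rng.normal(0, 1, 500), rng.normal(3, 1, 500)])

    # Initialize from labeled data only (the purely supervised estimate)
    mu = np.array([x_lab[y_lab == 0].mean(), x_lab[y_lab == 1].mean()])
    var = np.array([x_lab[y_lab == 0].var(), x_lab[y_lab == 1].var()])
    pi = np.array([0.5, 0.5])

    def normal_pdf(x, m, v):
        return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

    for _ in range(50):                       # EM on labeled + unlabeled samples
        # E-step: responsibilities for unlabeled points; labels are fixed for x_lab
        dens = pi * normal_pdf(x_unl[:, None], mu, var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        r_lab = np.eye(2)[y_lab]              # one-hot responsibilities for labels
        R = np.vstack([r_lab, resp])
        X = np.concatenate([x_lab, x_unl])
        # M-step: weighted means/variances pooling both data sources
        nk = R.sum(axis=0)
        mu = (R * X[:, None]).sum(axis=0) / nk
        var = (R * (X[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / nk.sum()

    print(np.round(mu, 2), np.round(var, 2), np.round(pi, 2))
    ```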

  14. Effect of clay content and mineralogy on frictional sliding behavior of simulated gouges: binary and ternary mixtures of quartz, illite, and montmorillonite

    USGS Publications Warehouse

    Tembe, Sheryl; Lockner, David A.; Wong, Teng-Fong

    2010-01-01

    We investigated the frictional sliding behavior of simulated quartz-clay gouges under stress conditions relevant to seismogenic depths. Conventional triaxial compression tests were conducted at 40 MPa effective normal stress on saturated saw cut samples containing binary and ternary mixtures of quartz, montmorillonite, and illite. In all cases, frictional strengths of mixtures fall between the end-members of pure quartz (strongest) and clay (weakest). The overall trend was a decrease in strength with increasing clay content. In the illite/quartz mixture the trend was nearly linear, while in the montmorillonite mixtures a sigmoidal trend with three strength regimes was noted. Microstructural observations were performed on the deformed samples to characterize the geometric attributes of shear localization within the gouge layers. Two micromechanical models were used to analyze the critical clay fractions for the two-regime transitions on the basis of clay porosity and packing of the quartz grains. The transition from regime 1 (high strength) to 2 (intermediate strength) is associated with the shift from a stress-supporting framework of quartz grains to a clay matrix embedded with disperse quartz grains, manifested by the development of P-foliation and reduction in Riedel shear angle. The transition from regime 2 (intermediate strength) to 3 (low strength) is attributed to the development of shear localization in the clay matrix, occurring only when the neighboring layers of quartz grains are separated by a critical clay thickness. Our mixture data relating strength degradation to clay content agree well with strengths of natural shear zone materials obtained from scientific deep drilling projects.

  15. PMP22 related neuropathies: Charcot-Marie-Tooth disease type 1A and Hereditary Neuropathy with liability to Pressure Palsies.

    PubMed

    van Paassen, Barbara W; van der Kooi, Anneke J; van Spaendonck-Zwarts, Karin Y; Verhamme, Camiel; Baas, Frank; de Visser, Marianne

    2014-03-19

    PMP22 related neuropathies comprise (1) PMP22 duplications leading to Charcot-Marie-Tooth disease type 1A (CMT1A), (2) PMP22 deletions, leading to Hereditary Neuropathy with liability to Pressure Palsies (HNPP), and (3) PMP22 point mutations, causing both phenotypes. The overall prevalence of CMT is usually reported as 1:2,500; epidemiological studies show that 20-64% of CMT patients carry the PMP22 duplication. The prevalence of HNPP is not well known. CMT1A usually presents in the first two decades with difficulty walking or running. Distal symmetrical muscle weakness and wasting and sensory loss are present, with the legs more frequently and more severely affected than the arms. HNPP typically leads to episodic, painless, recurrent, focal motor and sensory peripheral neuropathy, preceded by minor compression of the affected nerve. Electrophysiological evaluation is needed to determine whether the polyneuropathy is demyelinating. Sonography of the nerves can be useful. Diagnosis is confirmed by finding, respectively, a PMP22 duplication, deletion or point mutation. The differential diagnosis includes other inherited neuropathies and acquired polyneuropathies. The mode of inheritance is autosomal dominant, and de novo mutations occur. Offspring of patients have a 50% chance of inheriting the mutation from their affected parent. Prenatal testing is possible; requests for prenatal testing are not common. Treatment is currently symptomatic and may include management by a rehabilitation physician, physiotherapist, occupational therapist and orthopaedic surgeon. Adult CMT1A patients show slow clinical progression of disease, which seems to reflect a process of normal ageing. Life expectancy is normal.

  17. Tort liability : a handbook for employees of the Virginia Department of Transportation and Virginia municipal corporations, June 2004.

    DOT National Transportation Integrated Search

    2004-01-01

    Court decisions concerning state government liability, as well as VDOT's continuing commitment to safety, have made tort liability an increasingly important concern. In response, the Virginia Transportation Research Council has published a tort liabi...

  18. New entity for conducting group practice offers new potential.

    PubMed

    Rich, H I

    1994-01-01

    A new form of entity, the limited liability company (LLC), may be used by physicians to conduct group practices with the tax advantages of a partnership and insulation from liability for co-practitioners' acts. The author reviews the New Jersey Limited Liability Company Act.

  19. Liability.

    ERIC Educational Resources Information Center

    Hollander, Patricia A.

    This chapter on liability covers a number of cases, filed by patients and injured students, alleging negligence by colleges, universities, and university hospitals. Liability issues are also part of defamation of character suits. Oral statements that defame are known as slander, while written statements that defame are called libel. In certain situations,…

  20. Media Defamation and the Free-Lance Writer.

    ERIC Educational Resources Information Center

    Stevens, George E.

    1987-01-01

    Discusses the responsibilities of publishers and freelance writers concerning the liability involved in defamatory statements. Reviews several court cases pertaining to publisher liability and claims that, if a writer is not under the immediate control or supervision of the publisher, the publisher may avoid liability. (MM)

  1. 48 CFR 228.311 - Solicitation provision and contract clause on liability insurance under cost-reimbursement...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Solicitation provision and contract clause on liability insurance under cost-reimbursement contracts. 228.311 Section 228.311 Federal... liability insurance under cost-reimbursement contracts. ...

  2. Federal government provision of third-party liability insurance to space vehicle users

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Support for decisions concerning the provision by the Federal Government of third-party liability insurance for commercial space activities was studied. The practices associated with third-party liability insurance in the marine, aviation, and electric utility industries, in addition to those industries associated with space missions, were reviewed. Theoretical considerations of rate setting are discussed, and a methodology is introduced to determine the period of time over which the insurers of each industry intend to set aside reserves to recover from a maximum liability loss, should one occur. The data were analyzed to determine the set-aside period in each industry and to suggest reasonable standards from the insurer's point of view. Criteria for Federal provision of insurance are discussed, and an interpretation of the Price-Anderson Act, determinants of the availability of commercial insurance, potential insurer liability, and measures of reasonableness for premium rates from the user's point of view are presented. Options available to the government regarding third-party liability protection are presented.

  3. The impact of tort reform and quality improvements on medical liability claims: a tale of 2 States.

    PubMed

    Illingworth, Kenneth D; Shaha, Steven H; Tzeng, Tony H; Sinha, Michael S; Saleh, Khaled J

    2015-05-01

    The purpose of this study was to determine the effect of tort reform and quality improvement measures on medical liability claims in 2 groups of hospitals within the same multihospital organization: one in Texas, which implemented medical liability tort reform caps on noneconomic damages in 2003, and one in Louisiana, which did not undergo significant tort reform during the same time period. A significant reduction in medical liability claims per quarter in Texas was found after tort reform implementation (7.27 to 1.4; P<.05). A significant correlation was found between the increase in mean Centers for Medicare & Medicaid Services performance score and the decrease in the frequency of claims observed in Louisiana (P<.05). Although tort reform caps on noneconomic damages in Texas caused the largest initial decrease, increasing quality improvement measures without increasing financial burden also decreased liability claims in Louisiana. Uniquely, this study showed that improving the quality of patient care resulted in decreased medical liability claims. © 2014 by the American College of Medical Quality.

  4. 12 CFR Appendix C to Part 717 - Model Forms for Opt-Out Notices

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... Although use of the model forms is not required, use of the model forms in this appendix (as applicable... from liability afforded by use of the model forms. These changes may not be so extensive as to affect... score” for accuracy, such as “payment history,” “credit history,” “payoff status,” or “claims history...

  5. 12 CFR Appendix C to Part 334 - Model Forms for Opt-Out Notices

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... Although use of the model forms is not required, use of the model forms in this Appendix (as applicable... from liability afforded by use of the model forms. These changes may not be so extensive as to affect... score” for accuracy, such as “payment history,” “credit history,” “payoff status,” or “claims history...

  6. Analysis of Spin Financial Market by GARCH Model

    NASA Astrophysics Data System (ADS)

    Takaishi, Tetsuya

    2013-08-01

    A spin model is used for simulations of financial markets. To determine return volatility in the spin financial market, we use the GARCH model, which is often used for volatility estimation in empirical finance. We apply Bayesian inference, performed by the Markov Chain Monte Carlo method, to the parameter estimation of the GARCH model. It is found that volatility determined by the GARCH model exhibits "volatility clustering", which is also observed in real financial markets. Using volatility determined by the GARCH model, we examine the mixture-of-distributions hypothesis (MDH) suggested for the asset return dynamics. We find that the returns standardized by volatility are approximately standard normal random variables. Moreover, we find that the absolute standardized returns show no significant autocorrelation. These findings are consistent with the view of the MDH for the return dynamics.
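
    A minimal, self-contained GARCH(1,1) illustration on simulated returns (not the spin-model data): the parameters are estimated by maximum likelihood, and the returns are then standardized by the fitted volatility, which is the MDH check described above.

    ```python
    # Minimal GARCH(1,1) sketch on simulated returns: estimate (omega, alpha, beta)
    # by maximum likelihood, then check that standardized returns look ~ N(0, 1).
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)
    T, omega0, alpha0, beta0 = 3000, 0.05, 0.08, 0.90
    r = np.empty(T); h = omega0 / (1 - alpha0 - beta0)
    for t in range(T):                        # simulate a GARCH(1,1) series
        r[t] = np.sqrt(h) * rng.standard_normal()
        h = omega0 + alpha0 * r[t] ** 2 + beta0 * h

    def neg_loglik(params, r):
        omega, alpha, beta = params
        if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
            return np.inf
        h = np.empty_like(r); h[0] = r.var()
        for t in range(1, len(r)):
            h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
        return 0.5 * np.sum(np.log(2 * np.pi * h) + r ** 2 / h)

    res = minimize(neg_loglik, x0=[0.1, 0.1, 0.8], args=(r,), method="Nelder-Mead")
    omega, alpha, beta = res.x
    h = np.empty_like(r); h[0] = r.var()
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    z = r / np.sqrt(h)                        # standardized returns
    print(res.x, z.mean(), z.std())           # z should be close to standard normal
    ```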

  7. Development of liability syndromes for schizophrenia: where did they come from and where are they going?

    PubMed

    Stone, William S; Giuliano, Anthony J

    2013-10-01

    Three decades after Paul Meehl proposed the term "schizotaxia" to describe a conceptual framework for understanding the liability to schizophrenia, Ming Tsuang et al. at Harvard University reformulated the concept as a clinical syndrome with provisional research criteria. The reformulated view relied heavily on more recent data showing that many non-psychotic, un-medicated biological relatives of individuals with schizophrenia showed difficulties in cognitive and other clinical functions that resembled those seen in their ill relatives. The reformulation raised questions about both whether and when liability could be assessed validly in the absence of psychosis, and about the extent to which symptoms of liability are reversible. Both questions bear on the larger issue of early intervention in schizophrenia. This article reviews the efforts of Tsuang et al. to conceptualize and validate schizotaxia as one such syndrome of liability. Towards this end, liability is considered first more generally as an outcome of interactive genetic and environmental factors. Liability is then considered in the context of endophenotypes as a concept that is both broader and is potentially more specific (and predictive) than many DSM or ICD diagnostic symptoms. Liability syndromes are then considered in the context of their proximity to illness, first by reviewing prodromal syndromes (which are more proximal), and then by considering schizotaxia, which, as it is currently formulated, is pre-prodromal and, therefore, less proximal. Finally, challenges to validation and future directions for research are considered. © 2013 Wiley Periodicals, Inc.

  8. Rapid and effective decontamination of chlorophenol-contaminated soil by sorption into commercial polymers: concept demonstration and process modeling.

    PubMed

    Tomei, M Concetta; Mosca Angelucci, Domenica; Ademollo, Nicoletta; Daugulis, Andrew J

    2015-03-01

    Solid phase extraction performed with commercial polymer beads to treat soil contaminated by chlorophenols (4-chlorophenol, 2,4-dichlorophenol and pentachlorophenol), as single compounds and in a mixture, has been investigated in this study. Soil-water-polymer partition tests were conducted to determine the relative affinities of single compounds in soil-water and polymer-water pairs. Subsequent soil extraction tests were performed with Hytrel 8206, the polymer showing the highest affinity for the tested chlorophenols. Factors that were examined were polymer type, moisture content, and contamination level. Increased moisture content (up to 100%) improved the extraction efficiency for all three compounds. Extraction tests at this upper level of moisture content showed removal efficiencies ≥70% for all the compounds and their ternary mixture within 24 h of contact time, in contrast to the weeks and months normally required for conventional ex situ remediation processes. A dynamic model characterizing the rate and extent of decontamination was also formulated, calibrated and validated with the experimental data. The proposed model, based on the simplified approach of "lumped parameters" for the mass transfer coefficients, provided very good predictions of the experimental data for the absorptive removal of contaminants from soil at different individual solute levels. Parameters evaluated from calibration by fitting of single-compound data have been successfully applied to predict mixture data, with differences between experimental and predicted data in all cases being ≤3%. Copyright © 2014 Elsevier Ltd. All rights reserved.
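
    As a hedged illustration of a lumped-parameter description of soil-water-polymer transfer (not the calibrated model from the study; the rate constants and partition coefficient below are assumed values), a simple two-step ODE system can be integrated as follows.

    ```python
    # Hedged illustration of a "lumped parameter" mass-transfer description of the
    # soil-water-polymer system (hypothetical rate constants and partition
    # coefficient, not the calibrated values from the study).
    import numpy as np
    from scipy.integrate import solve_ivp

    k_sw, k_wp = 0.5, 1.2       # lumped transfer coefficients, 1/h (assumed)
    K_pw = 50.0                 # polymer-water partition coefficient (assumed)

    def rates(t, y):
        c_soil, c_water, c_poly = y
        soil_to_water = k_sw * c_soil                     # desorption from soil
        water_to_poly = k_wp * (c_water - c_poly / K_pw)  # uptake by polymer beads
        return [-soil_to_water, soil_to_water - water_to_poly, water_to_poly]

    sol = solve_ivp(rates, (0, 24), [100.0, 0.0, 0.0], dense_output=True)
    removal = 1 - (sol.y[0, -1] + sol.y[1, -1]) / 100.0
    print(f"fraction extracted into polymer after 24 h: {removal:.2f}")
    ```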

  9. Factors impacting hunter access to private lands in southeast Minnesota

    USGS Publications Warehouse

    Walberg, Eric; Cornicelli, Louis; Fulton, David C.

    2018-01-01

    White-tailed deer (Odocoileus virginianus) have important socioeconomic and ecological impacts in the United States. Hunting is considered to be important for the effective management of deer and relies on access to privately owned lands. In 2013, we surveyed nonindustrial private landowners in southeast Minnesota and created two logit models to examine factors that impact landowners' decisions to (a) allow public hunting access and (b) post private property. Parcel characteristics were found to impact landowner decisions to allow hunting access, particularly the size of the property and whether it was posted. Hunting access to small properties was more likely to be restricted to family, friends, and neighbors (83%) compared to medium (74%) or large properties (60%). Hunter concerns (e.g., liability) and knowledge about deer management were significant in both models, suggesting there are opportunities to educate landowners about the importance of allowing public hunting access and about available liability protections.

  10. The difference between LSMC and replicating portfolio in insurance liability modeling.

    PubMed

    Pelsser, Antoon; Schweizer, Janina

    2016-01-01

    Solvency II requires insurers to calculate the 1-year value at risk of their balance sheet. This involves the valuation of the balance sheet in 1 year's time. As closed-form solutions for the value of insurance liabilities are generally not available, insurers turn to estimation procedures. While pure Monte Carlo simulation set-ups are theoretically sound, they are often infeasible in practice. Therefore, approximation methods are exploited. Among these, least squares Monte Carlo (LSMC) and portfolio replication are prominent and widely applied in practice. In this paper, we show that, while both are variants of regression-based Monte Carlo methods, they differ in one significant aspect: while the replicating portfolio approach only contains an approximation error, which converges to zero in the limit, LSMC additionally contains a projection error, which cannot be eliminated. It is revealed that the replicating portfolio technique enjoys numerous advantages and is therefore an attractive model choice.
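
    To make the "regression-based Monte Carlo" point concrete, here is a toy LSMC-style sketch: noisy discounted liability cash flows are regressed on the 1-year risk factor using a small polynomial basis. The payoff, basis and figures are hypothetical, and this is not the paper's set-up.

    ```python
    # Toy regression-based Monte Carlo (LSMC-style) sketch: the 1-year value of a
    # liability is approximated by regressing simulated discounted cash flows on
    # the 1-year risk factor. All quantities are hypothetical.
    import numpy as np

    rng = np.random.default_rng(4)
    n_outer, r = 5000, 0.01

    s1 = rng.lognormal(0.0, 0.2, n_outer)     # risk factor at t = 1 (outer scenarios)
    # One noisy inner simulation per outer scenario (the LSMC ingredient)
    inner = rng.lognormal(0.0, 0.2, n_outer)
    payoff = np.maximum(1.5 - s1 * inner, 0.0)
    y = np.exp(-r) * payoff                   # noisy discounted liability cash flow

    # Regress on a small polynomial basis of the outer scenario (projection step)
    basis = np.vander(s1, 4)                  # columns [s1^3, s1^2, s1, 1]
    coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
    value_at_1y = basis @ coef                # fitted liability value per scenario
    print(np.quantile(value_at_1y, 0.995))    # e.g. a 1-year VaR-style quantile
    ```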

  11. Indexing molecules for their hERG liability.

    PubMed

    Rayan, Anwar; Falah, Mizied; Raiyn, Jamal; Da'adoosh, Beny; Kadan, Sleman; Zaid, Hilal; Goldblum, Amiram

    2013-07-01

    The human Ether-a-go-go-Related Gene (hERG) potassium (K(+)) channel is liable to drug-induced blockage that prolongs the QT interval of the cardiac action potential, triggers arrhythmia and possibly causes sudden cardiac death. Early prediction of drug liability to the hERG K(+) channel is therefore highly important and preferably obligatory at earlier stages of any drug discovery process. In vitro assessment of drug binding affinity to the hERG K(+) channel involves substantial expense, time, and labor; computational models for predicting liabilities of drug candidates for hERG toxicity are therefore of much importance. In the present study, we apply the Iterative Stochastic Elimination (ISE) algorithm to construct a large number of rule-based models (filters) and exploit their combination for developing the concept of a hERG Toxicity Index (ETI). ETI estimates the molecular risk of being a blocker of the hERG potassium channel. The area under the curve (AUC) of the attained model is 0.94. The averaged ETI of hERG binders, drugs from CMC, clinical-MDDR, endogenous molecules, ACD and ZINC were found to be 9.17, 2.53, 3.3, -1.98, -2.49 and -3.86, respectively. Applying the proposed hERG Toxicity Index Model on an external test set composed of more than 1300 hERG blockers picked from ChEMBL shows excellent performance (Matthews Correlation Coefficient of 0.89). The proposed strategy could be implemented for the evaluation of chemicals in the hit/lead optimization stages of the drug discovery process, improve the selection of drug candidates and support the development of safe pharmaceutical products. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
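
    Schematic illustration only (synthetic labels and filters, not the ISE algorithm or the published ETI): an index built from many binary rule-based filters can be scored with ROC AUC and a Matthews correlation coefficient, as in the evaluation described above.

    ```python
    # Schematic sketch: combine many binary "rule-based filters" into an index and
    # score it. The filters and data are synthetic; this is not the published model.
    import numpy as np
    from sklearn.metrics import roc_auc_score, matthews_corrcoef

    rng = np.random.default_rng(5)
    n_mol, n_filters = 1000, 40
    is_blocker = rng.random(n_mol) < 0.3          # hypothetical hERG labels

    # Each filter flags a molecule with higher probability when it is a blocker
    p_hit = np.where(is_blocker[:, None], 0.65, 0.35)
    votes = rng.random((n_mol, n_filters)) < p_hit

    # Index: blocker-like votes minus non-blocker-like votes
    index = votes.sum(axis=1) - (~votes).sum(axis=1)
    print("AUC:", round(roc_auc_score(is_blocker, index), 3))
    print("MCC at index > 0:", round(matthews_corrcoef(is_blocker, index > 0), 3))
    ```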

  12. Comorbidity of depression with levels of smoking: an exploration of the shared familial risk hypothesis.

    PubMed

    Johnson, Eric O; Rhee, Soo Hyun; Chase, Gary A; Breslau, Naomi

    2004-12-01

    Comorbidity of depression and smoking is well recognized, but results from studies that have assessed alternative explanations have varied by the level of smoking and the study method. We examined all 13 etiology models of comorbidity described by Neale and Kendler (American Journal of Human Genetics, 57, 935-953, 1995) for depression and each of four levels of smoking to shed light on the role that differing definitions might have played in generating the conflicting findings. Data came from 979 young adults aged 26-35 years who participated in an epidemiological cohort study in southeastern Michigan. Respondent and family history data on parental smoking and depression were analyzed using the biometric modeling method for family data, which Rhee and colleagues (Journal of Child Psychology and Psychiatry and Allied Disciplines, 44, 612-636, 2003; Behavior Genetics, 34, 251-265, 2004) have shown to be valid more frequently than traditional prevalence analyses. Results of the biometric model fitting suggested that for ever smoking, the comorbidity with depression may be related to chance or to a high liability threshold for smoking only. In contrast, a correlated liabilities model fit the data best for the comorbidity of depression with daily, heavy, and nicotine-dependent smoking. The familial correlations accounted for 73%-95% of the total variance shared between depression and these levels of smoking. These results differ from analyses of these data using a traditional prevalence approach, which found no evidence of shared familial liability. The conflicting findings of the studies that have examined the relationship between smoking and depression may be attributable to differences in the definition of the disorders and the methods used to analyze them.

  13. Materials Related Forensic Analysis and Special Testing : Drying Shrinkage Evaluation of Bridge Decks with Class AAA and Class W/WD Type K Cement

    DOT National Transportation Integrated Search

    2001-07-01

    This work pertains to preparation of concrete drying shrinkage data for proposed concrete mixtures during normal concrete : trial batch verification. Selected concrete mixtures will include PennDOT Classes AAA and AA and will also include the use of ...

  14. Survival and growth of trees and shrubs on different lignite minesoils in Louisiana

    Treesearch

    James D. Haywood; Allan E. Tiarks; James P. Barnett

    1993-01-01

    In 1980, an experimental opencast lignite mine was developed to compare redistributed A horizon with three minesoil mixtures as growth media for woody plants. The three minesoil mixtures contained different amounts and types of overburden materials, and normal reclamation practices were followed. Loblolly pine (Pinus taeda, L.), sawtooth oak (

  15. Identifiability in N-mixture models: a large-scale screening test with bird data.

    PubMed

    Kéry, Marc

    2018-02-01

    Binomial N-mixture models have proven very useful in ecology, conservation, and monitoring: they allow estimation and modeling of abundance separately from detection probability using simple counts. Recently, doubts about parameter identifiability have been voiced. I conducted a large-scale screening test with 137 bird data sets from 2,037 sites. I found virtually no identifiability problems for Poisson and zero-inflated Poisson (ZIP) binomial N-mixture models, but negative-binomial (NB) models had problems in 25% of all data sets. The corresponding multinomial N-mixture models had no problems. Parameter estimates under Poisson and ZIP binomial and multinomial N-mixture models were extremely similar. Identifiability problems became a little more frequent with smaller sample sizes (267 and 50 sites), but were unaffected by whether the models did or did not include covariates. Hence, binomial N-mixture model parameters with Poisson and ZIP mixtures typically appeared identifiable. In contrast, NB mixtures were often unidentifiable, which is worrying since these were often selected by Akaike's information criterion. Identifiability of binomial N-mixture models should always be checked. If problems are found, simpler models, integrated models that combine different observation models, or the use of external information via informative priors or penalized likelihoods may help. © 2017 by the Ecological Society of America.
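
    For readers unfamiliar with the model class, the sketch below writes out the binomial N-mixture likelihood with a Poisson abundance mixture on simulated counts (not the bird data), marginalizing the latent abundance up to a cutoff; the cutoff K and the simulated parameter values are assumptions.

    ```python
    # Minimal binomial N-mixture likelihood with a Poisson abundance mixture:
    # abundance N_i ~ Poisson(lambda), counts y_ij ~ Binomial(N_i, p), with the
    # latent N summed out up to a cutoff K. Data are simulated.
    import numpy as np
    from scipy.stats import poisson, binom
    from scipy.optimize import minimize

    rng = np.random.default_rng(6)
    n_sites, n_visits, lam_true, p_true = 200, 3, 4.0, 0.5
    N = rng.poisson(lam_true, n_sites)
    y = rng.binomial(N[:, None], p_true, (n_sites, n_visits))

    def neg_loglik(params, y, K=50):
        lam, p = np.exp(params[0]), 1 / (1 + np.exp(-params[1]))
        Ns = np.arange(K + 1)
        prior = poisson.pmf(Ns, lam)                          # P(N = n)
        # P(y_i | N = n) for every candidate abundance, then marginalize over N
        lik = np.array([binom.pmf(y, n, p).prod(axis=1) for n in Ns])  # (K+1, sites)
        site_lik = prior @ lik
        return -np.sum(np.log(site_lik + 1e-300))

    res = minimize(neg_loglik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
    print("lambda, p:", np.exp(res.x[0]), 1 / (1 + np.exp(-res.x[1])))
    ```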

  16. Liability Insurance: A Primer for College and University Counsel.

    ERIC Educational Resources Information Center

    Ende, Howard; Anderson, Eugene R.; Crego, Susannah

    1997-01-01

    Because of the rise in litigation involving colleges and universities, basic information about liability insurance is provided. Administrators are warned that previously purchased liability insurance may not cover damages and losses incurred today, and that insurance companies often benefit from extended litigation. College counsel must understand…

  17. 20 CFR 416.2140 - Liability for erroneous Medicaid eligibility determinations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Liability for erroneous Medicaid eligibility... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Medicaid Eligibility Determinations § 416.2140 Liability for erroneous Medicaid eligibility determinations. If the State suffers any financial loss, directly...

  18. 14 CFR 1274.916 - Liability and risk of loss.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 5 2011-01-01 2010-01-01 true Liability and risk of loss. 1274.916 Section 1274.916 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION COOPERATIVE AGREEMENTS..., or indemnification of, developers of experimental aerospace vehicles. Liability and Risk of Loss July...

  19. 14 CFR Sec. 2-5 - Revenue and accounting practices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... date of all subaccounts supporting the Air Traffic Liability control account; the subsidiary trial balance must agree with the Air Traffic Liability control account or a reconciliation statement furnished... ledger. If the sales listing is not in agreement with the Air Traffic Liability control account, the...

  20. 47 CFR 32.3999 - Instructions for balance sheet accounts-liabilities and stockholders' equity.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Instructions for balance sheet accounts-liabilities and stockholders' equity. 32.3999 Section 32.3999 Telecommunication FEDERAL COMMUNICATIONS... Instructions for Balance Sheet Accounts § 32.3999 Instructions for balance sheet accounts—liabilities and...

  1. 47 CFR 32.3999 - Instructions for balance sheet accounts-liabilities and stockholders' equity.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Instructions for balance sheet accounts-liabilities and stockholders' equity. 32.3999 Section 32.3999 Telecommunication FEDERAL COMMUNICATIONS... Instructions for Balance Sheet Accounts § 32.3999 Instructions for balance sheet accounts—liabilities and...

  2. Warning! Slippery Road Ahead: Internet Access and District Liability.

    ERIC Educational Resources Information Center

    Mazur, Joan M.

    1995-01-01

    As schools merge onto the information highway, districts must address their liability associated with Internet access. Schools need a practical policy supporting high access to global educational resources while limiting district liability. USENET provides easy access to controversial and pornographic materials. This article outlines federal…

  3. Personal Malpractice Liability of Reference Librarians and Information Brokers.

    ERIC Educational Resources Information Center

    Gray, John A.

    1988-01-01

    Reviews common law contract and tort bases for malpractice liability and their applicability to reference librarians, special librarians, and information brokers. The discussion covers the legal bases for professional malpractice liability, the librarian-patron relationship, the likelihood of lawsuits, and the need for personal liability…

  4. 42 CFR 438.106 - Liability for payment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Liability for payment. 438.106 Section 438.106 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Enrollee Rights and Protections § 438.106 Liability for...

  5. 48 CFR 3028.311 - Solicitation provision and contract clause on liability insurance under cost-reimbursement...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Solicitation provision and contract clause on liability insurance under cost-reimbursement contracts. 3028.311 Section 3028.311... contract clause on liability insurance under cost-reimbursement contracts. ...

  6. A Gaussian Mixture Model Representation of Endmember Variability in Hyperspectral Unmixing

    NASA Astrophysics Data System (ADS)

    Zhou, Yuan; Rangarajan, Anand; Gader, Paul D.

    2018-05-01

    Hyperspectral unmixing while considering endmember variability is usually performed by the normal compositional model (NCM), where the endmembers for each pixel are assumed to be sampled from unimodal Gaussian distributions. However, in real applications, the distribution of a material is often not Gaussian. In this paper, we use Gaussian mixture models (GMM) to represent the endmember variability. We show, given the GMM starting premise, that the distribution of the mixed pixel (under the linear mixing model) is also a GMM (and this is shown from two perspectives). The first perspective originates from the random variable transformation and gives a conditional density function of the pixels given the abundances and GMM parameters. With proper smoothness and sparsity prior constraints on the abundances, the conditional density function leads to a standard maximum a posteriori (MAP) problem which can be solved using generalized expectation maximization. The second perspective originates from marginalizing over the endmembers in the GMM, which provides us with a foundation to solve for the endmembers at each pixel. Hence, our model can not only estimate the abundances and distribution parameters, but also the distinct endmember set for each pixel. We tested the proposed GMM on several synthetic and real datasets, and showed its potential by comparing it to current popular methods.
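
    The central observation, that a linear mixture of Gaussian-mixture endmembers is again a Gaussian mixture, can be written down directly; the small sketch below enumerates the combined components for two hypothetical one-dimensional endmember GMMs (illustration only, not the paper's estimation algorithm).

    ```python
    # Sketch of the key fact: if each endmember follows a GMM, a linear mixture of
    # independent endmembers is again a GMM whose components enumerate all
    # combinations of endmember components. Endmember GMMs here are hypothetical.
    import itertools
    import numpy as np

    # Two endmembers, each a 1-D GMM given as (weights, means, variances)
    endmembers = [
        (np.array([0.6, 0.4]), np.array([0.2, 0.5]), np.array([0.01, 0.02])),
        (np.array([0.7, 0.3]), np.array([0.8, 1.0]), np.array([0.03, 0.01])),
    ]
    abund = np.array([0.3, 0.7])              # abundances, sum to one

    mix_w, mix_mu, mix_var = [], [], []
    for combo in itertools.product(*[range(len(w)) for w, _, _ in endmembers]):
        w = np.prod([endmembers[j][0][k] for j, k in enumerate(combo)])
        mu = np.sum([abund[j] * endmembers[j][1][k] for j, k in enumerate(combo)])
        var = np.sum([abund[j] ** 2 * endmembers[j][2][k] for j, k in enumerate(combo)])
        mix_w.append(w); mix_mu.append(mu); mix_var.append(var)

    print(np.round(mix_w, 3), np.round(mix_mu, 3), np.round(mix_var, 4))
    ```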

  7. [Liability for medical malpractice: an economic approach].

    PubMed

    Carles, M

    2003-01-01

    In recent years, changes in the organization of healthcare institutions and the increased number of medical malpractice claims have revealed the need to study the concept of medical responsibility and the repercussion of these changes on healthcare provision. To date, discussion has focussed on legal aspects, and economic implications have been largely ignored. The present article reviews studies that have performed an economic analysis of the subject. Firstly, we examine studies that gradually introduce the concepts of uncertainty, risk aversion and moral hazard. Secondly, in the healthcare environment, we pay particular attention to models that include new arguments on professionals' objective duties or to bargaining models when there is asymmetric information. Finally, we consider the medical malpractice insurance market and investigate how reputation and the possibility of exercising defensive medicine influence healthcare provision. Our analysis suggests that, due to the characteristics of the healthcare market, the models proposed by the economics of information are very useful for performing economic analyses of liability in medical malpractice. However, alternative hypotheses also need to be formulated so that these models can be adapted to the specific characteristics of different health systems.

  8. Modeling abundance using multinomial N-mixture models

    USGS Publications Warehouse

    Royle, Andy

    2016-01-01

    Multinomial N-mixture models are a generalization of the binomial N-mixture models described in Chapter 6 that allow for more complex and informative sampling protocols beyond simple counts. Many commonly used protocols such as multiple observer sampling, removal sampling, and capture-recapture produce a multivariate count frequency that has a multinomial distribution and for which multinomial N-mixture models can be developed. Such protocols typically result in more precise estimates than binomial mixture models because they provide direct information about parameters of the observation process. We demonstrate the analysis of these models in BUGS using several distinct formulations that afford great flexibility in the types of models that can be developed, and we demonstrate likelihood analysis using the unmarked package. Spatially stratified capture-recapture models are one class of models that fall into the multinomial N-mixture framework, and we discuss analysis of stratified versions of classical models such as Mb and Mh, as well as other classes of models that are only possible to describe within the multinomial N-mixture framework.
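
    As a concrete, simplified instance of a multinomial N-mixture protocol, the sketch below simulates three-pass removal sampling and maximizes the corresponding likelihood, marginalizing the Poisson abundance up to a cutoff. The data, cutoff and parameter values are hypothetical; the chapter itself works in BUGS and the unmarked package.

    ```python
    # Toy multinomial N-mixture sketch for removal sampling (simulated data):
    # N_i ~ Poisson(lambda); on each of three passes a fraction p of the remaining
    # animals at a site is removed.
    import numpy as np
    from scipy.stats import poisson
    from scipy.special import gammaln
    from scipy.optimize import minimize

    rng = np.random.default_rng(7)
    n_sites, lam_true, p_true = 150, 6.0, 0.4
    N = rng.poisson(lam_true, n_sites)
    y = np.zeros((n_sites, 3), dtype=int)
    remaining = N.copy()
    for j in range(3):                                 # removal passes
        y[:, j] = rng.binomial(remaining, p_true)
        remaining -= y[:, j]

    def neg_loglik(params, y, K=80):
        lam, p = np.exp(params[0]), 1 / (1 + np.exp(-params[1]))
        # Cell probabilities for the 3 removal classes; the rest are never caught
        pi = np.array([p, (1 - p) * p, (1 - p) ** 2 * p])
        Ns = np.arange(K + 1)
        prior = poisson.pmf(Ns, lam)
        caught = y.sum(axis=1)
        ll = np.zeros(len(y))
        for i in range(len(y)):
            n0 = Ns - caught[i]                        # never-caught animals if N = Ns
            valid = n0 >= 0
            logmult = (gammaln(Ns[valid] + 1) - gammaln(n0[valid] + 1)
                       - gammaln(y[i] + 1).sum()
                       + (y[i] * np.log(pi)).sum()
                       + n0[valid] * np.log(1 - pi.sum()))
            ll[i] = np.log((prior[valid] * np.exp(logmult)).sum() + 1e-300)
        return -ll.sum()

    res = minimize(neg_loglik, x0=[1.0, 0.0], args=(y,), method="Nelder-Mead")
    print("lambda, p:", np.exp(res.x[0]), 1 / (1 + np.exp(-res.x[1])))
    ```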

  9. Predicting the emetic liability of novel chemical entities: a comparative study.

    PubMed

    du Sert, Nathalie Percie; Holmes, Anthony M; Wallis, Rob; Andrews, Paul Lr

    2012-03-01

    Emesis is a multi-system reflex, which is usually investigated using in vivo models. The aim of the study is to compare the response induced by emetic compounds across species and to investigate whether dogs, ferrets and rats are all similarly predictive of humans. A systematic review was carried out and relevant publications were identified from PubMed. The search was restricted to four species (human, dog, ferret, rat) and ten compounds representative of various mechanisms of emesis induction (apomorphine, cisplatin, cholecystokinin octapeptide, copper sulphate, cyclophosphamide, ipecacuanha, lithium chloride, morphine, nicotine, rolipram). 1046 publications were reviewed and 311 were included; the main reason for exclusion was the lack of quantitative data. Emetic or pica data were extracted as incidence, intensity or latency. All three animal species identified emetic liability, but interspecies differences in dose sensitivity were detected. These results suggest that emetic liability can be reliably identified in a common laboratory species such as the rat. However, to evaluate the characteristics of the emetic response, no animal species is a universal predictor of emetic liability and the choice of species should be an informed decision based on the type of compound investigated. Limitations relating to the conduct and reporting of emesis studies were identified, the main ones being the lack of comparable outcome measures between human and animal data and the limited availability of human data in the public domain. © 2011 The Authors. British Journal of Pharmacology © 2011 The British Pharmacological Society.

  10. Baseline Correction of Diffuse Reflection Near-Infrared Spectra Using Searching Region Standard Normal Variate (SRSNV).

    PubMed

    Genkawa, Takuma; Shinzawa, Hideyuki; Kato, Hideaki; Ishikawa, Daitaro; Murayama, Kodai; Komiyama, Makoto; Ozaki, Yukihiro

    2015-12-01

    An alternative baseline correction method for diffuse reflection near-infrared (NIR) spectra, searching region standard normal variate (SRSNV), was proposed. Standard normal variate (SNV) is an effective pretreatment method for baseline correction of diffuse reflection NIR spectra of powder and granular samples; however, its baseline correction performance depends on the NIR region used for SNV calculation. To search for an optimal NIR region for baseline correction using SNV, SRSNV employs moving window partial least squares regression (MWPLSR), and an optimal NIR region is identified based on the root mean square error (RMSE) of cross-validation of the partial least squares regression (PLSR) models with the first latent variable (LV). The performance of SRSNV was evaluated using diffuse reflection NIR spectra of mixture samples consisting of wheat flour and granular glucose (0-100% glucose at 5% intervals). From the obtained NIR spectra of the mixture in the 10 000-4000 cm(-1) region at 4 cm(-1) intervals (1501 spectral channels), a series of spectral windows consisting of 80 spectral channels was constructed, and then SNV spectra were calculated for each spectral window. Using these SNV spectra, a series of PLSR models with the first LV for glucose concentration was built. A plot of RMSE versus the spectral window position obtained using the PLSR models revealed that the 8680-8364 cm(-1) region was optimal for baseline correction using SNV. In the SNV spectra calculated using the 8680-8364 cm(-1) region (SRSNV spectra), a remarkable relative intensity change between a band due to wheat flour at 8500 cm(-1) and that due to glucose at 8364 cm(-1) was observed owing to successful baseline correction using SNV. A PLSR model with the first LV based on the SRSNV spectra yielded a determination coefficient (R2) of 0.999 and an RMSE of 0.70%, while a PLSR model with three LVs based on SNV spectra calculated in the full spectral region gave an R2 of 0.995 and an RMSE of 2.29%. Additional evaluation of SRSNV was carried out using diffuse reflection NIR spectra of marzipan and corn samples, and PLSR models based on SRSNV spectra showed good prediction results. These evaluation results indicate that SRSNV is effective for baseline correction of diffuse reflection NIR spectra and provides regression models with good prediction accuracy.
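
    The following Python sketch imitates the searching-region idea on synthetic spectra (not the published implementation or data): SNV is applied within each moving window, a one-latent-variable PLS model is cross-validated per window, and the window with the lowest RMSECV is kept.

    ```python
    # Hedged sketch of the SRSNV idea on synthetic spectra: SNV inside each moving
    # window, a 1-LV PLS model per window, keep the window with the lowest RMSECV.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(8)
    n_samples, n_channels, window = 60, 400, 80
    conc = rng.uniform(0, 100, n_samples)                   # glucose %, hypothetical
    baseline = rng.normal(0, 1, (n_samples, 1)) * np.linspace(1, 2, n_channels)
    peak = np.exp(-0.5 * ((np.arange(n_channels) - 250) / 8) ** 2)
    spectra = baseline + conc[:, None] * peak + rng.normal(0, 0.05, (n_samples, n_channels))

    def snv(block):
        return (block - block.mean(axis=1, keepdims=True)) / block.std(axis=1, keepdims=True)

    best = (np.inf, None)
    for start in range(0, n_channels - window, 10):         # moving window
        X = snv(spectra[:, start:start + window])           # SNV inside the window
        pred = cross_val_predict(PLSRegression(n_components=1), X, conc, cv=7)
        rmse = np.sqrt(np.mean((pred.ravel() - conc) ** 2))
        if rmse < best[0]:
            best = (rmse, start)

    print(f"best window starts at channel {best[1]}, RMSECV = {best[0]:.2f}")
    ```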

  11. 26 CFR 1.401-2 - Impossibility of diversion under the trust instrument.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... thereafter before the satisfaction of all liabilities to employees or their beneficiaries covered by the... not solely designed for the proper satisfaction of all liabilities to employees or their beneficiaries... phrase “prior to the satisfaction of all liabilities with respect to employees and their beneficiaries...

  12. Understanding Tort Liability and Its Relationship to Extension Professionals.

    ERIC Educational Resources Information Center

    Long, Norman D.; And Others

    This study focuses on tort liability and its relationship to extension professionals working with 4-H programs. Tort liability as related to extension professionals consists of ten components: due care, physical defects (inspection of premises), instruction and supervision, first aid and medical treatment, foreseeability, causation, defamation,…

  13. 12 CFR 205.6 - Liability of consumer for unauthorized transfers.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... transfers. 205.6 Section 205.6 Banks and Banking FEDERAL RESERVE SYSTEM BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM ELECTRONIC FUND TRANSFERS (REGULATION E) § 205.6 Liability of consumer for unauthorized transfers. (a) Conditions for liability. A consumer may be held liable, within the limitations described in...

  14. 12 CFR 205.6 - Liability of consumer for unauthorized transfers.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... transfers. 205.6 Section 205.6 Banks and Banking FEDERAL RESERVE SYSTEM BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM ELECTRONIC FUND TRANSFERS (REGULATION E) § 205.6 Liability of consumer for unauthorized transfers. (a) Conditions for liability. A consumer may be held liable, within the limitations described in...

  15. 32 CFR 536.123 - Limitation of liability for maritime claims.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Limitation of liability for maritime claims. 536... AND ACCOUNTS CLAIMS AGAINST THE UNITED STATES Maritime Claims § 536.123 Limitation of liability for maritime claims. For admiralty claims arising within the United States under the provisions of the...

  16. HIV liability & disability services providers: an introduction to tort principles.

    PubMed

    Harvey, D C; Decker, C L

    1991-08-01

    Mental health and developmental disability services providers are concerned that liability issues regarding worker and client exposure to HIV have not been adequately addressed. By developing policy specifically in the areas of education, infection control practices, and confidentiality, providers may minimize findings of liability and protect patient rights.

  17. 18 CFR 367.2440 - Account 244, Derivative instrument liabilities.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Account 244, Derivative instrument liabilities. 367.2440 Section 367.2440 Conservation of Power and Water Resources FEDERAL ENERGY..., Derivative instrument liabilities. This account must include the change in the fair value of all derivative...

  18. 18 CFR 367.2440 - Account 244, Derivative instrument liabilities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Account 244, Derivative instrument liabilities. 367.2440 Section 367.2440 Conservation of Power and Water Resources FEDERAL ENERGY..., Derivative instrument liabilities. This account must include the change in the fair value of all derivative...

  19. 26 CFR 20.2206-1 - Liability of life insurance beneficiaries.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 14 2010-04-01 2010-04-01 false Liability of life insurance beneficiaries. 20.2206-1 Section 20.2206-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY... § 20.2206-1 Liability of life insurance beneficiaries. With respect to the right of the district...

  20. 26 CFR 20.2206-1 - Liability of life insurance beneficiaries.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 14 2011-04-01 2010-04-01 true Liability of life insurance beneficiaries. 20.2206-1 Section 20.2206-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY... § 20.2206-1 Liability of life insurance beneficiaries. With respect to the right of the district...

  1. 48 CFR 1852.228-81 - Insurance-Partial Immunity From Tort Liability.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Insurance-Partial Immunity... Provisions and Clauses 1852.228-81 Insurance—Partial Immunity From Tort Liability. As prescribed in 1828.311-270(c), insert the following clause: Insurance—Partial Immunity From Tort Liability (SEP 2000) (a...

  2. 48 CFR 1852.228-81 - Insurance-Partial Immunity From Tort Liability.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Insurance-Partial Immunity... Provisions and Clauses 1852.228-81 Insurance—Partial Immunity From Tort Liability. As prescribed in 1828.311-270(c), insert the following clause: Insurance—Partial Immunity From Tort Liability (SEP 2000) (a...

  3. 48 CFR 1852.228-81 - Insurance-Partial Immunity From Tort Liability.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Insurance-Partial Immunity... Provisions and Clauses 1852.228-81 Insurance—Partial Immunity From Tort Liability. As prescribed in 1828.311-270(c), insert the following clause: Insurance—Partial Immunity From Tort Liability (SEP 2000) (a...

  4. 48 CFR 1852.228-81 - Insurance-Partial Immunity From Tort Liability.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Insurance-Partial Immunity... Provisions and Clauses 1852.228-81 Insurance—Partial Immunity From Tort Liability. As prescribed in 1828.311-270(c), insert the following clause: Insurance—Partial Immunity From Tort Liability (SEP 2000) (a...

  5. 48 CFR 1852.228-81 - Insurance-Partial Immunity From Tort Liability.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Insurance-Partial Immunity... Provisions and Clauses 1852.228-81 Insurance—Partial Immunity From Tort Liability. As prescribed in 1828.311-270(c), insert the following clause: Insurance—Partial Immunity From Tort Liability (SEP 2000) (a...

  6. 77 FR 69677 - Self-Regulatory Organizations; NYSE Arca, Inc.; Order Granting Approval of Proposed Rule Changes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-20

    ... executive officer of such limited liability company, or (4) a general partner in an OTP Firm partnership...). \\7\\ ``OTP Firm'' means a sole proprietorship, partnership, corporation, limited liability company, or... Firms. \\13\\ An ``ETP Holder'' is a sole proprietorship, partnership, corporation, limited liability...

  7. 26 CFR 1.465-27 - Qualified nonrecourse financing.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... partnership; incidental property. (i) X is a limited liability company that is classified as a partnership for.... (i) UTP1 and UTP2, both limited liability companies classified as partnerships, are the only general... as qualified nonrecourse financing. (4) Partnership liability. For purposes of section 465(b)(6) and...

  8. 20 CFR 410.563 - Liability of a certifying officer.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false Liability of a certifying officer. 410.563 Section 410.563 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL COAL MINE HEALTH AND SAFETY ACT OF 1969, TITLE IV-BLACK LUNG BENEFITS (1969- ) Payment of Benefits § 410.563 Liability of a...

  9. 17 CFR 230.414 - Registration by certain successor issuers.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... succession the successor issuer had no assets or liabilities other than nominal assets or liabilities; (b) The succession was effected by a merger or similar succession pursuant to statutory provisions or the... all of the liabilities and obligations of the predecessor issuer; (c) The succession was approved by...

  10. 17 CFR 230.414 - Registration by certain successor issuers.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... succession the successor issuer had no assets or liabilities other than nominal assets or liabilities; (b) The succession was effected by a merger or similar succession pursuant to statutory provisions or the... all of the liabilities and obligations of the predecessor issuer; (c) The succession was approved by...

  11. 17 CFR 230.414 - Registration by certain successor issuers.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... succession the successor issuer had no assets or liabilities other than nominal assets or liabilities; (b) The succession was effected by a merger or similar succession pursuant to statutory provisions or the... all of the liabilities and obligations of the predecessor issuer; (c) The succession was approved by...

  12. 17 CFR 230.414 - Registration by certain successor issuers.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... succession the successor issuer had no assets or liabilities other than nominal assets or liabilities; (b) The succession was effected by a merger or similar succession pursuant to statutory provisions or the... all of the liabilities and obligations of the predecessor issuer; (c) The succession was approved by...

  13. 17 CFR 230.414 - Registration by certain successor issuers.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... succession the successor issuer had no assets or liabilities other than nominal assets or liabilities; (b) The succession was effected by a merger or similar succession pursuant to statutory provisions or the... all of the liabilities and obligations of the predecessor issuer; (c) The succession was approved by...

  14. 34 CFR 686.34 - Liability for and recovery of TEACH Grant overpayments.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION (CONTINUED) TEACHER EDUCATION ASSISTANCE FOR COLLEGE AND HIGHER EDUCATION (TEACH) GRANT PROGRAM Administration of Grant Payments § 686.34 Liability for... 34 Education 4 2011-07-01 2011-07-01 false Liability for and recovery of TEACH Grant overpayments...

  15. 34 CFR 686.34 - Liability for and recovery of TEACH Grant overpayments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TEACHER EDUCATION ASSISTANCE FOR COLLEGE AND HIGHER EDUCATION (TEACH) GRANT PROGRAM Administration of Grant Payments § 686.34 Liability for and... 34 Education 3 2010-07-01 2010-07-01 false Liability for and recovery of TEACH Grant overpayments...

  16. 7 CFR 1767.19 - Liabilities and other credits.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... furnish complete information concerning each note and open account. 224Other Long-Term Debt A. This... this account shall be kept in such a manner that the utility can furnish full information as to the... Accounts § 1767.19 Liabilities and other credits. The liabilities and other credit accounts identified in...

  17. 18 CFR 367.2420 - Account 242, Miscellaneous current and accrued liabilities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., FEDERAL POWER ACT AND NATURAL GAS ACT Balance Sheet Chart of Accounts Current and Accrued Liabilities... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Account 242, Miscellaneous current and accrued liabilities. 367.2420 Section 367.2420 Conservation of Power and Water...

  18. 31 CFR 315.56 - General instructions and liability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false General instructions and liability. 315.56 Section 315.56 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued... and, where one is available, a corporate stamp or issuing or paying agent's stamp. (b) Liability. The...

  19. 26 CFR 1.312-3 - Liabilities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 4 2011-04-01 2011-04-01 false Liabilities. 1.312-3 Section 1.312-3 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES Effects on Corporation § 1.312-3 Liabilities. The amount of any reductions in earnings and profits...

  20. 26 CFR 1.312-3 - Liabilities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 4 2010-04-01 2010-04-01 false Liabilities. 1.312-3 Section 1.312-3 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES Effects on Corporation § 1.312-3 Liabilities. The amount of any reductions in earnings and profits...

  1. Civil Liability of Schools, Teachers and Pupils for Careless Behaviour.

    ERIC Educational Resources Information Center

    Wenham, David

    1999-01-01

    Identifies elements that (British) courts consider in school or teacher negligence cases. Reviews significant case law establishing liability of schools and teachers for harm sustained by pupils and children's personal liability for careless acts leading to personal harm. Discusses implications of a recent child negligence case. (Contains 14…

  2. 78 FR 49242 - Relief From Joint and Several Liability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-13

    ... Relief From Joint and Several Liability AGENCY: Internal Revenue Service (IRS), Treasury. ACTION: Notice... joint and several tax liability under section 6015 of the Internal Revenue Code (Code) and relief from... are husband and wife to file a joint Federal income tax return. Married individuals who choose to file...

  3. 7 CFR 1717.151 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., liabilities, franchises and powers of those passing out of existence; (2) A merger where one company is... its own identity and acquiring the assets, liabilities, franchises and powers of the former; or (3) A... entirety the assets, liabilities, franchises, and powers of the transferor. New loan means a loan to a...

  4. 7 CFR 1717.151 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., liabilities, franchises and powers of those passing out of existence; (2) A merger where one company is... its own identity and acquiring the assets, liabilities, franchises and powers of the former; or (3) A... entirety the assets, liabilities, franchises, and powers of the transferor. New loan means a loan to a...

  5. 7 CFR 1717.151 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., liabilities, franchises and powers of those passing out of existence; (2) A merger where one company is... its own identity and acquiring the assets, liabilities, franchises and powers of the former; or (3) A... entirety the assets, liabilities, franchises, and powers of the transferor. New loan means a loan to a...

  6. 7 CFR 1717.151 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., liabilities, franchises and powers of those passing out of existence; (2) A merger where one company is... its own identity and acquiring the assets, liabilities, franchises and powers of the former; or (3) A... entirety the assets, liabilities, franchises, and powers of the transferor. New loan means a loan to a...

  7. 7 CFR 1717.151 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., liabilities, franchises and powers of those passing out of existence; (2) A merger where one company is... its own identity and acquiring the assets, liabilities, franchises and powers of the former; or (3) A... entirety the assets, liabilities, franchises, and powers of the transferor. New loan means a loan to a...

  8. Liability Insurance in California Public Schools.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento.

    In the mid-1970s, an increased number of high-cost liability lawsuits combined with other financial difficulties insurance companies were experiencing to cause drastic increases in insurance rates for schools and a reluctance on the part of insurance carriers to provide liability coverage. Questionnaires returned by county and district school…

  9. 31 CFR 321.15 - Liability for losses.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Liability for losses. 321.15 Section... INSTITUTIONS OF UNITED STATES SAVINGS BONDS AND UNITED STATES SAVINGS NOTES (FREEDOM SHARES) Losses Resulting From Erroneous Payments § 321.15 Liability for losses. Under the governing statute, as amended (31 U.S...

  10. Liability.

    ERIC Educational Resources Information Center

    Hollander, Patricia A.

    Tort liability covers most injurious, civil, wrongful acts that occur between individuals. For tort liability to exist, four elements must be present: a duty to use due care, a breach of that duty, a direct causal relationship between the conduct complained of and the injury suffered, and proof of actual injury. Recent court cases involving tort…

  11. 26 CFR 1.357-2 - Liabilities in excess of basis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... exchange as to which under section 357(b) (relating to assumption of liabilities for tax-avoidance purposes... 1.357-2 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES Effects on Shareholders and Security Holders § 1.357-2 Liabilities in excess of...

  12. 76 FR 69320 - Agency Request for Reinstatement of a Previously Approved Information Collection(s): Aircraft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-08

    ... Previously Approved Information Collection(s): Aircraft Accident Liability Insurance AGENCY: Office of the...: Aircraft Accident Liability Insurance. Form Numbers: OST Forms 6410 and 6411. Type of Review: Reinstatement... air carrier accident liability insurance to protect the public from losses. This insurance information...

  13. 26 CFR 20.2206-1 - Liability of life insurance beneficiaries.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 14 2014-04-01 2013-04-01 true Liability of life insurance beneficiaries. 20.2206-1 Section 20.2206-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY... § 20.2206-1 Liability of life insurance beneficiaries. With respect to the right of the district...

  14. 26 CFR 20.2206-1 - Liability of life insurance beneficiaries.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 26 Internal Revenue 14 2013-04-01 2013-04-01 false Liability of life insurance beneficiaries. 20.2206-1 Section 20.2206-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY... § 20.2206-1 Liability of life insurance beneficiaries. With respect to the right of the district...

  15. 26 CFR 20.2206-1 - Liability of life insurance beneficiaries.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 14 2012-04-01 2012-04-01 false Liability of life insurance beneficiaries. 20.2206-1 Section 20.2206-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY... § 20.2206-1 Liability of life insurance beneficiaries. With respect to the right of the district...

  16. Colleges' New Liabilities: An Emerging New In Loco Parentis.

    ERIC Educational Resources Information Center

    Gibbs, Annette; Szablewicz, James J.

    1988-01-01

    Describes and documents the changing legal theories about the college-student relationship currently used by the courts. Notes that the most recent legal actions focus on contract law, landowner liability, guest and host, and negligence. Looks specifically at cases involving liability for sexual attacks on students and for alcohol-related…

  17. 49 CFR 1152.29 - Prospective use of rights-of-way for interim trail use and rail banking.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... liability arising out of the use of the right-of-way (unless the user is immune from liability, in which... arising out of the transfer or use of (unless the user is immune from liability, in which case it need...

  18. 48 CFR 1812.301 - Solicitation provisions and contract clauses for the acquisition of commercial items. (NASA...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...-Waiver of Liability for Science or Space Exploration Activities unrelated to the International Space....301 Section 1812.301 Federal Acquisition Regulations System NATIONAL AERONAUTICS AND SPACE..., Cross-Waiver of Liability for Space Shuttle Services. (L) 1852.228-76, Cross-Waiver of Liability for...

  19. 33 CFR 153.405 - Liability to the pollution fund.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Liability to the pollution fund... (CONTINUED) POLLUTION CONTROL OF POLLUTION BY OIL AND HAZARDOUS SUBSTANCES, DISCHARGE REMOVAL Administration of the Pollution Fund § 153.405 Liability to the pollution fund. The owner or operator of the vessel...

  20. 33 CFR 153.405 - Liability to the pollution fund.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Liability to the pollution fund... (CONTINUED) POLLUTION CONTROL OF POLLUTION BY OIL AND HAZARDOUS SUBSTANCES, DISCHARGE REMOVAL Administration of the Pollution Fund § 153.405 Liability to the pollution fund. The owner or operator of the vessel...

  1. 42 CFR 422.132 - Protection against liability and loss of benefits.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... AND HUMAN SERVICES (CONTINUED) MEDICARE PROGRAM MEDICARE ADVANTAGE PROGRAM Benefits and Beneficiary Protections § 422.132 Protection against liability and loss of benefits. Enrollees of MA organizations are... 42 Public Health 3 2010-10-01 2010-10-01 false Protection against liability and loss of benefits...

  2. Grounding the Management of Liabilities in the Risk Analysis Framework

    ERIC Educational Resources Information Center

    Phillips, Peter W. B.; Smyth, Stuart

    2007-01-01

    Discussions of socioeconomic liability and compensation must necessarily start from an understanding of the socioeconomic, legal, and scientific basis for identifying, assessing, managing, and apportioning blame for hazards related to innovations. Public discussions about the nature of the liability challenge related to genetically modified (GM)…

  3. 33 CFR 153.405 - Liability to the pollution fund.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Liability to the pollution fund... (CONTINUED) POLLUTION CONTROL OF POLLUTION BY OIL AND HAZARDOUS SUBSTANCES, DISCHARGE REMOVAL Administration of the Pollution Fund § 153.405 Liability to the pollution fund. The owner or operator of the vessel...

  4. 33 CFR 153.405 - Liability to the pollution fund.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Liability to the pollution fund... (CONTINUED) POLLUTION CONTROL OF POLLUTION BY OIL AND HAZARDOUS SUBSTANCES, DISCHARGE REMOVAL Administration of the Pollution Fund § 153.405 Liability to the pollution fund. The owner or operator of the vessel...

  5. 33 CFR 153.405 - Liability to the pollution fund.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Liability to the pollution fund... (CONTINUED) POLLUTION CONTROL OF POLLUTION BY OIL AND HAZARDOUS SUBSTANCES, DISCHARGE REMOVAL Administration of the Pollution Fund § 153.405 Liability to the pollution fund. The owner or operator of the vessel...

  6. 12 CFR 205.6 - Liability of consumer for unauthorized transfers.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... transfers. 205.6 Section 205.6 Banks and Banking FEDERAL RESERVE SYSTEM BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM ELECTRONIC FUND TRANSFERS (REGULATION E) § 205.6 Liability of consumer for unauthorized transfers. (a) Conditions for liability. A consumer may be held liable, within the limitations described in...

  7. 12 CFR 205.6 - Liability of consumer for unauthorized transfers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... transfers. 205.6 Section 205.6 Banks and Banking FEDERAL RESERVE SYSTEM BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM ELECTRONIC FUND TRANSFERS (REGULATION E) § 205.6 Liability of consumer for unauthorized transfers. (a) Conditions for liability. A consumer may be held liable, within the limitations described in...

  8. 12 CFR 205.6 - Liability of consumer for unauthorized transfers.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... transfers. 205.6 Section 205.6 Banks and Banking FEDERAL RESERVE SYSTEM BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM ELECTRONIC FUND TRANSFERS (REGULATION E) § 205.6 Liability of consumer for unauthorized transfers. (a) Conditions for liability. A consumer may be held liable, within the limitations described in...

  9. 27 CFR 31.234 - Liability for special (occupational) tax.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... special (occupational) tax in accordance with the laws and regulations in effect at that time. The tax... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Liability for special... Liability for special (occupational) tax. The special (occupational) tax on alcohol beverage dealers was...

  10. Overcoming Legal Liability Concerns for School-Based Physical Activity Promotion

    PubMed Central

    Zimmerman, Sara; Kramer, Karen

    2013-01-01

    Schools have been identified as a priority environment for physical activity promotion as a component of efforts to help prevent childhood obesity. A variety of school-based environmental and programmatic strategies have been proven effective in promoting physical activity both on-site and in the surrounding community. However, many schools are deterred by fears of increased risk of legal liability for personal injuries. We examine 3 school-based strategies for promoting physical activity—Safe Routes to School programs, joint use agreements, and playground enhancement—from a tort liability perspective, and describe how schools can substantially minimize any associated liability risk through injury prevention and other strategies. We also recommend approaches to help schools overcome their liability concerns and adopt these critically needed healthy school policies. PMID:24028226

  11. Assessment of substance abuse liability in rodents: self-administration, drug discrimination, and locomotor sensitization.

    PubMed

    Paterson, Neil E

    2012-09-01

    Assessing abuse liability is a crucial step in the development of a novel chemical entity (NCE) with central nervous system (CNS) activity or with chemical or pharmacological properties in common with known abused substances. Rodent assessment of abuse liability is highly attractive due to its relatively low cost and high predictive validity. Described in this unit are three rodent assays commonly used to provide data on the potential for abuse liability based on the acute effects of NCEs: specifically, self-administration, drug discrimination, and locomotor sensitization. As these assays provide insight into the potential abuse liability of NCEs as well as in vivo pharmacological mechanism(s) of action, they should form a key part of the development process for novel therapeutics aimed at treating CNS disorders.

  12. A mixture of extracts from Peruvian plants (black maca and yacon) improves sperm count and reduced glycemia in mice with streptozotocin-induced diabetes.

    PubMed

    Gonzales, Gustavo F; Gonzales-Castañeda, Cynthia; Gasco, Manuel

    2013-09-01

    We investigated the effect of two extracts from Peruvian plants, given alone or in a mixture, on sperm count and glycemia in streptozotocin-diabetic mice. Normal or diabetic mice were divided into groups receiving vehicle, black maca (Lepidium meyenii), yacon (Smallanthus sonchifolius) or one of three black maca/yacon extract mixtures (90/10, 50/50 and 10/90%). Normal or diabetic mice were treated for 7 d with each extract, mixture or vehicle. Glycemia, daily sperm production (DSP), and epididymal and vas deferens sperm counts were assessed in mice, and polyphenol content and antioxidant activity were assessed in each extract. Black maca (BM), yacon and the mixture of extracts reduced glucose levels in diabetic mice. Non-diabetic mice treated with BM and yacon showed higher DSP than those treated with vehicle (p < 0.05). Diabetic mice treated with BM, yacon and the maca/yacon mixture showed increased DSP and sperm counts in the vas deferens and epididymis with respect to non-diabetic and diabetic mice treated with vehicle (p < 0.05). Yacon had 3.05 times higher polyphenol content than maca, and this was associated with higher antioxidant activity. The combination of the two extracts improved glycemic levels and male reproductive function in diabetic mice. Streptozotocin increased liver weight 1.43-fold, an effect that was reversed by the assessed plant extracts. In summary, streptozotocin-induced diabetes resulted in reduced sperm counts and liver damage. These effects could be reduced with BM, yacon and the BM+yacon mixture.

  13. Prevention of propofol injection pain in children: a comparison of pretreatment with tramadol and propofol-lidocaine mixture.

    PubMed

    Borazan, Hale; Sahin, Osman; Kececioglu, Ahmet; Uluer, M Selcuk; Et, Tayfun; Otelcioglu, Seref

    2012-01-01

    Pain on propofol injection is a common and difficult-to-eliminate problem in children. In this study, we aimed to compare the efficacy of pretreatment with tramadol 1 mg.kg(-1) and a propofol-lidocaine (lidocaine 20 mg) mixture for prevention of propofol-induced pain in children. One hundred and twenty ASA I-II patients undergoing orthopedic and otolaryngological surgery were included in this study and divided into three groups using a random number table. Group C (n=39) received normal saline placebo and Group T (n=40) received 1 mg.kg(-1) tramadol 60 sec before propofol (180 mg 1% propofol with 2 ml normal saline), whereas Group L (n=40) received normal saline placebo before the propofol-lidocaine mixture (180 mg 1% propofol with 2 ml 1% lidocaine). One patient in Group C was withdrawn from the study because of difficulty in inserting an iv cannula. Thus, one hundred and nineteen patients were analyzed. After administration of the calculated dose of propofol, a blinded observer assessed pain with a four-point behavioral scale. There were no significant differences in patient characteristics or intraoperative variables among the groups (p>0.05), except for intraoperative fentanyl consumption and analgesic requirement one hr after surgery (p<0.05). Both tramadol 1 mg.kg(-1) and the lidocaine 20 mg mixture significantly reduced propofol pain compared with the control group. Moderate and severe pain were more frequent in the control group (p<0.05). The incidence of overall pain was 79.4% in the control group, 35% in the tramadol group and 25% in the lidocaine group, respectively (p<0.001). Both pretreatment with tramadol 60 sec before propofol injection and the propofol-lidocaine mixture significantly reduced propofol injection pain compared to placebo in children.

  14. Ultrasonographic findings in hereditary neuropathy with liability to pressure palsies.

    PubMed

    Bayrak, Ayse O; Bayrak, Ilkay Koray; Battaloglu, Esra; Ozes, Burcak; Yildiz, Onur; Onar, Musa Kazim

    2015-02-01

    The aims of this study were to evaluate the sonographic findings of patients with hereditary neuropathy with liability to pressure palsies (HNPP) and to examine the correlation between sonographic and electrophysiological findings. Nine patients whose electrophysiological findings indicated HNPP and whose diagnosis was confirmed by genetic analysis were enrolled in the study. The median, ulnar, peroneal, and tibial nerves were evaluated by ultrasonography. We ultrasonographically evaluated 18 median, ulnar, peroneal, and tibial nerves. Nerve enlargement was identified in the median, ulnar, and peroneal nerves at the typical sites of compression. None of the patients had nerve enlargement at a site of noncompression. None of the tibial nerves had increased cross-sectional area (CSA) values. There were no significant differences in median, ulnar, and peroneal nerve distal motor latencies (DMLs) between the patients with an increased CSA and those with a normal CSA. In most cases, there was no correlation between electrophysiological abnormalities and clinical or sonographic findings. Although multiple nerve enlargements at typical entrapment sites on sonographic evaluation can suggest HNPP, ultrasonography cannot be used as a diagnostic tool for HNPP. Ultrasonography may contribute to the differential diagnosis of HNPP and other demyelinating polyneuropathies or compression neuropathies; however, further studies are required.

  15. Concentration addition and independent action model: Which is better in predicting the toxicity for metal mixtures on zebrafish larvae.

    PubMed

    Gao, Yongfei; Feng, Jianfeng; Kang, Lili; Xu, Xin; Zhu, Lin

    2018-01-01

    The joint toxicity of chemical mixtures has emerged as a popular topic, particularly regarding the additive and potentially synergistic actions of environmental mixtures. We investigated the 24h toxicity of Cu-Zn, Cu-Cd, and Cu-Pb and 96h toxicity of Cd-Pb binary mixtures on the survival of zebrafish larvae. Joint toxicity was predicted and compared using the concentration addition (CA) and independent action (IA) models, which make different assumptions about the toxic mode of action in toxicodynamic processes, based on single and binary metal mixture tests. Results showed that the CA and IA models presented varying predictive abilities for different metal combinations. For the Cu-Cd and Cd-Pb mixtures, the CA model simulated the observed survival rates better than the IA model. By contrast, the IA model simulated the observed survival rates better than the CA model for the Cu-Zn and Cu-Pb mixtures. These findings revealed that the toxic mode of action may depend on the combinations and concentrations of the tested metal mixtures. Statistical analysis of the antagonistic or synergistic interactions indicated that synergistic interactions were observed for the Cu-Cd and Cu-Pb mixtures, non-interactions were observed for the Cd-Pb mixtures, and slight antagonistic interactions were observed for the Cu-Zn mixtures. These results illustrated that the CA and IA models are consistent in specifying the interaction patterns of binary metal mixtures. Copyright © 2017 Elsevier B.V. All rights reserved.
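
    The two reference models compared in this abstract have standard closed-form definitions: under concentration addition the predicted mixture effect level x satisfies sum_i c_i / EC_x,i = 1, while under independent action E_mix = 1 - prod_i (1 - E_i(c_i)). As a minimal sketch (not the authors' code), the example below assumes log-logistic single-chemical dose-response curves; the EC50 and slope values for the binary metal mixture are hypothetical.

    ```python
    # Sketch of concentration addition (CA) and independent action (IA)
    # predictions for a binary mixture. Dose-response parameters below are
    # hypothetical illustrations, not values from the study.
    import numpy as np
    from scipy.optimize import brentq

    def effect(c, ec50, slope):
        """Log-logistic single-chemical effect (fraction affected, 0..1)."""
        return 1.0 / (1.0 + (ec50 / c) ** slope)

    def ec_x(x, ec50, slope):
        """Concentration of a single chemical producing effect level x."""
        return ec50 * (x / (1.0 - x)) ** (1.0 / slope)

    def predict_ia(concs, params):
        """Independent action: E = 1 - prod(1 - E_i)."""
        e = [effect(c, *p) for c, p in zip(concs, params)]
        return 1.0 - np.prod([1.0 - ei for ei in e])

    def predict_ca(concs, params):
        """Concentration addition: solve sum(c_i / EC_x,i) = 1 for x."""
        def residual(x):
            return sum(c / ec_x(x, *p) for c, p in zip(concs, params)) - 1.0
        return brentq(residual, 1e-9, 1.0 - 1e-9)

    # Hypothetical EC50 (mg/L) and Hill slope for two metals
    params = [(0.05, 2.0), (0.30, 1.5)]
    concs = [0.02, 0.10]  # mixture concentrations, mg/L

    print("CA-predicted effect:", predict_ca(concs, params))
    print("IA-predicted effect:", predict_ia(concs, params))
    ```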

  16. Concentration Addition, Independent Action and Generalized Concentration Addition Models for Mixture Effect Prediction of Sex Hormone Synthesis In Vitro

    PubMed Central

    Hadrup, Niels; Taxvig, Camilla; Pedersen, Mikael; Nellemann, Christine; Hass, Ulla; Vinggaard, Anne Marie

    2013-01-01

    Humans are concomitantly exposed to numerous chemicals. An infinite number of combinations and doses thereof can be imagined. For toxicological risk assessment the mathematical prediction of mixture effects, using knowledge on single chemicals, is therefore desirable. We investigated pros and cons of the concentration addition (CA), independent action (IA) and generalized concentration addition (GCA) models. First we measured effects of single chemicals and mixtures thereof on steroid synthesis in H295R cells. Then single chemical data were applied to the models; predictions of mixture effects were calculated and compared to the experimental mixture data. Mixture 1 contained environmental chemicals adjusted in ratio according to human exposure levels. Mixture 2 was a potency adjusted mixture containing five pesticides. Prediction of testosterone effects coincided with the experimental Mixture 1 data. In contrast, antagonism was observed for effects of Mixture 2 on this hormone. The mixtures contained chemicals exerting only limited maximal effects. This hampered prediction by the CA and IA models, whereas the GCA model could be used to predict a full dose response curve. Regarding effects on progesterone and estradiol, some chemicals were having stimulatory effects whereas others had inhibitory effects. The three models were not applicable in this situation and no predictions could be performed. Finally, the expected contributions of single chemicals to the mixture effects were calculated. Prochloraz was the predominant but not sole driver of the mixtures, suggesting that one chemical alone was not responsible for the mixture effects. In conclusion, the GCA model seemed to be superior to the CA and IA models for the prediction of testosterone effects. A situation with chemicals exerting opposing effects, for which the models could not be applied, was identified. In addition, the data indicate that in non-potency adjusted mixtures the effects cannot always be accounted for by single chemicals. PMID:23990906
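
    The generalized concentration addition (GCA) model referred to here extends CA to mixtures containing partial agonists with different maximal effects, which is why it could produce a full dose-response prediction where CA and IA could not. Below is a minimal sketch under the common GCA simplification of Hill functions with unit slope; the EC50s and maximal effects are hypothetical, not values from the study.

    ```python
    # Sketch of the generalized concentration addition (GCA) prediction for
    # chemicals with different maximal effects (Hill slope assumed = 1).
    # All parameter values are hypothetical illustrations.
    def gca_effect(concs, ec50s, e_max):
        """GCA: E = sum(Emax_i * c_i/K_i) / (1 + sum(c_i/K_i))."""
        ratios = [c / k for c, k in zip(concs, ec50s)]
        return sum(m * r for m, r in zip(e_max, ratios)) / (1.0 + sum(ratios))

    # A full agonist (max effect 1.0) mixed with a partial agonist (max 0.4)
    concs = [1.0, 5.0]   # micromolar, hypothetical
    ec50s = [2.0, 1.0]   # micromolar, hypothetical
    e_max = [1.0, 0.4]

    print("GCA-predicted response:", gca_effect(concs, ec50s, e_max))
    ```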

  17. Non-linear models for the detection of impaired cerebral blood flow autoregulation.

    PubMed

    Chacón, Max; Jara, José Luis; Miranda, Rodrigo; Katsogridakis, Emmanuel; Panerai, Ronney B

    2018-01-01

    The ability to discriminate between normal and impaired dynamic cerebral autoregulation (CA), based on measurements of spontaneous fluctuations in arterial blood pressure (BP) and cerebral blood flow (CBF), has considerable clinical relevance. We studied 45 normal subjects at rest and under hypercapnia induced by breathing a mixture of carbon dioxide and air. Non-linear models, with BP as input and CBF velocity (CBFV) as output, were implemented with support vector machines (SVM) using separate recordings for learning and validation. Dynamic SVM implementations used either moving average or autoregressive structures. The efficiency of dynamic CA was estimated from the model's derived CBFV response to a step change in BP as an autoregulation index, for both linear and non-linear models. Non-linear models with recurrences (autoregressive) showed the best results, with CA indexes of 5.9 ± 1.5 in normocapnia and 2.5 ± 1.2 in hypercapnia, and an area under the receiver operating characteristic curve of 0.955. The high performance achieved by non-linear SVM models in detecting deterioration of dynamic CA should encourage further assessment of their applicability to clinical conditions in which CA might be impaired.
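
    For illustration only, the sketch below builds a moving-average (lagged-input) SVM regression from BP to CBFV on surrogate signals and then extracts the fitted model's response to a step change in BP, the quantity from which an autoregulation index is derived. The lag order, kernel, and SVR settings are assumptions, not the authors' configuration.

    ```python
    # Minimal sketch (not the authors' implementation) of a moving-average SVM
    # model of dynamic cerebral autoregulation: lagged BP samples predict CBFV.
    import numpy as np
    from sklearn.svm import SVR

    def lagged_matrix(x, n_lags):
        """Stack x[t], x[t-1], ..., x[t-n_lags+1] as regressor rows."""
        rows = [x[i - n_lags + 1:i + 1][::-1] for i in range(n_lags - 1, len(x))]
        return np.array(rows)

    rng = np.random.default_rng(0)
    bp = rng.standard_normal(600)                     # surrogate BP fluctuations
    cbfv = np.convolve(bp, [0.5, 0.3, 0.1], "same")   # surrogate CBFV response
    cbfv += 0.05 * rng.standard_normal(len(cbfv))

    n_lags = 10
    X = lagged_matrix(bp, n_lags)
    y = cbfv[n_lags - 1:]

    model = SVR(kernel="rbf", C=1.0, epsilon=0.01).fit(X, y)

    # Step response of the fitted model: feed a unit step in BP through the lags.
    step = np.concatenate([np.zeros(n_lags), np.ones(30)])
    step_response = model.predict(lagged_matrix(step, n_lags))
    print("Predicted CBFV step response (first 5 samples):", step_response[:5])
    ```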

  18. Stability of faults with heterogeneous friction properties and effective normal stress

    NASA Astrophysics Data System (ADS)

    Luo, Yingdi; Ampuero, Jean-Paul

    2018-05-01

    Abundant geological, seismological and experimental evidence of the heterogeneous structure of natural faults motivates the theoretical and computational study of the mechanical behavior of heterogeneous frictional fault interfaces. Fault zones are composed of a mixture of materials with contrasting strength, which may affect the spatial variability of seismic coupling, the location of high-frequency radiation and the diversity of slip behavior observed in natural faults. To develop a quantitative understanding of the effect of strength heterogeneity on the mechanical behavior of faults, here we investigate a fault model with spatially variable frictional properties and pore pressure. Conceptually, this model may correspond to two rough surfaces in contact along discrete asperities, the space in between being filled by compressed gouge. The asperities have different permeability than the gouge matrix and may be hydraulically sealed, resulting in different pore pressure. We consider faults governed by rate-and-state friction, with mixtures of velocity-weakening and velocity-strengthening materials and contrasts of effective normal stress. We systematically study the diversity of slip behaviors generated by this model through multi-cycle simulations and linear stability analysis. The fault can be either stable without spontaneous slip transients, or unstable with spontaneous rupture. When the fault is unstable, slip can rupture either part or the entire fault. In some cases the fault alternates between these behaviors throughout multiple cycles. We determine how the fault behavior is controlled by the proportion of velocity-weakening and velocity-strengthening materials, their relative strength and other frictional properties. We also develop, through heuristic approximations, closed-form equations to predict the stability of slip on heterogeneous faults. Our study shows that a fault model with heterogeneous materials and pore pressure contrasts is a viable framework to reproduce the full spectrum of fault behaviors observed in natural faults: from fast earthquakes, to slow transients, to stable sliding. In particular, this model constitutes a building block for models of episodic tremor and slow slip events.
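
    Linear stability analyses of rate-and-state faults typically build on the classical spring-slider criterion, in which a velocity-weakening patch can nucleate spontaneous slip when the loading stiffness falls below k_c = sigma_eff * (b - a) / Dc. The sketch below evaluates this criterion for a few hypothetical patches with contrasting effective normal stress; it is a textbook reminder of the building block, not the authors' closed-form result for heterogeneous faults.

    ```python
    # Classical spring-slider stability criterion under rate-and-state friction.
    # A velocity-weakening patch (b > a) is unstable when the loading stiffness
    # k falls below k_c = sigma_eff * (b - a) / Dc. All numbers are hypothetical.
    def critical_stiffness(sigma_eff, a, b, dc):
        """Critical stiffness for a velocity-weakening patch, in MPa/m."""
        return sigma_eff * (b - a) / dc

    patches = [
        # (label, effective normal stress [MPa], a, b, Dc [m])
        ("velocity-weakening, high pore pressure",  20.0, 0.010, 0.015, 1e-3),
        ("velocity-weakening, low pore pressure",  100.0, 0.010, 0.015, 1e-3),
        ("velocity-strengthening gouge",            50.0, 0.015, 0.010, 1e-3),
    ]

    k_load = 300.0  # loading stiffness, MPa/m (hypothetical)
    for label, sigma, a, b, dc in patches:
        if b <= a:
            print(f"{label}: velocity-strengthening, conditionally stable")
            continue
        kc = critical_stiffness(sigma, a, b, dc)
        state = "UNSTABLE" if k_load < kc else "stable"
        print(f"{label}: k_c = {kc:.0f} MPa/m -> {state}")
    ```

    In this toy setting, the lower effective normal stress (higher pore pressure) patch remains stable while the otherwise identical high-stress patch is unstable, which is the kind of contrast the heterogeneous-fault model exploits.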

  19. [Changes in clinical standards and the need for adjusting legal standards of care from the point of view of civil law].

    PubMed

    Rosenberger, Rainer

    2007-01-01

    The legal standard of medical care is laid down in Sect. 276 of the German Civil Code (principle of due diligence). It applies to both contractual and tortious liability and likewise to the treatment of patients insured under the statutory health insurance scheme and self-payers. The legal standard of care conforms to the clinical standards because medical liability means medical professional liability. Liability law does not distinguish between different standards of care in the treatment of patients insured under the statutory health insurance scheme on the one hand and privately insured patients on the other. Changes in clinical standards immediately affect liability law without the need for formal adaptation of the legal standard of care. Liability law cannot claim more diligence than that owed from a medical point of view. Legislative changes that result in a lowering of medical standards (reduction in the quality of treatment) will have to be accepted by liability law, even if these are regulations pertaining to Social Law (SGB V, Book 5 of the German Social Code). In this respect, the principle of legal unity applies. In consideration of this kind of changes the due diligence requirements for the treatment of patients insured under the statutory health insurance scheme and privately insured patients remain basically equal. If these changes lead to an increase of risk for the patient, the resulting liabilities are not to be attributed to the therapist. What remains to be seen is whether there will be an increased attempt to minimise risk by "additionally purchasing health care services".

  20. Detecting Mixtures from Structural Model Differences Using Latent Variable Mixture Modeling: A Comparison of Relative Model Fit Statistics

    ERIC Educational Resources Information Center

    Henson, James M.; Reise, Steven P.; Kim, Kevin H.

    2007-01-01

    The accuracy of structural model parameter estimates in latent variable mixture modeling was explored with a 3 (sample size) [times] 3 (exogenous latent mean difference) [times] 3 (endogenous latent mean difference) [times] 3 (correlation between factors) [times] 3 (mixture proportions) factorial design. In addition, the efficacy of several…
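
    As a simplified analogue of using relative fit statistics to decide how many latent classes to retain (the study itself concerns structural-equation mixture models, not plain Gaussian mixtures), the sketch below fits Gaussian mixtures with 1 to 4 components to simulated two-class data and compares AIC and BIC; all data and settings are illustrative assumptions.

    ```python
    # Simplified analogue of latent class enumeration via relative fit statistics:
    # fit Gaussian mixtures with increasing numbers of components and compare
    # AIC/BIC. Simulated data, not the study's design.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(42)
    # Two latent classes with different means on two observed variables
    class_a = rng.multivariate_normal([0.0, 0.0], np.eye(2), size=200)
    class_b = rng.multivariate_normal([2.5, 1.5], np.eye(2), size=100)
    data = np.vstack([class_a, class_b])

    for k in range(1, 5):
        gm = GaussianMixture(n_components=k, random_state=0).fit(data)
        print(f"k={k}: AIC={gm.aic(data):.1f}  BIC={gm.bic(data):.1f}")
    # The number of components minimizing BIC (here expected to be 2) is retained.
    ```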

  1. 12 CFR 1008.309 - Absence of liability for good-faith administration.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Absence of liability for good-faith... for Administration of the NMLSR § 1008.309 Absence of liability for good-faith administration. The... action or proceeding for monetary damages by reason of the good-faith action or omission of any officer...

  2. 12 CFR 1008.309 - Absence of liability for good-faith administration.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Absence of liability for good-faith... for Administration of the NMLSR § 1008.309 Absence of liability for good-faith administration. The... action or proceeding for monetary damages by reason of the good-faith action or omission of any officer...

  3. 12 CFR 1008.309 - Absence of liability for good-faith administration.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 8 2014-01-01 2014-01-01 false Absence of liability for good-faith... for Administration of the NMLSR § 1008.309 Absence of liability for good-faith administration. The... action or proceeding for monetary damages by reason of the good-faith action or omission of any officer...

  4. 27 CFR 26.193 - Notification of tax liability.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Notification of tax liability. 26.193 Section 26.193 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE... Rico § 26.193 Notification of tax liability. (a) If the chemist of the Treasury of Puerto Rico finds...

  5. 43 CFR 3733.2 - Liability of United States.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Liability of United States. 3733.2 Section... WITHDRAWALS: GENERAL Risk of Operation § 3733.2 Liability of United States. The Act in section 3 provides in part as follows: Provided, That the United States, its permittees and licensees shall not be...

  6. 43 CFR 3733.2 - Liability of United States.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Liability of United States. 3733.2 Section... WITHDRAWALS: GENERAL Risk of Operation § 3733.2 Liability of United States. The Act in section 3 provides in part as follows: Provided, That the United States, its permittees and licensees shall not be...

  7. 43 CFR 3733.2 - Liability of United States.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false Liability of United States. 3733.2 Section... WITHDRAWALS: GENERAL Risk of Operation § 3733.2 Liability of United States. The Act in section 3 provides in part as follows: Provided, That the United States, its permittees and licensees shall not be...

  8. 43 CFR 3733.2 - Liability of United States.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Liability of United States. 3733.2 Section... WITHDRAWALS: GENERAL Risk of Operation § 3733.2 Liability of United States. The Act in section 3 provides in part as follows: Provided, That the United States, its permittees and licensees shall not be...

  9. 48 CFR 1852.228-82 - Insurance-Total Immunity From Tort Liability.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Insurance-Total Immunity... Provisions and Clauses 1852.228-82 Insurance—Total Immunity From Tort Liability. As prescribed in 1828.311-270(d), insert the following clause: Insurance—Total Immunity From Tort Liability (SEP 2000) (a) The...

  10. 48 CFR 1852.228-82 - Insurance-Total Immunity From Tort Liability.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Insurance-Total Immunity... Provisions and Clauses 1852.228-82 Insurance—Total Immunity From Tort Liability. As prescribed in 1828.311-270(d), insert the following clause: Insurance—Total Immunity From Tort Liability (SEP 2000) (a) The...

  11. 48 CFR 1852.228-82 - Insurance-Total Immunity From Tort Liability.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Insurance-Total Immunity... Provisions and Clauses 1852.228-82 Insurance—Total Immunity From Tort Liability. As prescribed in 1828.311-270(d), insert the following clause: Insurance—Total Immunity From Tort Liability (SEP 2000) (a) The...

  12. 48 CFR 1852.228-82 - Insurance-Total Immunity From Tort Liability.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Insurance-Total Immunity... Provisions and Clauses 1852.228-82 Insurance—Total Immunity From Tort Liability. As prescribed in 1828.311-270(d), insert the following clause: Insurance—Total Immunity From Tort Liability (SEP 2000) (a) The...

  13. 48 CFR 1852.228-82 - Insurance-Total Immunity From Tort Liability.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Insurance-Total Immunity... Provisions and Clauses 1852.228-82 Insurance—Total Immunity From Tort Liability. As prescribed in 1828.311-270(d), insert the following clause: Insurance—Total Immunity From Tort Liability (SEP 2000) (a) The...

  14. 16 CFR 802.10 - Stock dividends and splits; reorganizations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of corporation C. C is converted to a limited liability company in which A holds 60% and B holds 40% of the membership interests. No new assets are contributed. The conversion to a limited liability... holds 45% in the new limited liability company, the conversion is not exempt for B and may require...

  15. 7 CFR 4290.160 - Special rules for Partnership RBICs and LLC RBICs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... general partner of a Partnership RBIC which is a corporation, limited liability company or partnership (an “Entity General Partner”), or a managing member of an LLC RBIC which is a corporation, limited liability... corporation, operating agreement if a limited liability company, or partnership agreement if a partnership. (3...

  16. 31 CFR 50.92 - Determination of pro rata share.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... INSURANCE PROGRAM Cap on Annual Liability § 50.92 Determination of pro rata share. (a) Pro rata loss... providing property and casualty insurance under the Program if there were no cap on annual liability under... estimates that aggregate insured losses may exceed the cap on annual liability for a Program Year, then...

  17. 22 CFR 211.9 - Liability for loss damage or improper distribution of commodities.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... specify how such losses occurred; (E) Obtain copies of port and/or ship records including scale weights... 22 Foreign Relations 1 2012-04-01 2012-04-01 false Liability for loss damage or improper... § 211.9 Liability for loss damage or improper distribution of commodities. (Where the instructions in...

  18. 22 CFR 211.9 - Liability for loss damage or improper distribution of commodities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... specify how such losses occurred; (E) Obtain copies of port and/or ship records including scale weights... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Liability for loss damage or improper... § 211.9 Liability for loss damage or improper distribution of commodities. (Where the instructions in...

  19. 29 CFR 4211.36 - Modifications to the determination of initial liabilities, the amortization of initial...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., the amortization of initial liabilities, and the allocation fraction. 4211.36 Section 4211.36 Labor... initial liabilities, and the allocation fraction. (a) General rule. A plan using any of the allocation... participation under their prior plans. An amendment under this paragraph must include an allocation fraction...

  20. 26 CFR 1.404(g)-1 - Deduction of employer liability payments.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...(g)-1 Section 1.404(g)-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY.... § 1.404(g)-1 Deduction of employer liability payments. (a) General rule. Employer liability payments... deductible under section 404(g) and this section only if the payment satisfies the conditions of section 162...
