Sample records for analysis assumptions add

  1. Weighting-Based Sensitivity Analysis in Causal Mediation Studies

    ERIC Educational Resources Information Center

    Hong, Guanglei; Qin, Xu; Yang, Fan

    2018-01-01

    Through a sensitivity analysis, the analyst attempts to determine whether a conclusion of causal inference could be easily reversed by a plausible violation of an identification assumption. Analytic conclusions that are harder to alter by such a violation are expected to add a higher value to scientific knowledge about causality. This article…

  2. Marking and Moderation in the UK: False Assumptions and Wasted Resources

    ERIC Educational Resources Information Center

    Bloxham, Sue

    2009-01-01

    This article challenges a number of assumptions underlying marking of student work in British universities. It argues that, in developing rigorous moderation procedures, we have created a huge burden for markers which adds little to accuracy and reliability but creates additional work for staff, constrains assessment choices and slows down…

  3. Lesbian health and the assumption of heterosexuality: an organizational perspective.

    PubMed

    Daley, Andrea

    2003-01-01

    This study used a qualitative research design to explore hospital policies and practices and the assumption of female heterosexuality. The assumption of heterosexuality is a product of discursive practices that normalize heterosexuality and individualize lesbian sexual identities. Literature indicates that the assumption of female heterosexuality is implicated in both the invisibility and marked visibility of lesbians as service users. This research adds to existing literature by shifting the focus of study from individual to organizational practices and, in so doing, seeks to uncover hidden truths, explore the functional power of language, and allow for the discovery of what we know and--equally as important--how we know.

  4. Challenges to understanding spatial patterns of disease: philosophical alternatives to logical positivism.

    PubMed

    Mayer, J D

    1992-08-01

    Most studies of disease distribution, in medical geography and other related disciplines, have been empirical in nature and rooted in the assumptions of logical positivism. However, some of the more newly articulated philosophies of the social sciences, and of social theory, have much to add in the understanding of the processes and mechanisms underlying disease distribution. This paper represents a plea for creative synthesis between logical positivism and realism or structuration, and uses specific examples to suggest how disease distribution, as a surface phenomenon, can be explained using deeper analysis.

  5. Network Structure and Biased Variance Estimation in Respondent Driven Sampling

    PubMed Central

    Verdery, Ashton M.; Mouw, Ted; Bauldry, Shawn; Mucha, Peter J.

    2015-01-01

    This paper explores bias in the estimation of sampling variance in Respondent Driven Sampling (RDS). Prior methodological work on RDS has focused on its problematic assumptions and the biases and inefficiencies of its estimators of the population mean. Nonetheless, researchers have given only slight attention to the topic of estimating sampling variance in RDS, despite the importance of variance estimation for the construction of confidence intervals and hypothesis tests. In this paper, we show that the estimators of RDS sampling variance rely on a critical assumption that the network is First Order Markov (FOM) with respect to the dependent variable of interest. We demonstrate, through intuitive examples, mathematical generalizations, and computational experiments, that current RDS variance estimators will always underestimate the population sampling variance of RDS in empirical networks that do not conform to the FOM assumption. Analysis of 215 observed university and school networks from Facebook and Add Health indicates that the FOM assumption is violated in every empirical network we analyze, and that these violations lead to substantially biased RDS estimators of sampling variance. We propose and test two alternative variance estimators that show some promise for reducing biases, but which also illustrate the limits of estimating sampling variance with only partial information on the underlying population social network. PMID:26679927
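
A toy illustration of the FOM point (not the paper's estimators): on a network whose community structure goes beyond the trait of interest, a variance plug-in that treats the referral chain as first-order Markov on the trait can understate the replicate-to-replicate variance of the sample mean. All network parameters, walk settings, and the Markov-chain-CLT plug-in below are invented for illustration.

```python
# Toy illustration (not the paper's estimators): a variance plug-in that treats
# the referral chain as first-order Markov (FOM) on the trait can understate
# the true replicate-to-replicate variance when the network has community
# structure beyond the trait. All parameters below are invented.
import numpy as np

rng = np.random.default_rng(0)

# 4 blocks of 50 nodes; blocks 0-1 carry trait 1, blocks 2-3 carry trait 0.
# Cross-trait ties exist only between blocks 1 and 2, so the trait sequence
# seen by a random walk is not FOM given the trait alone.
block = np.repeat(np.arange(4), 50)
trait = np.repeat([1, 1, 0, 0], 50)
P_block = np.array([[.20, .05, .00, .00],
                    [.05, .20, .02, .00],
                    [.00, .02, .20, .05],
                    [.00, .00, .05, .20]])
A = rng.random((200, 200)) < P_block[block][:, block]
A = np.triu(A, 1)
A = A + A.T                      # symmetric adjacency, no self-loops

def walk_traits(steps=300):
    """Trait sequence along a simple random walk (a caricature of RDS)."""
    v = rng.integers(200)
    out = []
    for _ in range(steps):
        v = rng.choice(np.flatnonzero(A[v]))
        out.append(trait[v])
    return np.asarray(out)

# Empirical sampling variance of the sample mean across replicate walks
emp_var = np.var([walk_traits().mean() for _ in range(200)])

# FOM-style plug-in from a single walk: Markov-chain CLT variance
# p(1-p)(1+r)/(1-r)/n, with r the lag-1 autocorrelation of the trait chain.
s = walk_traits()
p, r, n = s.mean(), np.corrcoef(s[:-1], s[1:])[0, 1], len(s)
fom_var = p * (1 - p) * (1 + r) / (1 - r) / n

print(f"replicate variance {emp_var:.5f} vs FOM plug-in {fom_var:.5f}")
```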

  6. 10 CFR 611.101 - Application.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., including vehicle simulations using industry standard model (need to add name and location of this open.... All such information and data must include assumptions made in their preparation and the range of... any product (vehicle or component) to be produced by or through the project, including relevant data...

  7. Visiting Filmmakers: Why Bother?

    ERIC Educational Resources Information Center

    MacDonald, Scott

    1995-01-01

    Argues that visits by independent filmmakers to campus are exciting and intellectually invigorating for students and teachers, and these visits add to the cultural energy of the college. Notes that a commitment to independent cinema challenges the assumptions and the economics of conventional cinema. Discusses how much independent filmmakers are…

  8. Economics in "Global Health 2035": a sensitivity analysis of the value of a life year estimates.

    PubMed

    Chang, Angela Y; Robinson, Lisa A; Hammitt, James K; Resch, Stephen C

    2017-06-01

    In "Global health 2035: a world converging within a generation," The Lancet Commission on Investing in Health (CIH) adds the value of increased life expectancy to the value of growth in gross domestic product (GDP) when assessing national well-being. To value changes in life expectancy, the CIH relies on several strong assumptions to bridge gaps in the empirical research. It finds that the value of a life year (VLY) averages 2.3 times GDP per capita for low- and middle-income countries (LMICs) assuming the changes in life expectancy they experienced from 2000 to 2011 are permanent. The CIH VLY estimate is based on a specific shift in population life expectancy and includes a 50 percent reduction for children ages 0 through 4. We investigate the sensitivity of this estimate to the underlying assumptions, including the effects of income, age, and life expectancy, and the sequencing of the calculations. We find that reasonable alternative assumptions regarding the effects of income, age, and life expectancy may reduce the VLY estimates to 0.2 to 2.1 times GDP per capita for LMICs. Removing the reduction for young children increases the VLY, while reversing the sequencing of the calculations reduces the VLY. Because the VLY is sensitive to the underlying assumptions, analysts interested in applying this approach elsewhere must tailor the estimates to the impacts of the intervention and the characteristics of the affected population. Analysts should test the sensitivity of their conclusions to reasonable alternative assumptions. More work is needed to investigate options for improving the approach.

  9. Data Transformations for Inference with Linear Regression: Clarifications and Recommendations

    ERIC Educational Resources Information Center

    Pek, Jolynn; Wong, Octavia; Wong, C. M.

    2017-01-01

    Data transformations have been promoted as a popular and easy-to-implement remedy to address the assumption of normally distributed errors (in the population) in linear regression. However, the application of data transformations introduces non-ignorable complexities which should be fully appreciated before their implementation. This paper adds to…

  10. Dissecting effects of complex mixtures: who's afraid of informative priors?

    PubMed

    Thomas, Duncan C; Witte, John S; Greenland, Sander

    2007-03-01

    Epidemiologic studies commonly investigate multiple correlated exposures, which are difficult to analyze appropriately. Hierarchical modeling provides a promising approach for analyzing such data by adding a higher-level structure or prior model for the exposure effects. This prior model can incorporate additional information on similarities among the correlated exposures and can be parametric, semiparametric, or nonparametric. We discuss the implications of applying these models and argue for their expanded use in epidemiology. While a prior model adds assumptions to the conventional (first-stage) model, all statistical methods (including conventional methods) make strong intrinsic assumptions about the processes that generated the data. One should thus balance prior modeling assumptions against assumptions of validity, and use sensitivity analyses to understand their implications. In doing so - and by directly incorporating into our analyses information from other studies or allied fields - we can improve our ability to distinguish true causes of disease from noise and bias.
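
A minimal sketch of the two-stage ("semi-Bayes") shrinkage idea the authors describe, assuming the simplest normal-normal prior model; all estimates, standard errors, and prior values below are invented.

```python
# Two-stage shrinkage: pull conventional first-stage estimates toward a prior
# mean, weighting by prior vs sampling variance. Numbers are illustrative.
import numpy as np

beta_hat = np.array([0.9, 0.1, 0.5, -0.2])  # first-stage log-odds estimates
se = np.array([0.4, 0.3, 0.5, 0.35])         # their standard errors
prior_mean = 0.0                              # exposures treated as exchangeable
prior_var = 0.25                              # prior: effects rarely exceed ~1
                                              # on the log scale (assumption)

# Posterior means under a normal-normal model: inverse-variance weighting
w = prior_var / (prior_var + se**2)
beta_shrunk = prior_mean + w * (beta_hat - prior_mean)
print(beta_shrunk)  # extreme, imprecise estimates are pulled in the most
```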

  11. Taking evolution seriously in political science.

    PubMed

    Lewis, Orion; Steinmo, Sven

    2010-09-01

    In this essay, we explore the epistemological and ontological assumptions that have been made to make political science "scientific." We show how political science has generally adopted an ontologically reductionist philosophy of science derived from Newtonian physics and mechanics. This mechanical framework has encountered problems and constraints on its explanatory power, because an emphasis on equilibrium analysis is ill-suited for the study of political change. We outline the primary differences between an evolutionary ontology of social science and the physics-based philosophy commonly employed. Finally, we show how evolutionary thinking adds insight into the study of political phenomena and research questions that are of central importance to the field, such as preference formation.

  12. The Effects of Affect on Study Abroad Students

    ERIC Educational Resources Information Center

    Savicki, Victor

    2013-01-01

    Being a study abroad student is not all sweetness and light. By definition, study abroad students are faced with acculturative stress (Berry, 2005) by virtue of encountering differences in assumptions, values, and expectations of daily living in their host culture. Add to that the usual challenge of hearing and speaking a different language, and…

  13. Interpretive Research Aiming at Theory Building: Adopting and Adapting the Case Study Design

    ERIC Educational Resources Information Center

    Diaz Andrade, Antonio

    2009-01-01

    Although the advantages of case study design are widely recognised, its original positivist underlying assumptions may mislead interpretive researchers aiming at theory building. The paper discusses the limitations of the case study design for theory building and explains how grounded theory systemic process adds to the case study design. The…

  14. A stochastic model of firm growth

    NASA Astrophysics Data System (ADS)

    Bottazzi, Giulio; Secchi, Angelo

    2003-06-01

    Recently from analyses on different databases the tent-shape of the distribution of firm growth rates has emerged as a robust and universal characteristic of the time evolution of corporates. We add new evidence on this topic and we present a new stochastic model that, under rather general assumptions, provides a robust explanation for the observed regularity.
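
One standard route to a tent-shaped (Laplace) growth-rate distribution, sketched below, is a normal whose variance is itself exponentially distributed; this scale-mixture construction is not the authors' model, but it reproduces the fat-tailed limit that their opportunity-competition mechanism generates. Parameters are arbitrary.

```python
# A Laplace ("tent-shaped" on log scale) distribution from a Gaussian whose
# variance is exponentially distributed; compare excess kurtosis vs a normal.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
g_normal = rng.normal(0.0, 1.0, n)                      # Gaussian benchmark
g_tent = rng.normal(0.0, np.sqrt(rng.exponential(1.0, n)))  # Laplace via mixture

def excess_kurtosis(x):
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

print(excess_kurtosis(g_normal))  # ~0 for a normal
print(excess_kurtosis(g_tent))    # ~3, the Laplace value: fat "tent" tails
```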

  15. Economics in “Global Health 2035”: a sensitivity analysis of the value of a life year estimates

    PubMed Central

    Chang, Angela Y; Robinson, Lisa A; Hammitt, James K; Resch, Stephen C

    2017-01-01

    Background In “Global health 2035: a world converging within a generation,” The Lancet Commission on Investing in Health (CIH) adds the value of increased life expectancy to the value of growth in gross domestic product (GDP) when assessing national well-being. To value changes in life expectancy, the CIH relies on several strong assumptions to bridge gaps in the empirical research. It finds that the value of a life year (VLY) averages 2.3 times GDP per capita for low- and middle-income countries (LMICs) assuming the changes in life expectancy they experienced from 2000 to 2011 are permanent. Methods The CIH VLY estimate is based on a specific shift in population life expectancy and includes a 50 percent reduction for children ages 0 through 4. We investigate the sensitivity of this estimate to the underlying assumptions, including the effects of income, age, and life expectancy, and the sequencing of the calculations. Findings We find that reasonable alternative assumptions regarding the effects of income, age, and life expectancy may reduce the VLY estimates to 0.2 to 2.1 times GDP per capita for LMICs. Removing the reduction for young children increases the VLY, while reversing the sequencing of the calculations reduces the VLY. Conclusion Because the VLY is sensitive to the underlying assumptions, analysts interested in applying this approach elsewhere must tailor the estimates to the impacts of the intervention and the characteristics of the affected population. Analysts should test the sensitivity of their conclusions to reasonable alternative assumptions. More work is needed to investigate options for improving the approach. PMID:28400950

  16. Episodic Memory Does Not Add Up: Verbatim-Gist Superposition Predicts Violations of the Additive Law of Probability

    PubMed Central

    Brainerd, C. J.; Wang, Zheng; Reyna, Valerie F.; Nakamura, K.

    2015-01-01

    Fuzzy-trace theory’s assumptions about memory representation are cognitive examples of the familiar superposition property of physical quantum systems. When those assumptions are implemented in a formal quantum model (QEMc), they predict that episodic memory will violate the additive law of probability: If memory is tested for a partition of an item’s possible episodic states, the individual probabilities of remembering the item as belonging to each state must sum to more than 1. We detected this phenomenon using two standard designs, item false memory and source false memory. The quantum implementation of fuzzy-trace theory also predicts that violations of the additive law will vary in strength as a function of reliance on gist memory. That prediction, too, was confirmed via a series of manipulations (e.g., semantic relatedness, testing delay) that are thought to increase gist reliance. Surprisingly, an analysis of the underlying structure of violations of the additive law revealed that as a general rule, increases in remembering correct episodic states do not produce commensurate reductions in remembering incorrect states. PMID:26236091

  17. Critique of analyses of natural gas pricing alternatives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lemon, R.

    The Administration has predicted that deregulation would add $210 billion to gas producers' profits over the next eight years; by contrast, a study done for the Natural Gas Supply Committee by Edward Erickson concludes that deregulation would mean a $126 billion savings to consumers over the same period. This article examines the analyses done in the past year by nine organizations. By examining the assumptions and projections of each analysis on wellhead prices, gas supplies, retail gas prices, and alternative energy costs and mixes, an attempt is made to explain divergent projections of the costs of energy under the three alternative natural-gas-pricing scenarios: continuance under FPC's Opinion 770-A; National Energy Plan (NEP); and deregulation of new gas.

  18. Indirect Comparisons: A Review of Reporting and Methodological Quality

    PubMed Central

    Donegan, Sarah; Williamson, Paula; Gamble, Carrol; Tudur-Smith, Catrin

    2010-01-01

    Background The indirect comparison of two interventions can be valuable in many situations. However, the quality of an indirect comparison will depend on several factors including the chosen methodology and validity of underlying assumptions. Published indirect comparisons are increasingly more common in the medical literature, but as yet, there are no published recommendations of how they should be reported. Our aim is to systematically review the quality of published indirect comparisons to add to existing empirical data suggesting that improvements can be made when reporting and applying indirect comparisons. Methodology/Findings Reviews applying statistical methods to indirectly compare the clinical effectiveness of two interventions using randomised controlled trials were eligible. We searched (1966–2008) Database of Abstracts and Reviews of Effects, The Cochrane library, and Medline. Full review publications were assessed for eligibility. Specific criteria to assess quality were developed and applied. Forty-three reviews were included. Adequate methodology was used to calculate the indirect comparison in 41 reviews. Nineteen reviews assessed the similarity assumption using sensitivity analysis, subgroup analysis, or meta-regression. Eleven reviews compared trial-level characteristics. Twenty-four reviews assessed statistical homogeneity. Twelve reviews investigated causes of heterogeneity. Seventeen reviews included direct and indirect evidence for the same comparison; six reviews assessed consistency. One review combined both evidence types. Twenty-five reviews urged caution in interpretation of results, and 24 reviews indicated when results were from indirect evidence by stating this term with the result. Conclusions This review shows that the underlying assumptions are not routinely explored or reported when undertaking indirect comparisons. We recommend, therefore, that the quality of indirect comparisons should be improved, in particular, by assessing assumptions and reporting the assessment methods applied. We propose that the quality criteria applied in this article may provide a basis to help review authors carry out indirect comparisons and to aid appropriate interpretation. PMID:21085712
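
The core calculation behind most of the reviewed comparisons is the Bucher adjusted indirect comparison; a minimal sketch with invented log odds ratios is below. Note that it is valid only under the similarity assumption the review finds to be under-reported.

```python
# Bucher-style adjusted indirect comparison: A vs C via common comparator B.
# Effect sizes are invented log odds ratios for illustration.
import math

def bucher(d_AB, se_AB, d_CB, se_CB):
    """Indirect estimate of A vs C from A-vs-B and C-vs-B trial results."""
    d_AC = d_AB - d_CB
    se_AC = math.sqrt(se_AB**2 + se_CB**2)   # variances add: precision is lost
    ci = (d_AC - 1.96 * se_AC, d_AC + 1.96 * se_AC)
    return d_AC, se_AC, ci

# Valid only if the A-B and C-B trial sets are similar in effect modifiers
print(bucher(d_AB=-0.40, se_AB=0.15, d_CB=-0.10, se_CB=0.20))
```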

  19. Flux Jacobian Matrices For Equilibrium Real Gases

    NASA Technical Reports Server (NTRS)

    Vinokur, Marcel

    1990-01-01

    Improved formulation includes generalized Roe average and extension to three dimensions. Flux Jacobian matrices derived for use in numerical solutions of conservation-law differential equations of inviscid flows of ideal gases extended to real gases. Real-gas formulation of these matrices retains simplifying assumptions of thermodynamic and chemical equilibrium, but adds effects of vibrational excitation, dissociation, and ionization of gas molecules via general equation of state.
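
To give the flavor of the formulation in one dimension: with a general equation of state p = p(rho, e), the flux Jacobian depends on two pressure derivatives, chi = dp/d(rho) and kappa = dp/d(rho e), and reduces to the familiar ideal-gas matrix when chi = 0 and kappa = gamma - 1. The sketch below is a 1-D simplification under those assumptions; Vinokur's formulation is three-dimensional and includes the generalized Roe average.

```python
import numpy as np

def pressure(rho, u, E, chi, kappa):
    # EOS linear in density and internal energy density, consistent with
    # constant chi = dp/d(rho) and kappa = dp/d(rho*e) (illustrative only)
    return chi * rho + kappa * (E - 0.5 * rho * u**2)

def flux_jacobian_1d(rho, u, E, chi, kappa):
    """dF/dU for U = (rho, rho*u, E), F = (rho*u, rho*u**2 + p, (E + p)*u)."""
    H = (E + pressure(rho, u, E, chi, kappa)) / rho   # total specific enthalpy
    return np.array([
        [0.0,                                1.0,                0.0],
        [chi + 0.5 * kappa * u**2 - u**2,    (2.0 - kappa) * u,  kappa],
        [u * (chi + 0.5 * kappa * u**2 - H), H - kappa * u**2,   (1.0 + kappa) * u],
    ])

# Ideal-gas check: chi = 0, kappa = gamma - 1 = 0.4 recovers textbook entries
# such as (gamma - 3) / 2 * u**2 in the momentum row.
print(flux_jacobian_1d(rho=1.0, u=2.0, E=5.0, chi=0.0, kappa=0.4))
```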

  20. Approach to the unfolding and folding dynamics of add A-riboswitch upon adenine dissociation using a coarse-grained elastic network model

    NASA Astrophysics Data System (ADS)

    Li, Chunhua; Lv, Dashuai; Zhang, Lei; Yang, Feng; Wang, Cunxin; Su, Jiguo; Zhang, Yang

    2016-07-01

    Riboswitches are noncoding mRNA segments that can regulate the gene expression via altering their structures in response to specific metabolite binding. We proposed a coarse-grained Gaussian network model (GNM) to examine the unfolding and folding dynamics of adenosine deaminase (add) A-riboswitch upon the adenine dissociation, in which the RNA is modeled by a nucleotide chain with interaction networks formed by connecting adjoining atomic contacts. It was shown that the adenine binding is critical to the folding of the add A-riboswitch while the removal of the ligand can result in drastic increase of the thermodynamic fluctuations especially in the junction regions between helix domains. Under the assumption that the native contacts with the highest thermodynamic fluctuations break first, the iterative GNM simulations showed that the unfolding process of the adenine-free add A-riboswitch starts with the denature of the terminal helix stem, followed by the loops and junctions involving ligand binding pocket, and then the central helix domains. Despite the simplified coarse-grained modeling, the unfolding dynamics and pathways are shown in close agreement with the results from atomic-level MD simulations and the NMR and single-molecule force spectroscopy experiments. Overall, the study demonstrates a new avenue to investigate the binding and folding dynamics of add A-riboswitch molecule which can be readily extended for other RNA molecules.
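
A bare-bones Gaussian network model of the kind described: build a Kirchhoff (contact) matrix from a distance cutoff, get mean-square fluctuations from the pseudoinverse, and apply one step of the "highest-fluctuation contact breaks first" iteration. The coordinates below are random stand-ins, not the riboswitch structure, and the contact-scoring rule is a plausible reading of the abstract rather than the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
xyz = rng.random((60, 3)) * 30.0   # placeholder nucleotide coordinates (Angstrom)
CUTOFF = 10.0                      # typical coarse-grained contact cutoff

def kirchhoff(xyz, cutoff):
    d = np.linalg.norm(xyz[:, None] - xyz[None, :], axis=-1)
    G = -(d < cutoff).astype(float)
    np.fill_diagonal(G, 0.0)
    np.fill_diagonal(G, -G.sum(axis=1))   # zero row sums: diagonal = degree
    return G

G = kirchhoff(xyz, CUTOFF)
Ginv = np.linalg.pinv(G)           # pseudoinverse drops the zero (rigid-body) mode
msf = np.diag(Ginv)                # per-residue mean-square fluctuation
                                   # (up to the kT/gamma prefactor)

# One unfolding iteration: break the native contact whose endpoints fluctuate
# the most, then the loop would recompute pinv(G) and repeat.
score = np.where(G < 0, msf[:, None] + msf[None, :], -np.inf)
i, j = np.unravel_index(np.argmax(score), G.shape)
G[i, j] = G[j, i] = 0.0
G[i, i] -= 1.0                     # restore zero row sums after contact removal
G[j, j] -= 1.0
print(f"broke contact ({i}, {j}); recompute pinv(G) and iterate")
```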

  1. Linear-No-Threshold Default Assumptions for Noncancer and Nongenotoxic Cancer Risks: A Mathematical and Biological Critique.

    PubMed

    Bogen, Kenneth T

    2016-03-01

    To improve U.S. Environmental Protection Agency (EPA) dose-response (DR) assessments for noncarcinogens and for nonlinear mode of action (MOA) carcinogens, the 2009 NRC Science and Decisions Panel recommended that the adjustment-factor approach traditionally applied to these endpoints should be replaced by a new default assumption that both endpoints have linear-no-threshold (LNT) population-wide DR relationships. The panel claimed this new approach is warranted because population DR is LNT when any new dose adds to a background dose that explains background levels of risk, and/or when there is substantial interindividual heterogeneity in susceptibility in the exposed human population. Mathematically, however, the first claim is either false or effectively meaningless and the second claim is false. Any dose- and population-response relationship that is statistically consistent with an LNT relationship may instead be an additive mixture of just two quasi-threshold DR relationships, which jointly exhibit low-dose S-shaped, quasi-threshold nonlinearity just below the lower end of the observed "linear" dose range. In this case, LNT extrapolation would necessarily overestimate increased risk by increasingly large relative magnitudes at diminishing values of above-background dose. The fact that chemically-induced apoptotic cell death occurs by unambiguously nonlinear, quasi-threshold DR mechanisms is apparent from recent data concerning this quintessential toxicity endpoint. The 2009 NRC Science and Decisions Panel claims and recommendations that default LNT assumptions be applied to DR assessment for noncarcinogens and nonlinear MOA carcinogens are therefore not justified either mathematically or biologically. © 2015 The Author. Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
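
A numerical illustration of the paper's central mathematical point: a 50/50 mixture of two S-shaped, quasi-threshold dose-response curves can track a straight line over the observed dose range while falling well below it at low dose, so LNT extrapolation overstates low-dose risk. All curve parameters below are invented.

```python
import numpy as np

def logistic(d, d50, slope):
    return 1.0 / (1.0 + np.exp(-slope * (d - d50)))

dose = np.linspace(0.0, 1.0, 200)
mix = 0.5 * logistic(dose, 0.3, 12.0) + 0.5 * logistic(dose, 0.7, 12.0)
mix -= mix[0]                      # express as added risk over background

# Fit a straight line through the "observed" (higher-dose) range only
obs = dose > 0.4
a, b = np.polyfit(dose[obs], mix[obs], 1)
lnt = a * dose + b                 # LNT-style linear extrapolation to low dose

low = (dose > 0.01) & (dose < 0.15)
print("LNT-extrapolated vs mixture risk at low dose:")
print(np.round(lnt[low][::5], 3))  # extrapolation stays well above ...
print(np.round(mix[low][::5], 3))  # ... the quasi-threshold mixture
```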

  2. Rayleigh imaging in spectral mammography

    NASA Astrophysics Data System (ADS)

    Berggren, Karl; Danielsson, Mats; Fredenberg, Erik

    2016-03-01

    Spectral imaging is the acquisition of multiple images of an object at different energy spectra. In mammography, dual-energy imaging (spectral imaging with two energy levels) has been investigated for several applications, in particular material decomposition, which allows for quantitative analysis of breast composition and quantitative contrast-enhanced imaging. Material decomposition with dual-energy imaging is based on the assumption that there are two dominant photon interaction effects that determine linear attenuation: the photoelectric effect and Compton scattering. This assumption limits the number of basis materials, i.e. the number of materials that are possible to differentiate between, to two. However, Rayleigh scattering may account for more than 10% of the linear attenuation in the mammography energy range. In this work, we show that a modified version of a scanning multi-slit spectral photon-counting mammography system is able to acquire three images at different spectra and can be used for triple-energy imaging. We further show that triple-energy imaging in combination with the efficient scatter rejection of the system enables measurement of Rayleigh scattering, which adds an additional energy dependency to the linear attenuation and enables material decomposition with three basis materials. Three available basis materials have the potential to improve virtually all applications of spectral imaging.
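
The decomposition step reduces to linear algebra: with three spectral measurements and three basis functions (photoelectric, Compton, Rayleigh), the basis coefficients follow from a 3x3 solve. The matrix entries below are invented stand-ins for spectrum-weighted basis attenuations, not measured values.

```python
import numpy as np

# rows: low/mid/high energy bins; cols: photoelectric, Compton, Rayleigh
A = np.array([[5.0, 1.8, 0.9],
              [2.2, 1.6, 0.5],
              [0.9, 1.4, 0.2]])

t_true = np.array([0.6, 1.1, 0.25])   # true basis "thicknesses" (synthetic)
y = A @ t_true                        # measured -ln(I/I0) in the three bins

t_est = np.linalg.solve(A, y)         # exact: 3 equations, 3 unknowns
print(t_est)  # recovers t_true; with only two bins (dual energy), a third
              # basis such as Rayleigh would make the system underdetermined
```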

  3. Generalized Linear Covariance Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Markley, F. Landis

    2014-01-01

    This talk presents a comprehensive approach to filter modeling for generalized covariance analysis of both batch least-squares and sequential estimators. We review and extend in two directions the results of prior work that allowed for partitioning of the state space into "solve-for" and "consider" parameters, accounted for differences between the formal values and the true values of the measurement noise, process noise, and a priori solve-for and consider covariances, and explicitly partitioned the errors into subspaces containing only the influence of the measurement noise, process noise, and solve-for and consider covariances. In this work, we explicitly add sensitivity analysis to this prior work, and relax an implicit assumption that the batch estimator's epoch time occurs prior to the definitive span. We also apply the method to an integrated orbit and attitude problem, in which gyro and accelerometer errors, though not estimated, influence the orbit determination performance. We illustrate our results using two graphical presentations, which we call the "variance sandpile" and the "sensitivity mosaic," and we compare the linear covariance results to confidence intervals associated with ensemble statistics from a Monte Carlo analysis.
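
A schematic "consider analysis" in the spirit of the abstract: the formal covariance of the solve-for states is augmented by the sensitivity to unestimated (consider) parameters. The matrices below are small invented examples, not the paper's filter model.

```python
import numpy as np

P_solve = np.diag([0.04, 0.01])     # formal solve-for covariance
S = np.array([[0.5, -0.2],          # sensitivity of solve-for estimates
              [0.1,  0.3]])         # to the two consider parameters
P_consider = np.diag([0.09, 0.25])  # a priori consider-parameter covariance

# Estimation-error covariance including the consider-parameter contribution
P_true = P_solve + S @ P_consider @ S.T
print(np.sqrt(np.diag(P_solve)))    # formal sigmas (optimistic)
print(np.sqrt(np.diag(P_true)))     # consider-augmented sigmas
```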

  4. Approach to the unfolding and folding dynamics of add A-riboswitch upon adenine dissociation using a coarse-grained elastic network model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chunhua; Department of Computational Medicine and Bioinformatics, University of Michigan, Ann Arbor, Michigan 45108; Lv, Dashuai

    Riboswitches are noncoding mRNA segments that can regulate the gene expression via altering their structures in response to specific metabolite binding. We proposed a coarse-grained Gaussian network model (GNM) to examine the unfolding and folding dynamics of adenosine deaminase (add) A-riboswitch upon the adenine dissociation, in which the RNA is modeled by a nucleotide chain with interaction networks formed by connecting adjoining atomic contacts. It was shown that the adenine binding is critical to the folding of the add A-riboswitch while the removal of the ligand can result in drastic increase of the thermodynamic fluctuations especially in the junction regions between helix domains. Under the assumption that the native contacts with the highest thermodynamic fluctuations break first, the iterative GNM simulations showed that the unfolding process of the adenine-free add A-riboswitch starts with the denature of the terminal helix stem, followed by the loops and junctions involving ligand binding pocket, and then the central helix domains. Despite the simplified coarse-grained modeling, the unfolding dynamics and pathways are shown in close agreement with the results from atomic-level MD simulations and the NMR and single-molecule force spectroscopy experiments. Overall, the study demonstrates a new avenue to investigate the binding and folding dynamics of add A-riboswitch molecule which can be readily extended for other RNA molecules.

  5. [Digital learning object for diagnostic reasoning in nursing applied to the integumentary system].

    PubMed

    da Costa, Cecília Passos Vaz; Luz, Maria Helena Barros Araújo

    2015-12-01

    To describe the creation of a digital learning object for diagnostic reasoning in nursing applied to the integumentary system at a public university of Piaui. A methodological study applied to technological production based on the pedagogical framework of problem-based learning. The methodology for creating the learning object observed the stages of analysis, design, development, implementation and evaluation recommended for contextualized instructional design. The revised taxonomy of Bloom was used to list the educational goals. The four modules of the developed learning object were inserted into the educational platform Moodle. The theoretical assumptions allowed the design of an important online resource that promotes effective learning in the scope of nursing education. This study should add value to nursing teaching practices through the use of digital learning objects for teaching diagnostic reasoning applied to skin and skin appendages.

  6. The influence of package size and flute type of corrugated boxes on load bridging in unit loads

    Treesearch

    Jonghun Park; Laszlo Horvath; Marshall S. White; Samantha Phanthanousy; Philip Araman; Robert J. Bush

    2017-01-01

    Shipping pallets often are designed with the assumption that the payload carried is flexible and uniformly distributed on the pallet surface. However, packages on the pallet can act as a series of discrete loads, and the physical interactions among the packages can add stiffness to the pallet/load combination. The term ‘load bridging’ has been used to describe this...

  7. How would you decide? Helping geoscience students consider ethical dimensions in a geoscience context

    NASA Astrophysics Data System (ADS)

    Bank, C. G.; Ryan, A. M.

    2017-12-01

    This presentation shows an example of infusing ethics into geoscience teaching, and a preliminary analysis of student answers to an exam question to establish whether this example can be used in an effective way. We presented a case study on floods in two distribution geoscience courses, and provided students with criteria to come to an ethical decision. One course was taught in winter 2016 and the other in summer 2016 with a total of 358 students. Pre- and post-questionnaires allow only limited conclusions because just 33 students answered both. In the exam we asked students if they would evacuate a small aboriginal settlement to prevent flooding in a large city. We coded their answers according to the criteria (stakeholders, contributions by geoscientists, alternative options, and assumptions) they were provided in class. While students did well listing stakeholders and recalling contributions by geoscientists they struggled to provide alternative options. Still, many of them verbalized assumptions inherent in their thoughts and nearly half of students recognized that this is a complex problem. We posit that a case study is a valid way to encourage students to link ethics to a geoscience issue, and propose that our framework may empower geoscience educators who do not necessarily feel comfortable teaching ethics to add this element to their teaching toolkit.

  8. Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers

    PubMed Central

    García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta

    2016-01-01

    The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine. PMID:28773653

  9. Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers.

    PubMed

    García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta

    2016-06-29

    The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine.
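
A minimal sketch of the approach described in the two records above, using scikit-learn: classify cases as stable or unstable from (rock mass rating, span) with an SVM, then read a "critical span" off the decision boundary. The data points are fabricated for illustration, not the Canadian case-history database.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)
n = 120
rmr = rng.uniform(30, 90, n)       # rock mass rating
span = rng.uniform(2, 30, n)       # excavation span (m)
# fabricated rule + noise: larger spans need better rock to remain stable
stable = (span < 0.4 * rmr - 6 + rng.normal(0, 2, n)).astype(int)

X = np.column_stack([rmr, span])
clf = SVC(kernel="rbf", C=10.0, gamma="scale", probability=True).fit(X, stable)

# Critical span for a given RMR: where P(stable) crosses 0.5
spans = np.linspace(2, 30, 200)
for r in (40, 60, 80):
    p = clf.predict_proba(np.column_stack([np.full_like(spans, r), spans]))[:, 1]
    print(f"RMR {r}: critical span ~ {spans[np.argmin(np.abs(p - 0.5))]:.1f} m")
```

New field observations are just extra rows appended to X before refitting, which is the local-adaptation workflow the paper highlights.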

  10. Mismeasurement and the resonance of strong confounders: correlated errors.

    PubMed

    Marshall, J R; Hastrup, J L; Ross, J S

    1999-07-01

    Confounding in epidemiology, and the limits of standard methods of control for an imperfectly measured confounder, have been understood for some time. However, most treatments of this problem are based on the assumption that errors of measurement in confounding and confounded variables are independent. This paper considers the situation in which a strong risk factor (confounder) and an inconsequential but suspected risk factor (confounded) are each measured with errors that are correlated; the situation appears especially likely to occur in the field of nutritional epidemiology. Error correlation appears to add little to measurement error as a source of bias in estimating the impact of a strong risk factor: it can add to, diminish, or reverse the bias induced by measurement error in estimating the impact of the inconsequential risk factor. Correlation of measurement errors can add to the difficulty involved in evaluating structures in which confounding and measurement error are present. In its presence, observed correlations among risk factors can be greater than, less than, or even opposite to the true correlations. Interpretation of multivariate epidemiologic structures in which confounding is likely requires evaluation of measurement error structures, including correlations among measurement errors.
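
A simulation of the paper's scenario: a strong confounder C and a truly null exposure X, each measured with error. With independent errors, residual confounding biases X's apparent effect one way; correlated errors can enlarge, shrink, or reverse it. All parameters below are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
C = rng.normal(size=n)
X = 0.6 * C + rng.normal(scale=0.8, size=n)  # X correlated with C, no true effect
Y = 1.0 * C + rng.normal(size=n)              # outcome driven by C alone

def apparent_x_effect(err_corr):
    # measurement errors on C and X with correlation err_corr
    cov = [[0.5, err_corr * 0.5], [err_corr * 0.5, 0.5]]
    eC, eX = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
    Z = np.column_stack([np.ones(n), C + eC, X + eX])
    beta = np.linalg.lstsq(Z, Y, rcond=None)[0]
    return beta[2]                            # coefficient on mismeasured X

for r in (-0.8, 0.0, 0.8):
    print(f"error corr {r:+.1f}: apparent X effect {apparent_x_effect(r):+.3f}")
```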

  11. Evaluation of Quality Parameters in Water Resource Planning. (A State-of-the-Art Survey of the Economics of Water Quality)

    DTIC Science & Technology

    1974-12-01

    and bacteria is insufficiently known. Coliform counts are relied upon to warn of possible harm, on the assumption that pathogenic species are present...BOD, not nitrogen. Urbanization, harvesting of trees, and paving hasten the movement of nitrates from land to water. Infants are particularly...clearing, highway and building construction, adds silt and sand, gravel and rocks, branches and trees, to the natural load of sediment in rivers and lakes

  12. Safety and tolerability of dapagliflozin, saxagliptin and metformin in combination: Post-hoc analysis of concomitant add-on versus sequential add-on to metformin and of triple versus dual therapy with metformin.

    PubMed

    Del Prato, Stefano; Rosenstock, Julio; Garcia-Sanchez, Ricardo; Iqbal, Nayyar; Hansen, Lars; Johnsson, Eva; Chen, Hungta; Mathieu, Chantal

    2018-06-01

    The safety of triple oral therapy with dapagliflozin plus saxagliptin plus metformin versus dual therapy with dapagliflozin or saxagliptin plus metformin was compared in a post-hoc analysis of 3 randomized trials of sequential or concomitant add-on of dapagliflozin and saxagliptin to metformin. In the concomitant add-on trial, patients with type 2 diabetes on stable metformin received dapagliflozin 10 mg/d plus saxagliptin 5 mg/d. In sequential add-on trials, patients on metformin plus either saxagliptin 5 mg/d or dapagliflozin 10 mg/d received dapagliflozin 10 mg/d or saxagliptin 5 mg/d, respectively, as add-on therapy. After 24 weeks, incidences of adverse events and serious adverse events were similar between triple and dual therapy and between concomitant and sequential add-on regimens. Urinary tract infections were more common with sequential than with concomitant add-on therapy; genital infections were reported only with sequential add-on of dapagliflozin to saxagliptin plus metformin. Hypoglycaemia incidence was <2.0% across all analysis groups. In conclusion, the safety and tolerability of triple therapy with dapagliflozin, saxagliptin and metformin, as either concomitant or sequential add-on, were similar to dual therapy with either agent added to metformin. © 2018 The Authors. Diabetes, Obesity and Metabolism published by John Wiley & Sons Ltd.

  13. Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon

    2013-01-01

    This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of no penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, it is unsuitable for use in shield design and preliminary analysis studies. The software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions, and accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that requires user inputs and provides solutions directly in Microsoft Excel workbooks.
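
For a sense of what such penetration limit equations look like, below is the commonly published Christiansen form of the Whipple-shield ballistic limit for the hypervelocity regime (normal impact, velocities above roughly 7 km/s). Treat the constants as illustrative values from the open literature; the actual tool covers many more configurations, regimes, and obliquities, and its coefficients may differ.

```python
def whipple_critical_diameter(t_wall_cm, standoff_cm, sigma_ksi=70.0,
                              rho_proj=2.7, rho_bumper=2.7, v_kms=10.0):
    """Critical (just-penetrating) projectile diameter in cm, hypervelocity
    regime; constants follow the widely cited Christiansen Whipple equation
    (illustrative only, not necessarily this tool's implementation)."""
    return (3.918 * t_wall_cm ** (2.0 / 3.0) * standoff_cm ** (1.0 / 3.0)
            * (sigma_ksi / 70.0) ** (1.0 / 3.0)
            / (rho_proj ** (1.0 / 3.0) * rho_bumper ** (1.0 / 9.0)
               * v_kms ** (2.0 / 3.0)))

# 3.2 mm rear wall, 10 cm standoff, aluminum projectile and bumper at 10 km/s
print(f"d_crit ~ {whipple_critical_diameter(0.32, 10.0):.2f} cm")
```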

  14. Significant treatment effect of add-on ketamine anesthesia in electroconvulsive therapy in depressive patients: A meta-analysis.

    PubMed

    Li, Dian-Jeng; Wang, Fu-Chiang; Chu, Che-Sheng; Chen, Tien-Yu; Tang, Chia-Hung; Yang, Wei-Cheng; Chow, Philip Chik-Keung; Wu, Ching-Kuan; Tseng, Ping-Tao; Lin, Pao-Yen

    2017-01-01

    Add-on ketamine anesthesia in electroconvulsive therapy (ECT) has been studied in depressive patients in several clinical trials with inconclusive findings. Two most recent meta-analyses reported insignificant findings with regards to the treatment effect of add-on ketamine anesthesia in ECT in depressive patients. The aim of this study is to update the current evidence and investigate the role of add-on ketamine anesthesia in ECT in depressive patients via a systematic review and meta-analysis. We performed a thorough literature search of the PubMed and ScienceDirect databases, and extracted all relevant clinical variables to compare the antidepressive outcomes between add-on ketamine anesthesia and other anesthetics in ECT. In total, 16 articles, covering 346 patients receiving add-on ketamine anesthesia in ECT and 329 controls, were included. We found that the antidepressive treatment effect of add-on ketamine anesthesia in ECT in depressive patients was significantly higher than that of other anesthetics (p<0.001). This significance persisted in both short-term (1-2 weeks) and moderate-term (3-4 weeks) treatment courses (all p<0.05). However, the side effect profiles and recovery time profiles were significantly worse in the add-on ketamine anesthesia group than in the control group. Our meta-analysis highlights the significantly higher antidepressive treatment effect of add-on ketamine in depressive patients receiving ECT compared to other anesthetics. However, clinicians need to take undesirable side effects into consideration when using add-on ketamine anesthesia in ECT in depressive patients. Copyright © 2016 Elsevier B.V. and ECNP. All rights reserved.
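
Generic random-effects pooling of the DerSimonian-Laird type used in meta-analyses like this one; the effect sizes below are fabricated standardized mean differences, not the study's extracted data.

```python
import numpy as np

d = np.array([0.80, 0.35, 0.62, 0.15, 0.95])  # per-trial SMDs (invented)
v = np.array([0.10, 0.08, 0.12, 0.09, 0.15])  # their variances (invented)

w = 1.0 / v                                   # fixed-effect weights
d_fe = np.sum(w * d) / np.sum(w)
Q = np.sum(w * (d - d_fe) ** 2)               # Cochran heterogeneity statistic
tau2 = max(0.0, (Q - (len(d) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (v + tau2)                       # random-effects weights
d_re = np.sum(w_re * d) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled SMD {d_re:.2f} "
      f"(95% CI {d_re - 1.96 * se:.2f} to {d_re + 1.96 * se:.2f})")
```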

  15. --No Title--

    Science.gov Websites

    using mesonet visibility observations and CLARUS QC'd obs; Add ceiling height and sky cover analysis to precipitation coverage gaps near CONUS coastlines; Add significant wave height analysis to OCONUS domains

  16. Predicting the Cost per Flying Hour for the F-16 Using Programmatic and Operational Variables

    DTIC Science & Technology

    2005-06-01

    The constant variance assumption is tested using the Breusch-Pagan test; the results are listed in Table 12. Figures 19 and 20 add to the discussion by plotting residuals against predicted values for both models. [Table 12: Breusch-Pagan constant variance test, Models A and B; figure axis-tick residue omitted]
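
The Breusch-Pagan idea referenced above fits in a few lines: regress squared OLS residuals on the predictors; n times the R-squared of that auxiliary regression is asymptotically chi-square under homoscedasticity (this is the Koenker studentized variant). Data below are synthetic, not the F-16 cost data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ [1.0, 2.0] + rng.normal(scale=1 + 0.8 * np.abs(X[:, 1]), size=n)

beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid2 = (y - X @ beta) ** 2                   # squared OLS residuals

gamma = np.linalg.lstsq(X, resid2, rcond=None)[0]   # auxiliary regression
fitted = X @ gamma
r2 = 1 - np.sum((resid2 - fitted) ** 2) / np.sum((resid2 - resid2.mean()) ** 2)
lm = n * r2                                    # Breusch-Pagan (Koenker) statistic
p = stats.chi2.sf(lm, df=X.shape[1] - 1)
print(f"BP LM = {lm:.1f}, p = {p:.2g} (small p: reject constant variance)")
```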

  17. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of the crew critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  18. Cost-effectiveness of omalizumab add-on to standard-of-care therapy in patients with uncontrolled severe allergic asthma in a Brazilian healthcare setting.

    PubMed

    Suzuki, Cibele; Lopes da Silva, Nilceia; Kumar, Praveen; Pathak, Purnima; Ong, Siew Hwa

    2017-08-01

    Omalizumab add-on to standard-of-care therapy has proven to be efficacious in severe asthma patients for whom exacerbations cannot be controlled otherwise. Moreover, evidence from different healthcare settings suggests reduced healthcare resource utilization with omalizumab. Based on these findings, this study aimed to assess the cost-effectiveness of the addition of omalizumab to standard-of-care therapy in patients with uncontrolled severe allergic asthma in a Brazilian healthcare setting. A previously published Markov model was adapted using Brazil-specific unit costs to compare the costs and outcomes of the addition of omalizumab to standard-of-care therapy vs standard-of-care therapy alone. Model inputs were largely based on the eXpeRience study. Costs and health outcomes were calculated for lifetime-years and were annually discounted at 5%. Both one-way and probabilistic sensitivity analyses were performed. An additional cost of R$280,400 for 5.20 additional quality-adjusted life-years was estimated with the addition of omalizumab to standard-of-care therapy, resulting in an incremental cost-effectiveness ratio of R$53,890. One-way sensitivity analysis indicated that discount rates, standard-of-care therapy exacerbation rates, and exacerbation-related mortality rates had the largest impact on incremental cost-effectiveness ratios. Assumptions of lifetime treatment adherence and rate of future exacerbations, independent of previous events, might affect the findings. The lack of Brazilian patients in the eXpeRience study may affect the findings, although sample size and baseline characteristics suggest that the modeled population closely resembles Brazilian severe allergic asthma patients. Results indicate that omalizumab as an add-on therapy is more cost-effective than standard-of-care therapy alone for Brazilian patients with uncontrolled severe allergic asthma, based on the World Health Organization's cost-effectiveness threshold of up to 3-times the gross domestic product.
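
A stylized Markov cohort model of the kind of comparison described: yearly cycles, an exacerbation rate that therapy reduces, 5% discounting as in the study, and an ICER in cost per QALY. Every input value below is invented; the published model's states, rates, utilities, and costs differ.

```python
def markov_ce(exac_rate, exac_cost, drug_cost, years=40, disc=0.05,
              u_base=0.75, u_loss_per_exac=0.05, mort_per_exac=0.02):
    """Discounted lifetime cost and QALYs for a simple asthma cohort."""
    alive, cost, qaly = 1.0, 0.0, 0.0
    for t in range(years):
        df = (1 + disc) ** -t
        cost += df * alive * (drug_cost + exac_rate * exac_cost)
        qaly += df * alive * (u_base - exac_rate * u_loss_per_exac)
        alive *= 1 - exac_rate * mort_per_exac   # exacerbation-related mortality
    return cost, qaly

c_soc, q_soc = markov_ce(exac_rate=1.5, exac_cost=2000.0, drug_cost=0.0)
c_oma, q_oma = markov_ce(exac_rate=0.9, exac_cost=2000.0, drug_cost=15000.0)
icer = (c_oma - c_soc) / (q_oma - q_soc)
print(f"ICER ~ {icer:,.0f} per QALY gained")
```

Varying `disc`, `exac_rate`, and `mort_per_exac` one at a time mirrors the one-way sensitivity analysis the abstract reports as most influential.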

  19. Annotating spatio-temporal datasets for meaningful analysis in the Web

    NASA Astrophysics Data System (ADS)

    Stasch, Christoph; Pebesma, Edzer; Scheider, Simon

    2014-05-01

    More and more environmental datasets that vary in space and time are available on the Web. This brings the advantage that data can be reused for purposes other than those originally foreseen, but also the danger that users may apply inappropriate analysis procedures because they lack important assumptions made during the data collection process. In order to guide towards a meaningful (statistical) analysis of spatio-temporal datasets available on the Web, we have developed a Higher-Order-Logic formalism that captures some relevant assumptions in our previous work [1]. It allows proofs about meaningful spatial prediction and aggregation in a semi-automated fashion. In this poster presentation, we will present a concept for annotating spatio-temporal datasets available on the Web with concepts defined in our formalism. Therefore, we have defined a subset of the formalism as a Web Ontology Language (OWL) pattern. It allows capturing the distinction between the different spatio-temporal variable types, i.e. point patterns, fields, lattices and trajectories, which in turn determine whether a particular dataset can be interpolated or aggregated in a meaningful way using a certain procedure. The actual annotations that link spatio-temporal datasets with the concepts in the ontology pattern are provided as Linked Data. In order to allow data producers to add the annotations to their datasets, we have implemented a Web portal that uses a triple store at the backend to store the annotations and to make them available in the Linked Data cloud. Furthermore, we have implemented functions in the statistical environment R to retrieve the RDF annotations and, based on these annotations, to support a stronger typing of spatio-temporal datatypes, guiding towards a meaningful analysis in R. [1] Stasch, C., Scheider, S., Pebesma, E., Kuhn, W. (2014): "Meaningful spatial prediction and aggregation", Environmental Modelling & Software, 51, 149-165.
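
A tiny sketch of the annotation idea using rdflib: publish a Linked Data triple stating that a dataset is a "field" (continuous in space and time), which an analysis client can check before deciding whether interpolation is meaningful. The namespace and class names are placeholders, not the authors' actual ontology pattern.

```python
from rdflib import Graph, Namespace, URIRef, RDF

MEAN = Namespace("http://example.org/meaningful#")   # hypothetical ontology
g = Graph()
dataset = URIRef("http://example.org/data/pm10-sensors")
g.add((dataset, RDF.type, MEAN.Field))               # fields may be interpolated

# A client-side check before running an interpolation procedure:
if (dataset, RDF.type, MEAN.Field) in g:
    print("dataset is a field: spatial interpolation is meaningful")
print(g.serialize(format="turtle"))
```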

  20. Seven-period asteroseismic fit of the Kepler DBV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bischoff-Kim, Agnès; Østensen, Roy H.; Hermes, J. J.

    2014-10-10

    We present a new, better-constrained asteroseismic analysis of the helium-atmosphere (DB) white dwarf discovered in the field of view of the original Kepler mission. Observations obtained over the course of 2 yr yield at least seven independent modes, two more than were found in the discovery paper for the object. With several triplets and doublets, we are able to fix the ℓ and m identification of several modes before performing the fitting, greatly reducing the number of assumptions we must make about mode identification. We find a very thin helium layer for this relatively hot DB, which adds evidence to the hypothesis that helium diffuses outward during DB cooling. At least a few of the modes appear to be stable on evolutionary timescales and could allow us to obtain a measurement of the rate of cooling with monitoring of the star over the course of the next few years with ground-based follow-up.

  1. Transmission Heterogeneity and Autoinoculation in a Multisite Infection Model of HPV

    PubMed Central

    Brouwer, Andrew F.; Meza, Rafael; Eisenberg, Marisa C.

    2015-01-01

    The human papillomavirus (HPV) is sexually transmitted and can infect oral, genital, and anal sites in the human epithelium. Here, we develop a multisite transmission model that includes autoinoculation, to study HPV and other multisite diseases. Under a homogeneous-contacts assumption, we analyze the basic reproduction number R0, as well as type and target reproduction numbers, for a two-site model. In particular, we find that R0 occupies a space between taking the maximum of next generation matrix terms for same site transmission and taking the geometric average of cross-site transmission terms in such a way that heterogeneity in the same-site transmission rates increases R0 while heterogeneity in the cross-site transmission decreases it. Additionally, autoinoculation adds considerable complexity to the form of R0. We extend this analysis to a heterosexual population, which additionally yields dynamics analogous to those of vector–host models. We also examine how these issues of heterogeneity may affect disease control, using type and target reproduction numbers. PMID:26518265
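
A two-site next-generation-matrix toy that matches the abstract's observation: R0 rises with heterogeneity in same-site transmission and falls with heterogeneity in cross-site transmission (geometric-mean-like behavior). The matrix entries are arbitrary illustrative rates, not fitted HPV parameters.

```python
import numpy as np

def R0(same1, same2, cross12, cross21):
    K = np.array([[same1, cross12],
                  [cross21, same2]])        # next generation matrix
    return max(abs(np.linalg.eigvals(K)))   # spectral radius

print(R0(1.0, 1.0, 0.5, 0.5))  # baseline: 1.5
print(R0(1.5, 0.5, 0.5, 0.5))  # same-site heterogeneity (mean kept): R0 rises
print(R0(1.0, 1.0, 0.9, 0.1))  # cross-site heterogeneity (mean kept): R0 falls,
                               # toward 1 + sqrt(cross12 * cross21)
```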

  2. Sensitivity analysis of add-on price estimate for select silicon wafering technologies

    NASA Technical Reports Server (NTRS)

    Mokashi, A. R.

    1982-01-01

    The cost of producing wafers from silicon ingots is a major component of the add-on price of silicon sheet. Economic analyses of the add-on price estimates, and of their sensitivity, for internal-diameter (ID) sawing, multiblade slurry (MBS) sawing, and the fixed-abrasive slicing technique (FAST) are presented. Interim price estimation guidelines (IPEG) are used for estimating a process add-on price. Sensitivity analysis of price is performed with respect to cost parameters such as equipment, space, direct labor, materials (blade life) and utilities, and the production parameters such as slicing rate, slices per centimeter and process yield, using a computer program specifically developed to do sensitivity analysis with IPEG. The results aid in identifying the important cost parameters and assist in deciding the direction of technology development efforts.
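
The structure of an IPEG-style calculation is simple: the add-on price is a fixed-coefficient combination of annualized cost elements divided by throughput, and the sensitivity analysis perturbs one input at a time. The coefficients and inputs below are placeholders, not the actual IPEG constants or the paper's data.

```python
def addon_price(equip, space, labor, mats, util, wafers_per_year,
                c=(0.5, 100.0, 2.0, 1.2, 1.2)):   # placeholder coefficients
    """Add-on price per wafer from annualized cost elements (schematic)."""
    total = (c[0] * equip + c[1] * space + c[2] * labor
             + c[3] * mats + c[4] * util)
    return total / wafers_per_year

base = dict(equip=150_000, space=40, labor=60_000, mats=30_000,
            util=8_000, wafers_per_year=250_000)
p0 = addon_price(**base)
print(f"base add-on price: ${p0:.3f}/wafer")

# +10% one-at-a-time perturbations: the shape of the sensitivity analysis
for k in ("equip", "labor", "mats", "wafers_per_year"):
    bumped = dict(base, **{k: base[k] * 1.1})
    print(f"{k:>16}: {100 * (addon_price(**bumped) / p0 - 1):+.1f}%")
```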

  3. Scientific Data Analysis Toolkit: A Versatile Add-in to Microsoft Excel for Windows

    ERIC Educational Resources Information Center

    Halpern, Arthur M.; Frye, Stephen L.; Marzzacco, Charles J.

    2018-01-01

    Scientific Data Analysis Toolkit (SDAT) is a rigorous, versatile, and user-friendly data analysis add-in application for Microsoft Excel for Windows (PC). SDAT uses the familiar Excel environment to carry out most of the analytical tasks used in data analysis. It has been designed for student use in manipulating and analyzing data encountered in…

  4. "It Was the Best of Times, It Was the Worst of Times": A Qualitative Investigation of Perfectionism and Drinking Narratives in Undergraduate Students.

    PubMed

    Nealis, Logan J; Mackinnon, Sean P

    2017-01-01

    Perfectionism is a transdiagnostic risk factor for mental health and interpersonal difficulties, but research on perfectionism and alcohol use in emerging adults remains equivocal. Qualitative research methods are underutilized in this area, and inductive analysis of drinking narratives in undergraduate perfectionists may help clarify conflicting results and support novel approaches to quantitative inquiry in this area. We interviewed 20 undergraduates high in perfectionism (6 adaptive perfectionists and 14 maladaptive perfectionists) using a narrative interview, with analyses focusing on a situation involving alcohol use. We coded interviews for emergent themes using thematic analysis. Five themes emerged as follows: (1) drinking as a social experience, (2) suffering consequences, (3) learning from alcohol, (4) alcohol use as escapism, and (5) reluctance and moderation. Our results add to existing literature by highlighting the interpersonal conflict in perfectionistic people's experience in relation to alcohol use during emerging adulthood. Results also suggest perfectionistic people may use alcohol and intoxication as a way to facilitate a "release" from unpleasant situations or emotions. Perfectionists reported both positive and negative experiences, which lends support for using a narrative perspective to help overcome preexisting assumptions about adaptive and maladaptive qualities of perfectionism.

  5. A critique of the usefulness of inferential statistics in applied behavior analysis

    PubMed Central

    Hopkins, B. L.; Cole, Brian L.; Mason, Tina L.

    1998-01-01

    Researchers continue to recommend that applied behavior analysts use inferential statistics in making decisions about effects of independent variables on dependent variables. In many other approaches to behavioral science, inferential statistics are the primary means for deciding the importance of effects. Several possible uses of inferential statistics are considered. Rather than being an objective means for making decisions about effects, as is often claimed, inferential statistics are shown to be subjective. It is argued that the use of inferential statistics adds nothing to the complex and admittedly subjective nonstatistical methods that are often employed in applied behavior analysis. Attacks on inferential statistics that are being made, perhaps with increasing frequency, by those who are not behavior analysts, are discussed. These attackers are calling for banning the use of inferential statistics in research publications and commonly recommend that behavioral scientists should switch to using statistics aimed at interval estimation or the method of confidence intervals. Interval estimation is shown to be contrary to the fundamental assumption of behavior analysis that only individuals behave. It is recommended that authors who wish to publish the results of inferential statistics be asked to justify them as a means for helping us to identify any ways in which they may be useful. PMID:22478304

  6. DNA content alterations in Tetrahymena pyriformis macronucleus after exposure to food preservatives sodium nitrate and sodium benzoate.

    PubMed

    Loutsidou, Ariadni C; Hatzi, Vasiliki I; Chasapis, C T; Terzoudi, Georgia I; Spiliopoulou, Chara A; Stefanidou, Maria E

    2012-12-01

    The toxicity, in terms of changes in the DNA content, of two food preservatives, sodium nitrate and sodium benzoate was studied on the protozoan Tetrahymena pyriformis using DNA image analysis technology. For this purpose, selected doses of both food additives were administered for 2 h to protozoa cultures and DNA image analysis of T. pyriformis nuclei was performed. The analysis was based on the measurement of the Mean Optical Density which represents the cellular DNA content. The results have shown that after exposure of the protozoan cultures to doses equivalent to the ADI (acceptable daily intake), a statistically significant increase in the macronuclear DNA content compared to the unexposed control samples was observed. The observed increase in the macronuclear DNA content is indicative of the stimulation of the mitotic process, and the observed increase in MOD, accompanied by a stimulation of the protozoan proliferation activity, is consistent with this assumption. Since alterations at the DNA level such as DNA content and uncontrolled mitogenic stimulation have been linked with chemical carcinogenesis, the results of the present study add information on the toxicogenomic profile of the selected chemicals and may potentially lead to reconsideration of the excessive use of nitrates aiming to protect public health.

  7. 'You want to show you're a valuable employee': A critical discourse analysis of multi-perspective portrayals of employed women with fibromyalgia.

    PubMed

    Oldfield, Margaret; MacEachen, Ellen; MacNeill, Margaret; Kirsh, Bonnie

    2018-06-01

    Background: Advice on fibromyalgia, a chronic illness primarily affecting women, often presents it as incompatible with work and rarely covers how to remain employed. Yet many women do. Objectives: We aimed to understand how these women, their family members, and workmates portrayed employees with fibromyalgia, and how these portrayals helped women retain employment. Methods: We interviewed 22 participants, comprising five triads and three dyads of people who knew each other. Using the methodology of critical discourse analysis, we analysed the interview data within and across the triads/dyads through coding, narrative summaries, and relational mapping. Results: Participants reported stereotypes that employees with fibromyalgia are lazy, malingering, and less productive than healthy workers. Countering these assumptions, participants portrayed the women as normal, valuable employees who did not 'give in' to their illness. The portrayals drew on two discourses, normalcy and mind-controlling-the-body, and a related narrative, overcoming disability. We propose that participants' portrayals helped women manage their identities in competitive workplaces and thereby remain employed. Discussion: Our findings augment the very sparse literature on employment with fibromyalgia. Using a new approach, critical discourse analysis, we expand on known job-retention strategies and add the perspectives of two key stakeholders: family members and workmates.

  8. Appalachian Basin Play Fairway Analysis: Thermal Quality Analysis in Low-Temperature Geothermal Play Fairway Analysis (GPFA-AB)

    DOE Data Explorer

    Teresa E. Jordan

    2015-11-15

    This collection of files is part of a larger dataset uploaded in support of Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (GPFA-AB, DOE Project DE-EE0006726). Phase 1 of the GPFA-AB project identified potential Geothermal Play Fairways within the Appalachian basin of Pennsylvania, West Virginia, and New York. This was accomplished through analysis of 4 key criteria or ‘risks’: thermal quality, natural reservoir productivity, risk of seismicity, and heat utilization. Each of these analyses represents a distinct project task, with the fifth task encompassing the combination of the 4 risk factors. Supporting data for all five tasks have been uploaded into the Geothermal Data Repository node of the National Geothermal Data System (NGDS). This submission comprises the data for Thermal Quality Analysis (project task 1) and includes all of the necessary shapefiles, rasters, datasets, code, and references to code repositories that were used to create the thermal resource and risk factor maps as part of the GPFA-AB project. The identified Geothermal Play Fairways are also provided with the larger dataset. Figures (.png) are provided as examples of the shapefiles and rasters. The regional standardized 1 square km grid used in the project is also provided as points (cell centers), polygons, and as a raster. Two ArcGIS toolboxes are available: 1) RegionalGridModels.tbx for creating resource and risk factor maps on the standardized grid, and 2) ThermalRiskFactorModels.tbx for use in making the thermal resource maps and cross sections. These toolboxes contain “item description” documentation for each model within the toolbox, and for the toolbox itself. This submission also contains three R scripts: 1) AddNewSeisFields.R, which adds seismic risk data to attribute tables; 2) StratifiedKrigingInterpolation.R, which performs the interpolations used in the thermal resource analysis; and 3) LeaveOneOutCrossValidation.R, which runs the cross validations used in the thermal interpolations. Some file descriptions make reference to various 'memos'. These are contained within the final report submitted October 16, 2015. Each zipped file in the submission contains an 'about' document describing the full Thermal Quality Analysis content available, along with key sources, authors, citation, use guidelines, and assumptions, with the specific file(s) contained within the .zip file highlighted.
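
    The cross-validation script above is in R; as a language-neutral illustration of the leave-one-out idea it implements, the following Python sketch uses inverse-distance weighting in place of the project's stratified kriging, with synthetic well data:

      # Illustration only: leave-one-out cross-validation of a spatial
      # interpolator. The project's actual script is in R and uses stratified
      # kriging; here inverse-distance weighting stands in, on synthetic data.
      import numpy as np

      def idw_predict(xy_train, z_train, xy_test, power=2.0):
          d = np.linalg.norm(xy_train - xy_test, axis=1)
          if np.any(d == 0):
              return z_train[np.argmin(d)]
          w = 1.0 / d**power
          return np.sum(w * z_train) / np.sum(w)

      rng = np.random.default_rng(1)
      xy = rng.uniform(0, 100, size=(50, 2))               # well locations (synthetic)
      z = 25.0 + 0.3 * xy[:, 0] + rng.normal(0, 1.5, 50)   # e.g. a thermal gradient

      errors = []
      for i in range(len(z)):                              # leave one well out at a time
          keep = np.arange(len(z)) != i
          z_hat = idw_predict(xy[keep], z[keep], xy[i])
          errors.append(z[i] - z_hat)
      print(f"LOOCV RMSE = {np.sqrt(np.mean(np.square(errors))):.2f}")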

  9. Error-Analysis for Correctness, Effectiveness, and Composing Procedure.

    ERIC Educational Resources Information Center

    Ewald, Helen Rothschild

    The assumptions underpinning grammatical mistakes can often be detected by looking for patterns of errors in a student's work. Assumptions that negatively influence rhetorical effectiveness can similarly be detected through error analysis. On a smaller scale, error analysis can also reveal assumptions affecting rhetorical choice. Snags in the…

  10. Comparing published scientific journal articles to their pre-print versions

    DOE PAGES

    Klein, Martin; Broadwell, Peter; Farb, Sharon E.; ...

    2018-02-05

    Academic publishers claim that they add value to scholarly communications by coordinating reviews and contributing and enhancing text during publication. These contributions come at a considerable cost: US academic libraries paid $1.7 billion for serial subscriptions in 2008 alone. Library budgets, in contrast, are flat and not able to keep pace with serial price inflation. Here, we have investigated the publishers’ value proposition by conducting a comparative study of pre-print papers from two distinct science, technology, and medicine corpora and their final published counterparts. This comparison had two working assumptions: (1) If the publishers’ argument is valid, the text of a pre-print paper should vary measurably from its corresponding final published version, and (2) by applying standard similarity measures, we should be able to detect and quantify such differences. Our analysis revealed that the text contents of the scientific papers generally changed very little from their pre-print to final published versions. These findings contribute empirical indicators to discussions of the added value of commercial publishers and therefore should influence libraries’ economic decisions regarding access to scholarly publications.
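
    As a minimal illustration of the comparison approach (not the study's actual pipeline), one standard similarity measure can be computed with Python's difflib; the strings here are hypothetical:

      # Illustration only: score the similarity of a pre-print against its
      # published version. difflib's ratio is one simple standard measure;
      # the study's actual metrics may differ. Strings are hypothetical.
      from difflib import SequenceMatcher

      preprint = "We investigate the publishers' value proposition using two corpora."
      published = "We investigated the publishers' value proposition using two corpora."

      ratio = SequenceMatcher(None, preprint, published).ratio()
      print(f"similarity = {ratio:.3f}")  # near 1.0 -> text changed very little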

  11. Cognitions about bodily purity attenuate stress perception.

    PubMed

    Kaspar, Kai; Cames, Sarah

    2016-12-09

    Based on the assumption that physical purity is associated with a clean slate impression, we examined how cognitions about bodily cleanliness modulate stress perception. Participants visualized themselves in a clean or dirty state before reporting the frequency of stress-related situations experienced in the past. In Study 1 (n = 519) and Study 2 (n = 647) cleanliness versus dirtiness cognitions reliably reduced stress perception. Further results and a mediation analysis revealed that this novel effect was not simply driven by participants' cognitive engagement in stress recall. Moreover, we found that participants' temporal engagement in the recall of past stressful events negatively correlated with the amount of perceived stress, indicating an ease-of-retrieval phenomenon. However, a direct manipulation of the number of recalled stressful events in Study 3 (n = 792) showed the opposite effect: few versus many recalled events increased the perceived frequency of past stress-related situations. Overall, these novel results indicate an interesting avenue for future research on cognitively oriented stress reduction interventions, add to the literature on purity-related clean slate effects, and may help to better understand washing rituals in patients with obsessive-compulsive disorders.

  12. Comparing published scientific journal articles to their pre-print versions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, Martin; Broadwell, Peter; Farb, Sharon E.

    Academic publishers claim that they add value to scholarly communications by coordinating reviews and contributing and enhancing text during publication. These contributions come at a considerable cost: US academic libraries paid $1.7 billion for serial subscriptions in 2008 alone. Library budgets, in contrast, are flat and not able to keep pace with serial price inflation. Here, we have investigated the publishers’ value proposition by conducting a comparative study of pre-print papers from two distinct science, technology, and medicine corpora and their final published counterparts. This comparison had two working assumptions: (1) If the publishers’ argument is valid, the text of a pre-print paper should vary measurably from its corresponding final published version, and (2) by applying standard similarity measures, we should be able to detect and quantify such differences. Our analysis revealed that the text contents of the scientific papers generally changed very little from their pre-print to final published versions. These findings contribute empirical indicators to discussions of the added value of commercial publishers and therefore should influence libraries’ economic decisions regarding access to scholarly publications.

  13. Fine Scale Modeling and Forecasts of Upper Atmospheric Turbulence for Operational Use

    DTIC Science & Technology

    2014-11-30

    Weather Center Digital Data Service (ADDS): http://www.aviationweather.gov/adds, http://weather.aero/. Graphical Turbulence Guidance product, GTG-2.5... Acronym list: GTG - Graphical Turbulence Guidance; HRMM - High Resolution Mesoscale/Microscale; ICD - Interface Control Document; IDE - Integrated Development... ADDS operational site (with GTG 2.5 data): http://www.aviationweather.gov/turbulence. ADDS experimental site: http://weather.aero/. NCEP FNL data: http...

  14. A meta-analysis of the relationship between rational beliefs and psychological distress.

    PubMed

    Oltean, Horea-Radu; David, Daniel Ovidiu

    2018-06-01

    The rational emotive behavior therapy (REBT) model of psychological health assumes that rational beliefs cause functional emotions and adaptive behavior, but the presumed role of rational beliefs as a protective factor against psychological distress/disorders is still in debate. An important step in validating an evidence-based therapy is to investigate its underlying theoretical assumptions. Thus, the aim of the present meta-analysis is to investigate the direction and magnitude of the relationship between rational beliefs and psychological distress. Our search identified 26 studies that met our criteria. We evaluated the effect size using the random-effects model and we tested the moderating role of several variables. The overall results revealed a medium negative association between rational beliefs and psychological distress, r = -0.31. The strongest association was found for unconditional acceptance beliefs (r = -0.41). The results add empirical evidence for the underlying theory of REBT and revealed that the strength of the association between rational beliefs and distress is robust across a wide range of emotional problems. Therefore, rational beliefs could be a trans-diagnostic protective factor against distress. Moreover, results emphasized that the type of rational belief is an important factor, suggesting an increased focus in therapy on developing unconditional acceptance and self-acceptance beliefs. © 2017 Wiley Periodicals, Inc.
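
    For illustration, a minimal random-effects pooling sketch in the DerSimonian-Laird style, using hypothetical study correlations rather than the meta-analysis's data:

      # Illustration only: DerSimonian-Laird random-effects pooling of
      # correlations via Fisher-z. All study values are hypothetical.
      import numpy as np

      r = np.array([-0.25, -0.35, -0.30, -0.45, -0.20])    # hypothetical study r's
      n = np.array([120, 85, 200, 60, 150])                # hypothetical sample sizes

      y = np.arctanh(r)                 # Fisher-z effect sizes
      v = 1.0 / (n - 3)                 # within-study variances
      w = 1.0 / v
      y_fixed = np.sum(w * y) / np.sum(w)
      Q = np.sum(w * (y - y_fixed) ** 2)
      c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
      tau2 = max(0.0, (Q - (len(y) - 1)) / c)              # between-study variance

      w_star = 1.0 / (v + tau2)
      y_pooled = np.sum(w_star * y) / np.sum(w_star)
      print(f"pooled r = {np.tanh(y_pooled):.2f}, tau^2 = {tau2:.4f}")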

  15. A new paradigm for clinical communication: critical review of literature in cancer care.

    PubMed

    Salmon, Peter; Young, Bridget

    2017-03-01

    To: (i) identify key assumptions of the scientific 'paradigm' that shapes clinical communication research and education in cancer care; (ii) show that, as general rules, these do not match patients' own priorities for communication; and (iii) suggest how the paradigm might change to reflect evidence better and thereby serve patients better. A critical review, focusing on cancer care. We identified assumptions about patients' and clinicians' roles in recent position and policy statements. We examined these in light of research evidence, focusing on inductive research that has not itself been constrained by those assumptions, and considering the institutionalised interests that the assumptions might serve. The current paradigm constructs patients simultaneously as needy (requiring clinicians' explicit emotional support) and robust (seeking information and autonomy in decision making). Evidence indicates, however, that patients generally value clinicians who emphasise expert clinical care rather than counselling, and who lead decision making. In denoting communication as a technical skill, the paradigm constructs clinicians as technicians; however, communication cannot be reduced to technical skills, and teaching clinicians 'communication skills' has not clearly benefited patients. The current paradigm is therefore defined by assumptions that have not arisen from evidence. A paradigm for clinical communication that makes its starting point the roles that mortal illness gives patients and clinicians would emphasise patients' vulnerability and clinicians' goal-directed expertise. Attachment theory provides a knowledge base to inform both research and education. Researchers will need to be alert to political interests that seek to mould patients into 'consumers', and to professional interests that seek to add explicit psychological dimensions to clinicians' roles. New approaches to education will be needed to support clinicians' curiosity and goal-directed judgement in applying this knowledge. The test for the new paradigm will be whether the research and education it promotes benefit patients. © 2016 The Authors. Medical Education published by Association for the Study of Medical Education and John Wiley & Sons Ltd.

  16. Detecting transitions in protein dynamics using a recurrence quantification analysis based bootstrap method.

    PubMed

    Karain, Wael I

    2017-11-28

    Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques such as principal component analysis are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that has the ability to detect transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis based bootstrap technique is able to detect transitions between different dynamics states for a protein over different time scales. It is not limited to linear dynamics regimes, and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need for time windows large enough to ensure good statistical quality for the recurrence complexity measures needed to detect the transitions.
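
    A minimal sketch of the method's logic, under strong simplifying assumptions (synthetic trajectory, recurrence rate as the only complexity measure, early windows taken as the baseline):

      # Illustration only: per-window recurrence rate, a bootstrapped baseline
      # distribution, and flagging of windows that deviate from it.
      # Trajectory data are synthetic, not from the BLIP simulation.
      import numpy as np

      def recurrence_rate(traj, eps):
          d = np.linalg.norm(traj[:, None, :] - traj[None, :, :], axis=-1)
          return np.mean(d < eps)

      rng = np.random.default_rng(2)
      frames = rng.normal(0, 1.0, size=(1200, 3))
      frames[800:] += 2.5                                  # injected "transition"

      windows = frames.reshape(12, 100, 3)
      rr = np.array([recurrence_rate(w, eps=1.5) for w in windows])

      baseline = rr[:6]                                    # assume early windows = baseline
      boot = rng.choice(baseline, size=(2000, len(baseline)), replace=True).mean(axis=1)
      lo, hi = np.percentile(boot, [2.5, 97.5])
      flags = (rr < lo) | (rr > hi)
      print("transition windows:", np.where(flags)[0])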

  17. Impact of actuarial assumptions on pension costs: A simulation analysis

    NASA Astrophysics Data System (ADS)

    Yusof, Shaira; Ibrahim, Rose Irnawaty

    2013-04-01

    This study investigates the sensitivity of pension costs to changes in the underlying assumptions of a hypothetical pension plan, in order to gain a perspective on the relative importance of the various actuarial assumptions via a simulation analysis. Two actuarial assumptions are considered: mortality rates and interest rates. To calculate pension costs, the Accrued Benefit Cost Method is used with both the constant amount (CA) and the constant percentage of salary (CS) modifications. The mortality assumptions and the implied mortality experience of the plan can potentially have a significant impact on pension costs, while interest rate assumptions are inversely related to pension costs. The results of the study have important implications for analysts of pension costs.
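
    A minimal sketch of the interest-rate sensitivity described above: the present value of a life annuity falls as the assumed interest rate rises, so pension cost and interest rate move inversely. The survival curve and benefit below are hypothetical stand-ins, not the study's assumptions:

      # Illustration only: present value of an annual pension benefit under
      # two discount-rate assumptions. Survival curve and benefit are hypothetical.
      import numpy as np

      def pension_pv(benefit, rate, survival):
          """PV at retirement of an annual benefit paid while the member survives."""
          t = np.arange(len(survival))
          return np.sum(benefit * survival / (1 + rate) ** t)

      survival = 0.98 ** np.arange(25)      # crude stand-in for a mortality table
      for rate in (0.04, 0.06):
          print(f"rate {rate:.0%}: PV = {pension_pv(20_000, rate, survival):,.0f}")
      # Higher assumed interest rate -> lower present value -> lower pension cost.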

  18. Making Sense out of Sex Stereotypes in Advertising: A Feminist Analysis of Assumptions.

    ERIC Educational Resources Information Center

    Ferrante, Karlene

    Sexism and racism in advertising have been well documented, but feminist research aimed at social change must go beyond existing content analyses to ask how advertising is created. Analysis of the "mirror assumption" (advertising reflects society) and the "gender assumption" (advertising speaks in a male voice to female…

  19. Beware the black box: investigating the sensitivity of FEA simulations to modelling factors in comparative biomechanics.

    PubMed

    Walmsley, Christopher W; McCurry, Matthew R; Clausen, Phillip D; McHenry, Colin R

    2013-01-01

    Finite element analysis (FEA) is a computational technique of growing popularity in the field of comparative biomechanics, and is an easily accessible platform for form-function analyses of biological structures. However, its rapid evolution in recent years from a novel approach to common practice demands some scrutiny in regards to the validity of results and the appropriateness of assumptions inherent in setting up simulations. Both validation and sensitivity analyses remain unexplored in many comparative analyses, and assumptions considered to be 'reasonable' are often assumed to have little influence on the results and their interpretation. Here we report an extensive sensitivity analysis where high resolution finite element (FE) models of mandibles from seven species of crocodile were analysed under loads typical for comparative analysis: biting, shaking, and twisting. Simulations explored the effect on both the absolute response and the interspecies pattern of results to variations in commonly used input parameters. Our sensitivity analysis focuses on assumptions relating to the selection of material properties (heterogeneous or homogeneous), scaling (standardising volume, surface area, or length), tooth position (front, mid, or back tooth engagement), and linear load case (type of loading for each feeding type). Our findings show that in a comparative context, FE models are far less sensitive to the selection of material property values and scaling to either volume or surface area than they are to those assumptions relating to the functional aspects of the simulation, such as tooth position and linear load case. Results show a complex interaction between simulation assumptions, depending on the combination of assumptions and the overall shape of each specimen. Keeping assumptions consistent between models in an analysis does not ensure that results can be generalised beyond the specific set of assumptions used. Logically, different comparative datasets would also be sensitive to identical simulation assumptions; hence, modelling assumptions should undergo rigorous selection. The accuracy of input data is paramount, and simulations should focus on taking biological context into account. Ideally, validation of simulations should be addressed; however, where validation is impossible or unfeasible, sensitivity analyses should be performed to identify which assumptions have the greatest influence upon the results.

  20. Beware the black box: investigating the sensitivity of FEA simulations to modelling factors in comparative biomechanics

    PubMed Central

    McCurry, Matthew R.; Clausen, Phillip D.; McHenry, Colin R.

    2013-01-01

    Finite element analysis (FEA) is a computational technique of growing popularity in the field of comparative biomechanics, and is an easily accessible platform for form-function analyses of biological structures. However, its rapid evolution in recent years from a novel approach to common practice demands some scrutiny in regards to the validity of results and the appropriateness of assumptions inherent in setting up simulations. Both validation and sensitivity analyses remain unexplored in many comparative analyses, and assumptions considered to be ‘reasonable’ are often assumed to have little influence on the results and their interpretation. Here we report an extensive sensitivity analysis where high resolution finite element (FE) models of mandibles from seven species of crocodile were analysed under loads typical for comparative analysis: biting, shaking, and twisting. Simulations explored the effect on both the absolute response and the interspecies pattern of results to variations in commonly used input parameters. Our sensitivity analysis focuses on assumptions relating to the selection of material properties (heterogeneous or homogeneous), scaling (standardising volume, surface area, or length), tooth position (front, mid, or back tooth engagement), and linear load case (type of loading for each feeding type). Our findings show that in a comparative context, FE models are far less sensitive to the selection of material property values and scaling to either volume or surface area than they are to those assumptions relating to the functional aspects of the simulation, such as tooth position and linear load case. Results show a complex interaction between simulation assumptions, depending on the combination of assumptions and the overall shape of each specimen. Keeping assumptions consistent between models in an analysis does not ensure that results can be generalised beyond the specific set of assumptions used. Logically, different comparative datasets would also be sensitive to identical simulation assumptions; hence, modelling assumptions should undergo rigorous selection. The accuracy of input data is paramount, and simulations should focus on taking biological context into account. Ideally, validation of simulations should be addressed; however, where validation is impossible or unfeasible, sensitivity analyses should be performed to identify which assumptions have the greatest influence upon the results. PMID:24255817

  1. Misleading Theoretical Assumptions in Hypertext/Hypermedia Research.

    ERIC Educational Resources Information Center

    Tergan, Sigmar-Olaf

    1997-01-01

    Reviews basic theoretical assumptions of research on learning with hypertext/hypermedia. Focuses on whether the results of research on hypertext/hypermedia-based learning support these assumptions. Results of empirical studies and theoretical analysis reveal that many research approaches have been misled by inappropriate theoretical assumptions on…

  2. Selecting between-sample RNA-Seq normalization methods from the perspective of their assumptions.

    PubMed

    Evans, Ciaran; Hardin, Johanna; Stoebel, Daniel M

    2017-02-27

    RNA-Seq is a widely used method for studying the behavior of genes under different biological conditions. An essential step in an RNA-Seq study is normalization, in which raw data are adjusted to account for factors that prevent direct comparison of expression measures. Errors in normalization can have a significant impact on downstream analysis, such as inflated false positives in differential expression analysis. An underemphasized feature of normalization is the assumptions on which the methods rely, and how the validity of these assumptions can have a substantial impact on the performance of the methods. In this article, we explain how assumptions provide the link between raw RNA-Seq read counts and meaningful measures of gene expression. We examine normalization methods from the perspective of their assumptions, as an understanding of methodological assumptions is necessary for choosing methods appropriate for the data at hand. Furthermore, we discuss why normalization methods perform poorly when their assumptions are violated and how this causes problems in subsequent analysis. To analyze a biological experiment, researchers must select a normalization method with assumptions that are met and that produces a meaningful measure of expression for the given experiment. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
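
    As one concrete example of the assumption-dependence the authors describe, here is a minimal sketch of median-of-ratios (DESeq-style) size-factor estimation, which assumes most genes are not differentially expressed; the counts are synthetic:

      # Illustration only: median-of-ratios size factors, whose key assumption
      # is that most genes are unchanged between samples. Counts are synthetic.
      import numpy as np

      rng = np.random.default_rng(3)
      counts = rng.poisson(50, size=(1000, 4)) * np.array([1.0, 1.5, 0.7, 2.0])

      log_counts = np.log(counts + 1e-9)
      ref = log_counts.mean(axis=1)                      # per-gene geometric-mean reference
      size_factors = np.exp(np.median(log_counts - ref[:, None], axis=0))
      normalized = counts / size_factors                 # comparable across samples
      print("estimated size factors:", np.round(size_factors, 2))
      # If the most-genes-unchanged assumption fails, these factors are biased.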

  3. Questionable assumptions hampered interpretation of a network meta-analysis of primary care depression treatments.

    PubMed

    Linde, Klaus; Rücker, Gerta; Schneider, Antonius; Kriston, Levente

    2016-03-01

    We aimed to evaluate the underlying assumptions of a network meta-analysis investigating which depression treatment works best in primary care and to highlight challenges and pitfalls of interpretation under consideration of these assumptions. We reviewed 100 randomized trials investigating pharmacologic and psychological treatments for primary care patients with depression. Network meta-analysis was carried out within a frequentist framework using response to treatment as outcome measure. Transitivity was assessed by epidemiologic judgment based on theoretical and empirical investigation of the distribution of trial characteristics across comparisons. Homogeneity and consistency were investigated by decomposing the Q statistic. There were important clinical and statistically significant differences between "pure" drug trials comparing pharmacologic substances with each other or placebo (63 trials) and trials including a psychological treatment arm (37 trials). Overall network meta-analysis produced results well comparable with separate meta-analyses of drug trials and psychological trials. Although the homogeneity and consistency assumptions were mostly met, we considered the transitivity assumption unjustifiable. An exchange of experience between reviewers and, if possible, some guidance on how reviewers addressing important clinical questions can proceed in situations where important assumptions for valid network meta-analysis are not met would be desirable. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Skin dose saving of the staff in 90Y/177Lu peptide receptor radionuclide therapy with the automatic dose dispenser.

    PubMed

    Fioroni, Federica; Grassi, Elisa; Giorgia, Cavatorta; Sara, Rubagotti; Piccagli, Vando; Filice, Angelina; Mostacci, Domiziano; Versari, Annibale; Iori, Mauro

    2016-10-01

    When handling 90Y-labelled and 177Lu-labelled radiopharmaceuticals, skin exposure is mainly due to β-particles. This study aimed to investigate the equivalent dose saving of the staff when changing from an essentially manual radiolabelling procedure to an automatic dose dispenser (ADD). The chemist and physician were asked to wear thermoluminescence dosimeters on their fingertips to evaluate the quantity of Hp(0.07) on the skin. Data collected were divided into two groups: before introducing ADD (no ADD) and after introducing ADD. For the chemist, the mean values (95th percentile) of Hp(0.07) for no ADD and ADD are 0.030 (0.099) and 0.019 (0.076) mSv/GBq, respectively, for 90Y, and 0.022 (0.037) and 0.007 (0.023) mSv/GBq, respectively, for 177Lu. The reduction for ADD was significant (t-test with P<0.05) for both isotopes. The relative differences before and after ADD collected for every finger were treated using the Wilcoxon test, proving a significantly higher reduction in extremity dose to each fingertip for 177Lu than for 90Y (P<0.05). For the medical staff, the mean values of Hp(0.07) (95th percentile) for no ADD and ADD are 0.021 (0.0762) and 0.0143 (0.0565) mSv/GBq, respectively, for 90Y, and 0.0011 (0.00196) and 0.0009 (0.00263) mSv/GBq, respectively, for 177Lu. The t-test provided a P-value less than 0.05 for both isotopes, making the difference between ADD and no ADD significant. ADD positively affects the dose saving of the chemist in handling both isotopes. For the medical staff not directly involved with the introduction of the ADD system, the analysis shows a learning curve of the workers over a 5-year period. Specific devices and procedures allow staff skin dose to be limited.

  5. Niches, models, and climate change: Assessing the assumptions and uncertainties

    PubMed Central

    Wiens, John A.; Stralberg, Diana; Jongsomjit, Dennis; Howell, Christine A.; Snyder, Mark A.

    2009-01-01

    As the rate and magnitude of climate change accelerate, understanding the consequences becomes increasingly important. Species distribution models (SDMs) based on current ecological niche constraints are used to project future species distributions. These models contain assumptions that add to the uncertainty in model projections stemming from the structure of the models, the algorithms used to translate niche associations into distributional probabilities, the quality and quantity of data, and mismatches between the scales of modeling and data. We illustrate the application of SDMs using two climate models and two distributional algorithms, together with information on distributional shifts in vegetation types, to project fine-scale future distributions of 60 California landbird species. Most species are projected to decrease in distribution by 2070. Changes in total species richness vary over the state, with large losses of species in some “hotspots” of vulnerability. Differences in distributional shifts among species will change species co-occurrences, creating spatial variation in similarities between current and future assemblages. We use these analyses to consider how assumptions can be addressed and uncertainties reduced. SDMs can provide a useful way to incorporate future conditions into conservation and management practices and decisions, but the uncertainties of model projections must be balanced with the risks of taking the wrong actions or the costs of inaction. Doing this will require that the sources and magnitudes of uncertainty are documented, and that conservationists and resource managers be willing to act despite the uncertainties. The alternative, of ignoring the future, is not an option. PMID:19822750

  6. Using "Excel" for White's Test--An Important Technique for Evaluating the Equality of Variance Assumption and Model Specification in a Regression Analysis

    ERIC Educational Resources Information Center

    Berenson, Mark L.

    2013-01-01

    There is consensus in the statistical literature that severe departures from its assumptions invalidate the use of regression modeling for purposes of inference. The assumptions of regression modeling are usually evaluated subjectively through visual, graphic displays in a residual analysis but such an approach, taken alone, may be insufficient…

  7. A developmental dose-response analysis of the effects of methylphenidate on the peer interactions of attention deficit disordered boys.

    PubMed

    Cunningham, C E; Siegel, L S; Offord, D R

    1985-11-01

    Mixed dyads of 42 normal and 42 ADD boys were videotaped in free play, co-operative task, and simulated classrooms. ADD boys received placebo, 0.15 mg/kg, and 0.50 mg/kg of methylphenidate. ADD boys were more active and off task, watched peers less, and scored lower on mathematics and visual-motor tasks. Older boys interacted less, ignored peer interactions and play more frequently, were less controlling, and more compliant. In class, methylphenidate improved visual motor scores, and reduced the controlling behaviour, activity level, and off task behaviour of ADD boys. Normal peers displayed reciprocal reductions in controlling behaviour, activity level, and off task behaviour.

  8. Dimeric spectra analysis in Microsoft Excel: a comparative study.

    PubMed

    Gilani, A Ghanadzadeh; Moghadam, M; Zakerhamidi, M S

    2011-11-01

    The purpose of this work is to introduce the reader to an Add-in implementation, Decom. This implementation provides all of the processing required for analysis of dimeric spectra. General linear and nonlinear decomposition algorithms were integrated as an Excel Add-in for easy installation and usage. In this work, the results of several sample investigations were compared to those obtained by Datan. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
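
    A minimal sketch of the underlying decomposition idea (not the Decom add-in itself): given a measured spectrum and known monomer and dimer band shapes, the two contributions can be recovered by non-negative least squares; all spectra here are synthetic:

      # Illustration only: linear decomposition of a dimeric spectrum into
      # monomer and dimer contributions. Band shapes and values are hypothetical.
      import numpy as np
      from scipy.optimize import nnls

      wl = np.linspace(400, 700, 150)
      monomer = np.exp(-((wl - 500) / 30) ** 2)            # hypothetical band shapes
      dimer = np.exp(-((wl - 560) / 35) ** 2)

      true_cm, true_cd = 0.8, 0.3
      noise = np.random.default_rng(5).normal(0, 0.01, wl.size)
      measured = true_cm * monomer + true_cd * dimer + noise

      basis = np.column_stack([monomer, dimer])
      coeffs, _ = nnls(basis, measured)                    # non-negative contributions
      print(f"monomer = {coeffs[0]:.2f}, dimer = {coeffs[1]:.2f}")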

  9. Mid-frequency Environmental and Acoustic Studies From SW06, and Applications to Asian Littoral Waters

    DTIC Science & Technology

    2008-09-30

    Research Laboratory, the Korean Agency for Defense Development (ADD), and Hanyang University (HYU), to undertake collaborative research programs in... objective was to participate in an experiment off the coast of Korea with NRL, ADD, and HYU that occurred in August. APPROACH The main item of... directions with respect to the MORAY. Key individuals involved in SW06 data analysis include Jeewoong Choi (HYU) in relation to propagation analysis

  10. One Hundred Ways to be Non-Fickian - A Rigorous Multi-Variate Statistical Analysis of Pore-Scale Transport

    NASA Astrophysics Data System (ADS)

    Most, Sebastian; Nowak, Wolfgang; Bijeljic, Branko

    2015-04-01

    Fickian transport in groundwater flow is the exception rather than the rule. Transport in porous media is frequently simulated via particle methods (i.e. particle tracking random walk (PTRW) or continuous time random walk (CTRW)). These methods formulate transport as a stochastic process of particle position increments. At the pore scale, geometry and micro-heterogeneities prohibit the commonly made assumption of independent and normally distributed increments to represent dispersion. Many recent particle methods seek to loosen this assumption. Hence, it is important to get a better understanding of the processes at pore scale. For our analysis we track the positions of 10,000 particles migrating through the pore space over time. The data we use come from micro CT scans of a homogeneous sandstone and encompass about 10 grain sizes. Based on those images we discretize the pore structure and simulate flow at the pore scale based on the Navier-Stokes equation. This flow field realistically describes flow inside the pore space and we do not need to add artificial dispersion during the transport simulation. Next, we use particle tracking random walk and simulate pore-scale transport. Finally, we use the obtained particle trajectories to do a multivariate statistical analysis of the particle motion at the pore scale. Our analysis is based on copulas. Every multivariate joint distribution is a combination of its univariate marginal distributions. The copula represents the dependence structure of those univariate marginals and is therefore useful to observe correlation and non-Gaussian interactions (i.e. non-Fickian transport). The first goal of this analysis is to better understand the validity regions of commonly made assumptions. We are investigating three different transport distances: 1) The distance where the statistical dependence between particle increments can be modelled as an order-one Markov process. This would be the Markovian distance for the process, where the validity of yet-unexplored non-Gaussian-but-Markovian random walks starts. 2) The distance where bivariate statistical dependence simplifies to a multi-Gaussian dependence based on simple linear correlation (validity of correlated PTRW/CTRW). 3) The distance of complete statistical independence (validity of classical PTRW/CTRW). The second objective is to reveal the characteristic dependencies influencing transport the most. Those dependencies can be very complex. Copulas are highly capable of representing linear dependence as well as non-linear dependence. With that tool we are able to detect persistent characteristics dominating transport even across different scales. The results derived from our experimental data set suggest that there are many more non-Fickian aspects of pore-scale transport than are captured by the univariate statistics of longitudinal displacements. Non-Fickianity can also be found in transverse displacements, and in the relations between increments at different time steps. Also, the dependence we find is non-linear (i.e. beyond simple correlation) and persists over long distances. Thus, our results strongly support the further refinement of techniques like correlated PTRW or correlated CTRW towards non-linear statistical relations.
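
    A minimal sketch of the simplest dependence check implied above: rank correlation between successive increments. A full copula analysis goes further, but a nonzero lag-1 Spearman's rho already contradicts the independence assumed by classical PTRW/CTRW. The data here are synthetic:

      # Illustration only: detect lag-1 dependence in particle increments.
      # The synthetic increments are deliberately correlated and non-Gaussian.
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(6)
      n = 5000
      increments = np.empty(n)
      increments[0] = rng.normal()
      for t in range(1, n):                              # correlated, skewed steps
          increments[t] = 0.6 * increments[t - 1] + rng.exponential(1.0) - 1.0

      rho, p = spearmanr(increments[:-1], increments[1:])
      print(f"lag-1 Spearman rho = {rho:.2f} (p = {p:.1e})")  # rho != 0 -> not independent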

  11. A Managerial Approach to Compensation

    ERIC Educational Resources Information Center

    Wolfe, Arthur V.

    1975-01-01

    The article examines the major external forces constraining equitable employee compensation, sets forth the classical employee compensation assumptions, suggests somewhat more realistic employee compensation assumptions, and proposes guidelines based on analysis of these external constraints and assumptions. (Author)

  12. A complete graphical criterion for the adjustment formula in mediation analysis.

    PubMed

    Shpitser, Ilya; VanderWeele, Tyler J

    2011-03-04

    Various assumptions have been used in the literature to identify natural direct and indirect effects in mediation analysis. These effects are of interest because they allow for effect decomposition of a total effect into a direct and indirect effect even in the presence of interactions or non-linear models. In this paper, we consider the relation and interpretation of various identification assumptions in terms of causal diagrams interpreted as a set of non-parametric structural equations. We show that for such causal diagrams, two sets of assumptions for identification that have been described in the literature are in fact equivalent in the sense that if either set of assumptions holds for all models inducing a particular causal diagram, then the other set of assumptions will also hold for all models inducing that diagram. We moreover build on prior work concerning a complete graphical identification criterion for covariate adjustment for total effects to provide a complete graphical criterion for using covariate adjustment to identify natural direct and indirect effects. Finally, we show that this criterion is equivalent to the two sets of independence assumptions used previously for mediation analysis.
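
    For reference, the mediation functional that such adjustment sets identify is standard; in the usual notation (treatment a versus reference a*, mediator m, covariates c):

      % Natural direct and indirect effects under covariate adjustment:
      \mathrm{NDE} = \sum_{c,m}\bigl[E(Y \mid a, m, c) - E(Y \mid a^{*}, m, c)\bigr]\, P(m \mid a^{*}, c)\, P(c)
      \mathrm{NIE} = \sum_{c,m} E(Y \mid a, m, c)\,\bigl[P(m \mid a, c) - P(m \mid a^{*}, c)\bigr]\, P(c)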

  13. A common reference-based indirect comparison meta-analysis of eslicarbazepine versus lacosamide as add on treatments for focal epilepsy.

    PubMed

    Brigo, Francesco; Trinka, Eugen; Bragazzi, Nicola Luigi; Nardone, Raffaele; Milan, Alberto; Grillo, Elisabetta

    2016-11-01

    Eslicarbazepine acetate (ESL) and lacosamide (LCM) have recently emerged as add-on treatments in patients with focal epilepsy experiencing seizures despite adequate monotherapy. Both drugs enhance slow inactivation of voltage-gated sodium channels. To date no randomized controlled trial (RCT) has directly compared ESL with LCM as add-on treatments for focal epilepsy. Our aim was to indirectly compare the efficacy of ESL and LCM used as add-on treatments in patients with focal epilepsy using common reference-based indirect comparison meta-analysis. We systematically searched RCTs in which ESL or LCM has been used as add-on treatment in patients with focal epilepsy and compared with placebo. Following outcomes were considered: ≥50% reduction in seizure frequency; seizure freedom; treatment withdrawal for any reason; ≥25% increase in seizure frequency. Random-effects Mantel-Haenszel meta-analyses were performed to obtain odds ratios (ORs) for the efficacy of ESL or LCM versus placebo. Adjusted indirect comparisons were then made between ESL and LCM using the obtained results, and comparing the minimum and the highest effective recommended daily dose of each drug. Eight studies were included. Indirect comparisons adjusted for dose-effect showed no difference between ESL and LCM for responder rate, seizure freedom, and withdrawal rates. We could not assess increase in seizure frequency due to lack of data. Indirect comparisons failed to find a significant difference in efficacy between add-on ESL and LCM in patients with focal epilepsy. Direct head-to-head clinical trials comparing ESL with LCM as add-on antiepileptic treatment are required to confirm these results. Copyright © 2016 Elsevier B.V. All rights reserved.
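
    A minimal sketch of the common reference-based (Bucher) indirect comparison used here: each drug's log odds ratio versus placebo is differenced, and the standard errors combine in quadrature. The numbers below are hypothetical, not the meta-analysis's results:

      # Illustration only: Bucher adjusted indirect comparison through a
      # common placebo comparator. ORs and SEs are hypothetical.
      import numpy as np
      from scipy import stats

      log_or_esl, se_esl = np.log(1.8), 0.15    # ESL vs placebo (hypothetical)
      log_or_lcm, se_lcm = np.log(1.7), 0.18    # LCM vs placebo (hypothetical)

      log_or_ind = log_or_esl - log_or_lcm      # ESL vs LCM via the common comparator
      se_ind = np.sqrt(se_esl**2 + se_lcm**2)
      z = stats.norm.ppf(0.975)
      lo, hi = np.exp(log_or_ind - z * se_ind), np.exp(log_or_ind + z * se_ind)
      print(f"indirect OR = {np.exp(log_or_ind):.2f} (95% CI {lo:.2f}-{hi:.2f})")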

  14. Miners, masculinity and the "Bataille du Charbon" in France, 1944-1948.

    PubMed

    Diamond, Hanna

    2011-01-01

    In 1944, the French provisional government, backed by the Parti communiste français and the Confédération générale du travail, undertook an aggressive propaganda campaign to persuade miners to embark upon a 'battle for coal', which elevated their efforts in extracting coal to a national endeavour. At the same time, miners had great hopes that nationalisation of the coal industry, under discussion at this time, would bring significant improvement to their working lives. In identifying the ways in which publicists posited miners as an ideal of working-class manhood, this article will argue that "la bataille du charbon" marks a crucial moment in the celebration of working-class masculinity and that the "statut des mineurs" which was passed in 1946 as a part of nationalisation enshrined many of the existing gender assumptions about mining life. What does incorporating gender into an analysis of the treatment of miners in the years 1944-1948 add to our understanding of the various economic, political, and social dynamics around "la bataille du charbon"? How do these insights inform our perceptions of French coalfield societies in the mid-twentieth century?

  15. GLACiAR, an Open-Source Python Tool for Simulations of Source Recovery and Completeness in Galaxy Surveys

    NASA Astrophysics Data System (ADS)

    Carrasco, D.; Trenti, M.; Mutch, S.; Oesch, P. A.

    2018-06-01

    The luminosity function is a fundamental observable for characterising how galaxies form and evolve throughout the cosmic history. One key ingredient to derive this measurement from the number counts in a survey is the characterisation of the completeness and redshift selection functions for the observations. In this paper, we present GLACiAR, an open-source Python tool available on GitHub to estimate the completeness and selection functions in galaxy surveys. The code is tailored for multiband imaging surveys aimed at searching for high-redshift galaxies through the Lyman-break technique, but it can be applied broadly. The code generates artificial galaxies that follow Sérsic profiles with different indexes and with customisable size, redshift, and spectral energy distribution properties, adds them to input images, and measures the recovery rate. To illustrate this new software tool, we apply it to quantify the completeness and redshift selection functions for J-dropout sources (redshift z ≈ 10 galaxies) in the Hubble Space Telescope Brightest of Reionizing Galaxies Survey. Our comparison with a previous completeness analysis on the same dataset shows overall agreement, but also highlights how different modelling assumptions for the artificial sources can impact completeness estimates.
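
    A minimal sketch of the source-recovery idea GLACiAR implements, with simple Gaussian blobs standing in for its Sérsic profiles and a crude threshold standing in for real source detection; everything here is synthetic:

      # Illustration only: inject artificial sources into a noise image,
      # "detect" by thresholding, and report the recovered fraction.
      import numpy as np

      rng = np.random.default_rng(7)
      image = rng.normal(0, 1.0, size=(200, 200))          # background noise

      yy, xx = np.mgrid[0:200, 0:200]
      positions = rng.integers(10, 190, size=(50, 2))
      for y0, x0 in positions:                             # inject 50 faint sources
          image += 4.0 * np.exp(-(((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * 1.5 ** 2)))

      detected = image > 3.0                               # crude 3-sigma threshold map
      recovered = sum(detected[y0, x0] for y0, x0 in positions)
      print(f"completeness ~ {recovered / len(positions):.0%}")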

  16. Novel tunable dynamic tweezers using dark-bright soliton collision control in an optical add/drop filter.

    PubMed

    Teeka, Chat; Jalil, Muhammad Arif; Yupapin, Preecha P; Ali, Jalil

    2010-12-01

    We propose a novel system of dynamic optical tweezers generated by a dark soliton in a fiber optic loop. A dark soliton, acting as an optical tweezer, is amplified and tuned within the microring resonator system, and tunable tweezers with the required widths and powers can be controlled. Dark-bright soliton conversion for a dark soliton pulse propagating within a microring resonator system is analyzed, as is the dynamic behavior of soliton conversion in the add/drop filter. The control dark soliton is input into the system via the add port of the add/drop filter, and the dynamic behavior of the dark-bright soliton conversion is observed. The required stable signal is obtained via the drop and throughput ports of the add/drop filter with suitable parameters. In application, light/atom trapping and transportation can be realized using the proposed system.

  17. Antileukotriene Agents Versus Long-Acting Beta-Agonists in Older Adults with Persistent Asthma: A Comparison of Add-On Therapies.

    PubMed

    Altawalbeh, Shoroq M; Thorpe, Carolyn T; Zgibor, Janice C; Kane-Gill, Sandra; Kang, Yihuang; Thorpe, Joshua M

    2016-08-01

    To compare the effectiveness and cardiovascular safety of long-acting beta-agonists (LABAs) with those of leukotriene receptor antagonists (LTRAs) as add-on treatments in older adults with asthma already taking inhaled corticosteroids (ICSs). Retrospective cohort study. Medicare fee-for-service (FFS) claims (2009-10) for a 10% random sample of beneficiaries continuously enrolled in Parts A, B, and D in 2009. Medicare beneficiaries aged 66 and older continuously enrolled in FFS Medicare with Part D coverage with a diagnosis of asthma before 2009 treated exclusively with ICSs plus LABAs or ICSs plus LTRAs (N = 14,702). The augmented inverse propensity-weighted estimator was used to compare the effect of LABA add-on therapy with that of LTRA add-on therapy on asthma exacerbations requiring inpatient, emergency, or outpatient care and on cardiovascular (CV) events, adjusting for demographic characteristics, comorbidities, and county-level healthcare-access variables. The primary analysis showed that LTRA add-on treatment was associated with greater odds of asthma-related hospitalizations or emergency department visits (odds ratio (OR) = 1.4, P < .001), as well as outpatient exacerbations requiring oral corticosteroids or antibiotics (OR = 1.41, P < .001) than LABA treatment. LTRA add-on therapy was also less effective in controlling acute symptoms, as indicated by greater use of short-acting beta agonists (rate ratio = 1.58, P < .001). LTRA add-on treatment was associated with lower odds of experiencing a CV event than LABA treatment (OR = 0.86, P = .006). This study provides new evidence specific to older adults to help healthcare providers weigh the risks and benefits of these add-on treatments. Further subgroup analysis is needed to personalize asthma treatments in this high-risk population. © 2016, Copyright the Authors Journal compilation © 2016, The American Geriatrics Society.
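
    A minimal sketch of the augmented inverse propensity-weighted (AIPW) estimator the study uses, shown here for a risk difference on synthetic data; the study itself modeled odds ratios with many more covariates:

      # Illustration only: doubly robust AIPW estimate of a treatment effect.
      # Variable names and the data-generating process are hypothetical.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(8)
      n = 5000
      x = rng.normal(size=(n, 3))                          # confounders
      e_true = 1 / (1 + np.exp(-x[:, 0]))                  # treatment depends on x
      z = rng.binomial(1, e_true)                          # 1 = LTRA, 0 = LABA (say)
      p_y = 1 / (1 + np.exp(-(-1.0 + 0.5 * z + 0.8 * x[:, 1])))
      y = rng.binomial(1, p_y)                             # exacerbation indicator

      e = LogisticRegression().fit(x, z).predict_proba(x)[:, 1]       # propensity
      m1 = LogisticRegression().fit(x[z == 1], y[z == 1]).predict_proba(x)[:, 1]
      m0 = LogisticRegression().fit(x[z == 0], y[z == 0]).predict_proba(x)[:, 1]

      aipw1 = np.mean(z * (y - m1) / e + m1)               # doubly robust E[Y(1)]
      aipw0 = np.mean((1 - z) * (y - m0) / (1 - e) + m0)   # doubly robust E[Y(0)]
      print(f"risk difference (LTRA - LABA) = {aipw1 - aipw0:.3f}")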

  18. Increased costs to US pavement infrastructure from future temperature rise

    NASA Astrophysics Data System (ADS)

    Underwood, B. Shane; Guido, Zack; Gudipudi, Padmini; Feinberg, Yarden

    2017-10-01

    Roadway design aims to maximize functionality, safety, and longevity. The materials used for construction, however, are often selected on the assumption of a stationary climate. Anthropogenic climate change may therefore result in rapid infrastructure failure and, consequently, increased maintenance costs, particularly for paved roads where temperature is a key determinant for material selection. Here, we examine the economic costs of projected temperature changes on asphalt roads across the contiguous United States using an ensemble of 19 global climate models forced with RCP 4.5 and 8.5 scenarios. Over the past 20 years, stationary assumptions have resulted in incorrect material selection for 35% of 799 observed locations. With warming temperatures, maintaining the standard practice for material selection is estimated to add approximately US$13.6, US$19.0 and US$21.8 billion to pavement costs by 2010, 2040 and 2070 under RCP4.5, respectively, increasing to US$14.5, US$26.3 and US$35.8 for RCP8.5. These costs will disproportionately affect local municipalities that have fewer resources to mitigate impacts. Failing to update engineering standards of practice in light of climate change therefore significantly threatens pavement infrastructure in the United States.

  19. Systematic Review: A Reevaluation and Update of the Integrative (Trajectory) Model of Pediatric Medical Traumatic Stress.

    PubMed

    Price, Julia; Kassam-Adams, Nancy; Alderfer, Melissa A; Christofferson, Jennifer; Kazak, Anne E

    2016-01-01

    The objective of this systematic review is to reevaluate and update the Integrative Model of Pediatric Medical Traumatic Stress (PMTS; Kazak et al., 2006), which provides a conceptual framework for traumatic stress responses across pediatric illnesses and injuries. Using established systematic review guidelines, we searched PsycINFO, Cumulative Index to Nursing and Allied Health Literature, and PubMed (producing 216 PMTS papers published since 2005), extracted findings for review, and organized and interpreted findings within the Integrative Model framework. Recent PMTS research has included additional pediatric populations, used advanced longitudinal modeling techniques, clarified relations between parent and child PMTS, and considered effects of PMTS on health outcomes. Results support and extend the model's five assumptions, and suggest a sixth assumption related to health outcomes and PMTS. Based on new evidence, the renamed Integrative Trajectory Model includes phases corresponding with medical events, adds family-centered trajectories, reaffirms a competency-based framework, and suggests updated assessment and intervention implications. © The Author 2015. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. Useful global-change scenarios: current issues and challenges

    NASA Astrophysics Data System (ADS)

    Parson, E. A.

    2008-10-01

    Scenarios are increasingly used to inform global-change debates, but their connection to decisions has been weak and indirect. This reflects the greater number and variety of potential users and scenario needs, relative to other decision domains where scenario use is more established. Global-change scenario needs include common elements, e.g., model-generated projections of emissions and climate change, needed by many users but in different ways and with different assumptions. For these common elements, the limited ability to engage diverse global-change users in scenario development requires extreme transparency in communicating underlying reasoning and assumptions, including probability judgments. Other scenario needs are specific to users, requiring a decentralized network of scenario and assessment organizations to disseminate and interpret common elements and add elements requiring local context or expertise. Such an approach will make global-change scenarios more useful for decisions, but not less controversial. Despite predictable attacks, scenario-based reasoning is necessary for responsible global-change decisions because decision-relevant uncertainties cannot be specified scientifically. The purpose of scenarios is not to avoid speculation, but to make the required speculation more disciplined, more anchored in relevant scientific knowledge when available, and more transparent.

  1. Understanding the Cycle

    PubMed Central

    Currie, Janet; Tekin, Erdal

    2013-01-01

    Child maltreatment is a major social problem. This paper focuses on measuring the relationship between child maltreatment and crime using data from the National Longitudinal Study of Adolescent Health (Add Health). We focus on crime because it is one of the most costly potential outcomes of maltreatment. Our work addresses two main limitations of the existing literature on child maltreatment. First, we use a large national sample, and investigate different types of maltreatment in a unified framework. Second, we pay careful attention to controlling for possible confounders using a variety of statistical methods that make differing assumptions. The results suggest that maltreatment greatly increases the probability of engaging in crime and that the probability increases with the experience of multiple forms of maltreatment. PMID:24204082

  2. Finance for practicing radiologists.

    PubMed

    Berlin, Jonathan W; Lexa, Frank James

    2005-03-01

    This article reviews basic finance for radiologists. Using the example of a hypothetical outpatient computed tomography center, readers are introduced to the concept of net present value. This concept refers to the current real value of anticipated income in the future, realizing that revenue in the future has less value than it does today. Positive net present value projects add wealth to a practice and should be pursued. The article details how costs and revenues for a hypothetical outpatient computed tomography center are determined and elucidates the difference between fixed costs and variable costs. The article provides readers with the steps used to calculate the break-even volume for an outpatient computed tomography center given situation-specific assumptions regarding staff, equipment lease rates, rent, and third-party payer mix.
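
    A minimal sketch of the two calculations the article walks through, with hypothetical numbers for the imaging center:

      # Illustration only: net present value and break-even volume for a
      # hypothetical outpatient CT center. All figures are assumed, not the
      # article's numbers.
      cash_flows = [-900_000, 250_000, 275_000, 300_000, 320_000, 340_000]  # year 0..5
      rate = 0.08                                          # discount rate (assumed)

      npv = sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))
      print(f"NPV = ${npv:,.0f}")                          # positive -> adds wealth

      fixed_costs = 600_000                                # lease, rent, staff (assumed)
      revenue_per_scan = 450                               # blended payer mix (assumed)
      variable_cost_per_scan = 90                          # contrast, supplies (assumed)
      breakeven = fixed_costs / (revenue_per_scan - variable_cost_per_scan)
      print(f"break-even volume = {breakeven:,.0f} scans/year")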

  3. Privacy-preserving heterogeneous health data sharing.

    PubMed

    Mohammed, Noman; Jiang, Xiaoqian; Chen, Rui; Fung, Benjamin C M; Ohno-Machado, Lucila

    2013-05-01

    Privacy-preserving data publishing addresses the problem of disclosing sensitive data when mining for useful information. Among existing privacy models, ε-differential privacy provides one of the strongest privacy guarantees and makes no assumptions about an adversary's background knowledge. All existing solutions that ensure ε-differential privacy handle the problem of disclosing relational and set-valued data in a privacy-preserving manner separately. In this paper, we propose an algorithm that considers both relational and set-valued data in differentially private disclosure of healthcare data. The proposed approach makes a simple yet fundamental switch in differentially private algorithm design: instead of listing all possible records (ie, a contingency table) for noise addition, records are generalized before noise addition. The algorithm first generalizes the raw data in a probabilistic way, and then adds noise to guarantee ε-differential privacy. We showed that the disclosed data could be used effectively to build a decision tree induction classifier. Experimental results demonstrated that the proposed algorithm is scalable and performs better than existing solutions for classification analysis. The resulting utility may degrade when the output domain size is very large, making it potentially inappropriate to generate synthetic data for large health databases. Unlike existing techniques, the proposed algorithm allows the disclosure of health data containing both relational and set-valued data in a differentially private manner, and can retain essential information for discriminative analysis.
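
    A minimal sketch of the generalize-then-add-noise idea: counts over generalized categories are released with Laplace noise scaled to sensitivity/epsilon. The categories, data, and budget below are hypothetical:

      # Illustration only: Laplace mechanism over generalized counts.
      # The attribute, bins, and epsilon are hypothetical.
      import numpy as np

      rng = np.random.default_rng(9)
      ages = rng.integers(18, 90, size=1000)               # raw (sensitive) attribute

      bins = [18, 40, 65, 90]                              # generalization: 3 age bands
      counts, _ = np.histogram(ages, bins=bins)

      epsilon = 0.5
      sensitivity = 1.0                                    # one person changes one count
      noisy = counts + rng.laplace(0.0, sensitivity / epsilon, size=counts.shape)
      print("noisy generalized counts:", np.round(noisy, 1))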

  4. Privacy-preserving heterogeneous health data sharing

    PubMed Central

    Mohammed, Noman; Jiang, Xiaoqian; Chen, Rui; Fung, Benjamin C M; Ohno-Machado, Lucila

    2013-01-01

    Objective: Privacy-preserving data publishing addresses the problem of disclosing sensitive data when mining for useful information. Among existing privacy models, ε-differential privacy provides one of the strongest privacy guarantees and makes no assumptions about an adversary's background knowledge. All existing solutions that ensure ε-differential privacy handle the problem of disclosing relational and set-valued data in a privacy-preserving manner separately. In this paper, we propose an algorithm that considers both relational and set-valued data in differentially private disclosure of healthcare data. Methods: The proposed approach makes a simple yet fundamental switch in differentially private algorithm design: instead of listing all possible records (ie, a contingency table) for noise addition, records are generalized before noise addition. The algorithm first generalizes the raw data in a probabilistic way, and then adds noise to guarantee ε-differential privacy. Results: We showed that the disclosed data could be used effectively to build a decision tree induction classifier. Experimental results demonstrated that the proposed algorithm is scalable and performs better than existing solutions for classification analysis. Limitation: The resulting utility may degrade when the output domain size is very large, making it potentially inappropriate to generate synthetic data for large health databases. Conclusions: Unlike existing techniques, the proposed algorithm allows the disclosure of health data containing both relational and set-valued data in a differentially private manner, and can retain essential information for discriminative analysis. PMID:23242630

  5. Role of diversity in ICA and IVA: theory and applications

    NASA Astrophysics Data System (ADS)

    Adalı, Tülay

    2016-05-01

    Independent component analysis (ICA) has been the most popular approach for solving the blind source separation problem. Starting from a simple linear mixing model and the assumption of statistical independence, ICA can recover a set of linearly-mixed sources to within a scaling and permutation ambiguity. It has been successfully applied to numerous data analysis problems in areas as diverse as biomedicine, communications, finance, geophysics, and remote sensing. ICA can be achieved using different types of diversity (statistical properties) and can be posed to simultaneously account for multiple types of diversity, such as higher-order statistics, sample dependence, non-circularity, and nonstationarity. A recent generalization of ICA, independent vector analysis (IVA), generalizes ICA to multiple data sets and adds the use of one more type of diversity, statistical dependence across the data sets, for jointly achieving independent decomposition of multiple data sets. With the addition of each new diversity type, identification of a broader class of signals becomes possible, and in the case of IVA, this includes sources that are independent and identically distributed Gaussians. We review the fundamentals and properties of ICA and IVA when multiple types of diversity are taken into account, and then ask whether diversity plays an important role in practical applications as well. Examples from various domains are presented to demonstrate that in many scenarios it might be worthwhile to jointly account for multiple statistical properties. This paper is submitted in conjunction with the talk delivered for the "Unsupervised Learning and ICA Pioneer Award" at the 2016 SPIE Conference on Sensing and Analysis Technologies for Biomedical and Cognitive Applications.
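
    The scaling and permutation ambiguity is easy to see in a toy experiment. The sketch below, which assumes scikit-learn's FastICA (not the IVA algorithms discussed above), unmixes two synthetic non-Gaussian sources and checks recovery via correlation.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)
    S = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]   # two independent sources
    A = np.array([[1.0, 0.5], [0.4, 1.0]])             # unknown mixing matrix
    X = S @ A.T                                        # observed mixtures

    S_hat = FastICA(n_components=2, random_state=0).fit_transform(X)

    # Recovery holds only up to scaling and permutation, so compare by the
    # absolute cross-correlation: each row/column should have one large entry.
    corr = np.corrcoef(S.T, S_hat.T)[:2, 2:]
    print(np.round(np.abs(corr), 2))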

  6. IkeNet: Social Network Analysis of E-mail Traffic in the Eisenhower Leadership Development Program

    DTIC Science & Technology

    2007-11-01

    'Create the recipients TO
    TempArray = Split(strTo, ",")            ' delimiter garbled in source; comma assumed
    For Each varArrayItem In TempArray
        nextGuy = Chr(34) & CStr(Trim(varArrayItem)) & Chr(34)
        MsgBox "next guy = " & nextGuy
        'Set oRecipient = Recipients.Add(nextGuy)
        Set oRecipient = Recipients.Add(CStr(Trim(varArrayItem)))
        oRecipient.Type = olTo
    ...
    TempArray = Split(strAttachments, ",")   ' delimiter garbled in source; comma assumed
    For Each varArrayItem In TempArray
        .Attachments.Add CStr(Trim(varArrayItem))
    Next varArrayItem
    .Send                                    ' no return value

  7. Evolutionary diversification of protein-protein interactions by interface add-ons.

    PubMed

    Plach, Maximilian G; Semmelmann, Florian; Busch, Florian; Busch, Markus; Heizinger, Leonhard; Wysocki, Vicki H; Merkl, Rainer; Sterner, Reinhard

    2017-10-03

    Cells contain a multitude of protein complexes whose subunits interact with high specificity. However, the number of different protein folds and interface geometries found in nature is limited. This raises the question of how protein-protein interaction specificity is achieved on the structural level and how the formation of nonphysiological complexes is avoided. Here, we describe structural elements called interface add-ons that fulfill this function and elucidate their role for the diversification of protein-protein interactions during evolution. We identified interface add-ons in 10% of a representative set of bacterial, heteromeric protein complexes. The importance of interface add-ons for protein-protein interaction specificity is demonstrated by an exemplary experimental characterization of over 30 cognate and hybrid glutamine amidotransferase complexes in combination with comprehensive genetic profiling and protein design. Moreover, growth experiments showed that the lack of interface add-ons can lead to physiologically harmful cross-talk between essential biosynthetic pathways. In sum, our complementary in silico, in vitro, and in vivo analysis argues that interface add-ons are a practical and widespread evolutionary strategy to prevent the formation of nonphysiological complexes by specializing protein-protein interactions.

  8. Three-class ROC analysis--the equal error utility assumption and the optimality of three-class ROC surface using the ideal observer.

    PubMed

    He, Xin; Frey, Eric C

    2006-08-01

    Previously, we have developed a decision model for three-class receiver operating characteristic (ROC) analysis based on decision theory. The proposed decision model maximizes the expected decision utility under the assumption that incorrect decisions have equal utilities under the same hypothesis (equal error utility assumption). This assumption reduced the dimensionality of the "general" three-class ROC analysis and provided a practical figure-of-merit to evaluate the three-class task performance. However, it also limits the generality of the resulting model because the equal error utility assumption will not apply for all clinical three-class decision tasks. The goal of this study was to investigate the optimality of the proposed three-class decision model with respect to several other decision criteria. In particular, besides the maximum expected utility (MEU) criterion used in the previous study, we investigated the maximum-correctness (MC) (or minimum-error), maximum likelihood (ML), and Neyman-Pearson (N-P) criteria. We found that by making assumptions for both the MEU and N-P criteria, all decision criteria lead to the previously proposed three-class decision model. As a result, this model maximizes the expected utility under the equal error utility assumption, maximizes the probability of making correct decisions, satisfies the N-P criterion in the sense that it maximizes the sensitivity of one class given the sensitivities of the other two classes, and the resulting ROC surface contains the maximum likelihood decision operating point. While the proposed three-class ROC analysis model is not optimal in the general sense due to the use of the equal error utility assumption, the range of criteria for which it is optimal increases its applicability for evaluating and comparing a range of diagnostic systems.
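
    To make the equal error utility assumption concrete, here is a minimal decision-rule sketch (with invented utility values, not taken from the paper): the maximum expected utility decision picks the class d maximizing sum_h U[d,h] p(h|x), and the assumption requires the two off-diagonal entries in each column of U to be equal.

    import numpy as np

    # U[d, h]: utility of deciding class d when the truth is h. Within each
    # column the two incorrect decisions carry equal utility (the assumption).
    U = np.array([[1.0, 0.3, 0.2],
                  [0.4, 1.0, 0.2],
                  [0.4, 0.3, 1.0]])

    def meu_decision(posteriors, U):
        """Return the class with maximal expected utility sum_h U[d, h] * p(h | x)."""
        return int(np.argmax(U @ posteriors))

    print(meu_decision(np.array([0.5, 0.3, 0.2]), U))   # decides class 0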

  9. The Doctor Is In! Diagnostic Analysis.

    PubMed

    Jupiter, Daniel C

    To make meaningful inferences based on our regression models, we must ensure that we have met the necessary assumptions of these tests. In this commentary, we review these assumptions and those for the t-test and analysis of variance, and introduce a variety of methods, formal and informal, numeric and visual, for assessing conformity with the assumptions. Copyright © 2018 The American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
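
    As a minimal illustration of such formal checks (on simulated data, not the commentary's examples), the sketch below fits an ordinary least-squares model and tests two of the usual assumptions, normality and homoscedasticity of the residuals.

    import numpy as np
    from scipy import stats
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_breuschpagan

    rng = np.random.default_rng(1)
    x = rng.normal(size=100)
    y = 2.0 + 1.5 * x + rng.normal(size=100)

    fit = sm.OLS(y, sm.add_constant(x)).fit()

    # Normality of residuals, an assumption shared with the t-test and ANOVA.
    w_stat, p_norm = stats.shapiro(fit.resid)
    print("Shapiro-Wilk p-value:", p_norm)

    # Homoscedasticity: a formal complement to the residuals-vs-fitted plot.
    _, p_bp, _, _ = het_breuschpagan(fit.resid, fit.model.exog)
    print("Breusch-Pagan p-value:", p_bp)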

  10. 75 FR 64221 - Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-19

    ... combination of combustion and post-combustion controls. EPA approached the five factor analysis using a top... from fuel-bound nitrogen and high temperature combustion; (2) post- combustion add-on control to reduce... is a combination of a post- combustion add-on control, i.e., selective catalytic reduction (SCR), and...

  11. Inflation of the type I error: investigations on regulatory recommendations for bioequivalence of highly variable drugs.

    PubMed

    Wonnemann, Meinolf; Frömke, Cornelia; Koch, Armin

    2015-01-01

    We investigated different evaluation strategies for bioequivalence trials with highly variable drugs (HVDs) with respect to their resulting empirical type I error and empirical power. The classical 'unscaled' crossover design with average bioequivalence evaluation, the Add-on concept of the Japanese guideline, and the current 'scaling' approach of the EMA were compared. Simulation studies were performed based on the assumption of single-dose drug administration, while the underlying intra-individual variability was varied. Inclusion of Add-on subjects following the Japanese concept led to slight inflation of the empirical α-error (≈7.5%). For the EMA approach we noted an unexpectedly large increase in the rejection rate at a geometric mean ratio of 1.25. Moreover, we detected error rates slightly above the pre-set limit of 5% even at the proposed 'scaled' bioequivalence limits. With the classical 'unscaled' approach and the Japanese guideline concept, the goal of reduced subject numbers in bioequivalence trials of HVDs cannot be achieved. On the other hand, widening the acceptance range comes at the price that quite a number of products will be accepted as bioequivalent that would not have been accepted in the past. A two-stage design with control of the global α therefore seems the better alternative.
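
    A stripped-down version of such a simulation study can be written in a few lines. The sketch below (a single-group simplification with made-up parameters, not the authors' full crossover code) estimates the empirical type I error of unscaled average bioequivalence by placing the true geometric mean ratio on the 1.25 acceptance limit, where any bioequivalence conclusion is a false positive.

    import numpy as np
    from scipy import stats

    def empirical_alpha(n_subj=24, cv=0.5, gmr=1.25, n_sim=5000, seed=0):
        rng = np.random.default_rng(seed)
        sigma_w = np.sqrt(np.log(cv**2 + 1))    # within-subject SD on the log scale
        sd_diff = np.sqrt(2) * sigma_w          # SD of a within-subject period difference
        t_crit = stats.t.ppf(0.95, n_subj - 1)
        accepted = 0
        for _ in range(n_sim):
            d = rng.normal(np.log(gmr), sd_diff, n_subj)
            m, se = d.mean(), d.std(ddof=1) / np.sqrt(n_subj)
            # Conclude bioequivalence if the 90% CI lies inside [0.80, 1.25].
            if m - t_crit * se > np.log(0.8) and m + t_crit * se < np.log(1.25):
                accepted += 1
        return accepted / n_sim

    print(empirical_alpha())   # should hover near the nominal 5%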

  12. Perspectives of clinician and biomedical scientists on interdisciplinary health research.

    PubMed

    Laberge, Suzanne; Albert, Mathieu; Hodges, Brian D

    2009-11-24

    Interdisciplinary health research is a priority of many funding agencies. We surveyed clinician and biomedical scientists about their views on the value and funding of interdisciplinary health research. We conducted semistructured interviews with 31 biomedical and 30 clinician scientists. The scientists were selected from the 2000-2006 membership lists of peer-review committees of the Canadian Institutes of Health Research. We investigated respondents' perspectives on the assumption that collaboration across disciplines adds value to health research. We also investigated their perspectives on funding agencies' growing support of interdisciplinary research. The 61 respondents expressed a wide variety of perspectives on the value of interdisciplinary health research, ranging from full agreement (22) to complete disagreement (11) that it adds value; many presented qualified viewpoints (28). More than one-quarter viewed funding agencies' growing support of interdisciplinary research as appropriate. Most (44) felt that the level of support was unwarranted. Arguments included the belief that current support leads to the creation of artificial teams and that a top-down process of imposing interdisciplinary structures on teams constrains scientists' freedom. On both issues we found contrasting trends between the clinician and the biomedical scientists. Despite having some positive views about the value of interdisciplinary research, scientists, especially biomedical scientists, expressed reservations about the growing support of interdisciplinary research.

  13. 'Add women & stir'--the biomedical approach to cardiac research!

    PubMed

    O'Donnell, Sharon; Condell, Sarah; Begley, Cecily M

    2004-07-01

    In conditions shared by women and men, the biomedical model of disease assumes that illness symptoms and outcomes are biologically and socially 'neutral'. Consequently, up until a decade ago, white middle-aged men were the model subjects in most funded cardiac trials, with the assumption that whatever the findings, the results would also hold true for women. This 'add women and stir' approach has resulted in imbalances in cardiac care and an image of coronary artery disease which portrays a middle-aged male as its victim. Moreover, cardiac health care has been designed with the male anatomy and male experience of illness in mind, and health promotional measures have been targeted towards men. Women have received these health promotional messages to protect the hearts of men, and have been less likely to modify their own lifestyles in a cardio-protective manner. However, the biological and social differences that exist between women and men must surely invalidate such biased biomedical assertions, and signify a need to delve beyond the realm of biomedical reductionism for greater insights and understanding. This review examines how scientific reductionism has failed to explore the impact of coronary artery disease on the lives of women and how the gendered image of this disease has privileged the normative frame.

  14. Global analysis of double-strand break processing reveals in vivo properties of the helicase-nuclease complex AddAB

    PubMed Central

    Badrinarayanan, Anjana; Cisse, Ibrahim I.

    2017-01-01

    In bacteria, double-strand break (DSB) repair via homologous recombination is thought to be initiated through the bi-directional degradation and resection of DNA ends by a helicase-nuclease complex such as AddAB. The activity of AddAB has been well-studied in vitro, with translocation speeds between 400–2000 bp/s on linear DNA suggesting that a large section of DNA around a break site is processed for repair. However, the translocation rate and activity of AddAB in vivo is not known, and how AddAB is regulated to prevent excessive DNA degradation around a break site is unclear. To examine the functions and mechanistic regulation of AddAB inside bacterial cells, we developed a next-generation sequencing-based approach to assay DNA processing after a site-specific DSB was introduced on the chromosome of Caulobacter crescentus. Using this assay we determined the in vivo rates of DSB processing by AddAB and found that putative chi sites attenuate processing in a RecA-dependent manner. This RecA-mediated regulation of AddAB prevents the excessive loss of DNA around a break site, limiting the effects of DSB processing on transcription. In sum, our results, taken together with prior studies, support a mechanism for regulating AddAB that couples two key events of DSB repair–the attenuation of DNA-end processing and the initiation of homology search by RecA–thereby helping to ensure that genomic integrity is maintained during DSB repair. PMID:28489851

  15. Integrated Medical Model (IMM) 4.0 Enhanced Functionalities

    NASA Technical Reports Server (NTRS)

    Young, M.; Keenan, A. B.; Saile, L.; Boley, L. A.; Walton, M. E.; Shah, R. V.; Kerstman, E. L.; Myers, J. G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic simulation model that uses input data on 100 medical conditions to simulate expected medical events, the resources required to treat them, and the resulting impact to the mission for specific crew and mission characteristics. The newest development version, IMM v4.0, adds capabilities that remove some of the conservative assumptions that underlie the current operational version, IMM v3. While IMM v3 provides the framework to simulate whether a medical event occurred, IMM v4.0 also simulates when the event occurred during a mission timeline. This allows for more accurate estimation of mission time lost and resource utilization. In addition to the mission timeline, IMM v4.0 features two enhancements that address IMM v3 assumptions regarding medical event treatment. Medical events in IMM v3 are assigned the untreated outcome if any resource required to treat the event is unavailable. IMM v4.0 allows for partially treated outcomes that are proportional to the amount of required resources available, thus removing the dichotomous treatment assumption. An additional capability of IMM v4.0 is the use of an alternative medical resource when the primary resource assigned to the condition is depleted, more accurately reflecting the real-world system. The additional capabilities defining IMM v4.0 (the mission timeline, partial treatment, and the alternate drug) result in more realistic predicted mission outcomes. The primary model outcomes of IMM v4.0 for the ISS6 mission, including mission time lost, probability of evacuation, and probability of loss of crew life, are compared to those produced by the current operational version of IMM to showcase the enhanced prediction capabilities.
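
    The timeline and partial-treatment ideas can be sketched generically (this is not IMM code; all condition names, rates, and quantities below are invented): events arrive as a Poisson process over the mission, and the treated fraction of each event is proportional to the resources still on hand.

    import numpy as np

    rng = np.random.default_rng(2)
    mission_days = 180
    conditions = {"nausea":    {"rate": 0.02, "units_needed": 4},
                  "back pain": {"rate": 0.01, "units_needed": 6}}
    stock = {"nausea": 10, "back pain": 4}     # resource units on board

    for name, c in conditions.items():
        n_events = rng.poisson(c["rate"] * mission_days)   # how many events occur
        for day in np.sort(rng.uniform(0, mission_days, n_events)):
            used = min(c["units_needed"], stock[name])     # deplete what is left
            stock[name] -= used
            frac = used / c["units_needed"]                # partial-treatment outcome
            print(f"day {day:6.1f}  {name:9s}  treated fraction {frac:.2f}")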

  16. Relationship between ADD1 Gly460Trp gene polymorphism and essential hypertension in Madeira Island.

    PubMed

    Sousa, Ana Célia; Palma Dos Reis, Roberto; Pereira, Andreia; Borges, Sofia; Freitas, Ana Isabel; Guerra, Graça; Góis, Teresa; Rodrigues, Mariana; Henriques, Eva; Freitas, Sónia; Ornelas, Ilídio; Pereira, Décio; Brehm, António; Mendonça, Maria Isabel

    2017-10-01

    Essential hypertension (EH) is a complex disease in which physiological, environmental, and genetic factors are involved in its genesis. The genetic variant of the alpha-adducin gene (ADD1) has been described as a risk factor for EH, but with controversial results. The objective of this study was to evaluate the association of ADD1 (Gly460Trp) gene polymorphism with the EH risk in a population from Madeira Island. A case-control study with 1614 individuals of Caucasian origin was performed, including 817 individuals with EH and 797 controls. Cases and controls were matched for sex and age, by frequency-matching method. All participants collected blood for biochemical and genotypic analysis for the Gly460Trp polymorphism. We further investigated which variables were independently associated to EH, and, consequently, analyzed their interactions. In our study, we found a significant association between the ADD1 gene polymorphism and EH (odds ratio 2.484, P = .01). This association remained statistically significant after the multivariate analysis (odds ratio 2.548, P = .02). The ADD1 Gly460Trp gene polymorphism is significantly and independently associated with EH risk in our population. The knowledge of genetic polymorphisms associated with EH is of paramount importance because it leads to a better understanding of the etiology and pathophysiology of this pathology.
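
    For readers unfamiliar with the underlying arithmetic, an odds ratio and its Woolf confidence interval can be computed from a 2x2 exposure-by-status table as below; the counts are invented for illustration and are not the study's data.

    import numpy as np

    a, b = 90, 727    # cases: allele carriers, non-carriers (hypothetical)
    c, d = 40, 757    # controls: allele carriers, non-carriers (hypothetical)

    or_hat = (a * d) / (b * c)
    se_log = np.sqrt(1/a + 1/b + 1/c + 1/d)          # Woolf's method
    lo, hi = np.exp(np.log(or_hat) + np.array([-1, 1]) * 1.96 * se_log)
    print(f"OR = {or_hat:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")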

  17. Relationship between ADD1 Gly460Trp gene polymorphism and essential hypertension in Madeira Island

    PubMed Central

    Sousa, Ana Célia; Palma dos Reis, Roberto; Pereira, Andreia; Borges, Sofia; Freitas, Ana Isabel; Guerra, Graça; Góis, Teresa; Rodrigues, Mariana; Henriques, Eva; Freitas, Sónia; Ornelas, Ilídio; Pereira, Décio; Brehm, António; Mendonça, Maria Isabel

    2017-01-01

    Abstract Essential hypertension (EH) is a complex disease in which physiological, environmental, and genetic factors are involved in its genesis. The genetic variant of the alpha-adducin gene (ADD1) has been described as a risk factor for EH, but with controversial results. The objective of this study was to evaluate the association of ADD1 (Gly460Trp) gene polymorphism with the EH risk in a population from Madeira Island. A case-control study with 1614 individuals of Caucasian origin was performed, including 817 individuals with EH and 797 controls. Cases and controls were matched for sex and age, by frequency-matching method. All participants collected blood for biochemical and genotypic analysis for the Gly460Trp polymorphism. We further investigated which variables were independently associated to EH, and, consequently, analyzed their interactions. In our study, we found a significant association between the ADD1 gene polymorphism and EH (odds ratio 2.484, P = .01). This association remained statistically significant after the multivariate analysis (odds ratio 2.548, P = .02). The ADD1 Gly460Trp gene polymorphism is significantly and independently associated with EH risk in our population. The knowledge of genetic polymorphisms associated with EH is of paramount importance because it leads to a better understanding of the etiology and pathophysiology of this pathology. PMID:29049185

  18. Blog Analysis: An Exploration of French Students' Perceptions towards Foreign Cultures during Their Overseas Internships

    ERIC Educational Resources Information Center

    Durand, Sandra

    2016-01-01

    Increasingly, tourism and hospitality university programs in France include internships which add a vocational dimension to the academic aspects of the course. These internships a) provide exposure to real world professional situations, b) add market value to the student experience, and c) offer a foothold for employment. The field of blog…

  19. Authoritarian Parenting and Asian Adolescent School Performance: Insights from the US and Taiwan

    ERIC Educational Resources Information Center

    Pong, Suet-ling; Johnston, Jamie; Chen, Vivien

    2010-01-01

    Our study re-examines the relationship between parenting and school performance among Asian students. We use two sources of data: wave I of the Adolescent Health Longitudinal Survey (Add Health), and waves I and II of the Taiwan Educational Panel Survey (TEPS). Analysis using Add Health reveals that the Asian-American/European-American difference…

  20. Evaluation of assumptions in soil moisture triple collocation analysis

    USDA-ARS?s Scientific Manuscript database

    Triple collocation analysis (TCA) enables estimation of error variances for three or more products that retrieve or estimate the same geophysical variable using mutually-independent methods. Several statistical assumptions regarding the statistical nature of errors (e.g., mutual independence and ort...
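
    Although the record is truncated, the core of TCA is compact enough to sketch. Assuming the textbook covariance formulation (mutually independent, zero-mean errors uncorrelated with the truth), the error variance of each product follows from the pairwise covariances:

    import numpy as np

    rng = np.random.default_rng(3)
    truth = rng.normal(0.25, 0.05, 10000)          # synthetic soil moisture "truth"
    x = truth + rng.normal(0, 0.02, truth.size)    # three products whose errors
    y = truth + rng.normal(0, 0.03, truth.size)    # are mutually independent,
    z = truth + rng.normal(0, 0.04, truth.size)    # as TCA assumes

    C = np.cov(np.vstack([x, y, z]))
    err_sd = np.sqrt([C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2],
                      C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2],
                      C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]])
    print(np.round(err_sd, 3))    # should approach [0.02, 0.03, 0.04]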

  1. Evolutionary diversification of protein–protein interactions by interface add-ons

    PubMed Central

    Plach, Maximilian G.; Semmelmann, Florian; Busch, Florian; Busch, Markus; Heizinger, Leonhard; Wysocki, Vicki H.; Sterner, Reinhard

    2017-01-01

    Cells contain a multitude of protein complexes whose subunits interact with high specificity. However, the number of different protein folds and interface geometries found in nature is limited. This raises the question of how protein–protein interaction specificity is achieved on the structural level and how the formation of nonphysiological complexes is avoided. Here, we describe structural elements called interface add-ons that fulfill this function and elucidate their role for the diversification of protein–protein interactions during evolution. We identified interface add-ons in 10% of a representative set of bacterial, heteromeric protein complexes. The importance of interface add-ons for protein–protein interaction specificity is demonstrated by an exemplary experimental characterization of over 30 cognate and hybrid glutamine amidotransferase complexes in combination with comprehensive genetic profiling and protein design. Moreover, growth experiments showed that the lack of interface add-ons can lead to physiologically harmful cross-talk between essential biosynthetic pathways. In sum, our complementary in silico, in vitro, and in vivo analysis argues that interface add-ons are a practical and widespread evolutionary strategy to prevent the formation of nonphysiological complexes by specializing protein–protein interactions. PMID:28923934

  2. Mortality of Dandy-Walker syndrome in the United States: Analysis by race, gender, and insurance status.

    PubMed

    McClelland, Shearwood; Ukwuoma, Onyinyechi I; Lunos, Scott; Okuyemi, Kolawole S

    2015-01-01

    Dandy-Walker syndrome (DWS) is a congenital disorder often diagnosed in early childhood. Typically manifesting with signs/symptoms of increased intracranial pressure, DWS is catastrophic unless timely neurosurgical care can be administered via cerebrospinal fluid (CSF) drainage. The rates of mortality, adverse discharge disposition (ADD), and CSF drainage in DWS may not be uniform across race, gender, or insurance status; such differences could reflect disparities in access to neurosurgical care. This study examines these issues on a nationwide level. The Kids' Inpatient Database spanning 1997-2003 was used for analysis. Only patients admitted for DWS (ICD-9-CM = 742.3) were included. Multivariate analysis was adjusted for several variables, including patient age, race, sex, admission type, primary payer, income, and hospital volume. More than 14,000 DWS patients were included. Increasing age predicted reduced mortality (OR = 0.87; P < 0.05), reduced ADD (OR = 0.96; P < 0.05), and decreased likelihood of receiving CSF drainage (OR = 0.86; P < 0.0001). Elective admission type predicted reduced mortality (OR = 0.29; P = 0.0008), reduced ADD (OR = 0.68; P < 0.05), and increased CSF drainage (OR = 2.02; P < 0.0001). African-American race (OR = 1.20; P < 0.05) and private insurance (OR = 1.18; P < 0.05) each predicted increased likelihood of receiving CSF drainage, but were not predictors of mortality or ADD. Gender, income, and hospital volume were not significant predictors of DWS outcome. Increasing age and elective admission each decrease the mortality and ADD associated with DWS. African-American race and private insurance status increase access to CSF drainage. These findings contradict previous literature citing African-American race as a risk factor for mortality in DWS, and emphasize the role of private insurance in obtaining access to potentially lifesaving operative care.

  3. Evolution and the complexity of bacteriophages.

    PubMed

    Serwer, Philip

    2007-03-13

    The genomes of both long-genome (> 200 Kb) bacteriophages and long-genome eukaryotic viruses have cellular gene homologs whose selective advantage is not explained. These homologs add genomic and possibly biochemical complexity. Understanding their significance requires a definition of complexity that is more biochemically oriented than past empirically based definitions. Initially, I propose two biochemistry-oriented definitions of complexity: either decreased randomness or increased encoded information that does not serve immediate needs. Then, I make the assumption that these two definitions are equivalent. This assumption and recent data lead to the following four-part hypothesis that explains the presence of cellular gene homologs in long bacteriophage genomes and also provides a pathway for complexity increases in prokaryotic cells: (1) Prokaryotes underwent evolutionary increases in biochemical complexity after the eukaryote/prokaryote splits. (2) Some of the complexity increases occurred via multi-step, weak selection that was both protected from strong selection and accelerated by embedding evolving cellular genes in the genomes of bacteriophages and, presumably, also archaeal viruses (first tier selection). (3) The mechanisms for retaining cellular genes in viral genomes evolved under additional, longer-term selection that was stronger (second tier selection). (4) The second tier selection was based on increased access by prokaryotic cells to improved biochemical systems. This access was achieved when DNA transfer moved to prokaryotic cells both the more evolved genes and their more competitive and complex biochemical systems. I propose testing this hypothesis by controlled evolution in microbial communities to (1) determine the effects of deleting individual cellular gene homologs on the growth and evolution of long genome bacteriophages and hosts, (2) find the environmental conditions that select for the presence of cellular gene homologs, (3) determine which, if any, bacteriophage genes were selected for maintaining the homologs and (4) determine the dynamics of homolog evolution. This hypothesis is an explanation of evolutionary leaps in general. If accurate, it will assist both understanding and influencing the evolution of microbes and their communities. Analysis of evolutionary complexity increase for at least prokaryotes should include analysis of genomes of long-genome bacteriophages.

  4. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  5. The Applied Behavior Analysis Research Paradigm and Single-Subject Designs in Adapted Physical Activity Research.

    PubMed

    Haegele, Justin A; Hodge, Samuel Russell

    2015-10-01

    There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.

  6. A cytogenetic analysis of 2 cases of phosphaturic mesenchymal tumor of mixed connective tissue type.

    PubMed

    Graham, Rondell P; Hodge, Jennelle C; Folpe, Andrew L; Oliveira, Andre M; Meyer, Kevin J; Jenkins, Robert B; Sim, Franklin H; Sukov, William R

    2012-08-01

    Phosphaturic mesenchymal tumor of mixed connective tissue type is a rare, histologically distinctive mesenchymal neoplasm associated with tumor-induced osteomalacia resulting from production of the phosphaturic hormone fibroblast growth factor 23. Because of its rarity, specific genetic alterations that contribute to the pathogenesis of these tumors have yet to be elucidated. Herein, we report the abnormal karyotypes from 2 cases of confirmed phosphaturic mesenchymal tumor of mixed connective tissue type. G-banded analysis demonstrated the first tumor to have a karyotype of 46,Y,t(X;3;14)(q13;p25;q21)[15]/46,XY[5], and the second tumor to have a karyotype of 46,XY,add(2)(q31),add(4)(q31.1)[2]/92,slx2[3]/46,sl,der(2)t(2;4)(q14.2;p14),der(4)t(2;4)(q14.2;p14),add(4)(q31.1)[10]/46,sdl,add(13)(q34)[4]/92,sdl2x2[1]. These represent what are, to our knowledge, the first examples of abnormal karyotypes obtained from phosphaturic mesenchymal tumor of mixed connective tissue type. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. [Cost-benefit analysis of the implementation of automated drug-dispensing systems in Critical Care and Emergency Units].

    PubMed

    Poveda Andrés, J L; García Gómez, C; Hernández Sansalvador, M; Valladolid Walsh, A

    2003-01-01

    To determine the monetary impact when traditional drug floor stocks are replaced by Automated Drug Dispensing Systems (ADDS) in the Medical Intensive Care Unit, the Surgical Intensive Care Unit, and the Emergency Room. We analysed four different flows considered to be determinant when implementing ADDS in a hospital environment: capital investment, staff costs, inventory costs, and costs related to drug use policies. Costs were estimated by calculating the net present value. The analysis shows that the expenses derived from the initial investment are compensated by the three remaining flows, with costs related to drug use policies showing the most substantial savings. Five years after the initial investment, global cash flows were estimated at 300,525 euros. Replacement of traditional floor stocks by ADDS in the Medical Intensive Care Unit, the Surgical Intensive Care Unit, and the Emergency Room produces a positive benefit/cost ratio (1.95).
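
    The underlying calculation is an ordinary net-present-value comparison of the cash flows. A minimal sketch, with placeholder figures rather than the study's data:

    def npv(rate, cashflows):
        """Discount a list of yearly cash flows (year 0 first) to present value."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    investment = -250_000                 # year-0 capital outlay, euros (hypothetical)
    yearly_savings = [110_000] * 5        # staff, inventory, and drug-policy savings
    print(round(npv(0.05, [investment] + yearly_savings)))   # positive => favourable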

  8. Publish unexpected results that conflict with assumptions

    USDA-ARS?s Scientific Manuscript database

    Some widely held scientific assumptions have been discredited, whereas others are just inappropriate for many applications. Sometimes, a widely-held analysis procedure takes on a life of its own, forgetting the original purpose of the analysis. The peer-reviewed system makes it difficult to get a pa...

  9. Prevalence and Correlates of Substance Use in Black, White, and Biracial Black-White Adolescents: Evidence for a Biracial Intermediate Phenomena

    PubMed Central

    Goings, Trenette Clark; Butler-Bente, Emily; McGovern, Tricia; Howard, Matthew O.

    2016-01-01

    Most substance-use prevention interventions are based on the implicit assumption that risk and protective factors for substance use are the same for biracial and monoracial youth. However, preliminary research suggests this assumption may be untrue. This study compared the prevalence and correlates of substance use among Black, White, and biracial Black-White youth. Data were derived from the National Longitudinal Study of Adolescent and Adult Health (Add Health), which is a longitudinal investigation using stratified random sampling to study health behaviors. After controlling for sociodemographic factors and using weighted Poisson and logistic regression, we found the substance-use prevalence rates of Black-White youth to be intermediate to the higher rates of Whites and lower rates of Blacks. In addition, Black-White youth’s scores on most covariates were intermediate to those of the monoracial groups. Family factors were more important in explaining higher substance use than other contextual factors. School factors seem to be important in explaining lower substance use for Black-White youth. Correlates of substance use for Black-White youth were not identical to those of either Black or White youth. More research on the observed intermediate phenomena among biracial youth vis-à-vis prevalence, correlates, and causes of substance use is needed. PMID:27427812

  10. One-year efficacy and safety of saxagliptin add-on in patients receiving dapagliflozin and metformin.

    PubMed

    Matthaei, S; Aggarwal, N; Garcia-Hernandez, P; Iqbal, N; Chen, H; Johnsson, E; Chin, A; Hansen, L

    2016-11-01

    Greater reductions in glycated haemoglobin (HbA1c) with saxagliptin, a dipeptidyl peptidase-4 inhibitor, versus placebo add-on in patients with type 2 diabetes who had inadequate glycaemic control with dapagliflozin 10 mg/d plus metformin were demonstrated after 24 weeks of treatment. Results over 52 weeks of treatment were assessed in this analysis. Patients (mean baseline HbA1c 7.9%) receiving open-label dapagliflozin 10 mg/d plus metformin were randomized to double-blind saxagliptin 5 mg/d or placebo add-on. The adjusted mean change from baseline to week 52 in HbA1c was greater with saxagliptin than with placebo add-on [-0.38% vs 0.05%; difference -0.42% (95% confidence interval -0.64, -0.20)]. More patients achieved the HbA1c target of <7% with saxagliptin than with placebo add-on (29% vs 13%), and fewer patients were rescued or discontinued the study for lack of glycaemic control with saxagliptin than with placebo add-on (19% vs 28%). Reductions from baseline in body weight (≤1.5 kg) occurred in both groups. Similar proportions of patients reported ≥1 adverse event with saxagliptin (58.2%) and placebo add-on (58.0%); no new safety signals were detected. Hypoglycaemia was infrequent in both treatment groups (≤2.5%), with no major episodes. The rate of urinary tract infections was similar in the saxagliptin and placebo add-on groups (7.8% vs 7.4%). The incidence of genital infections was 3.3% with saxagliptin versus 6.2% with placebo add-on. Triple therapy with saxagliptin add-on to dapagliflozin plus metformin for 52 weeks resulted in sustained improvements in glycaemic control without an increase in body weight or increased risk of hypoglycaemia. © 2016 John Wiley & Sons Ltd.

  11. Development of state and transition model assumptions used in National Forest Plan revision

    Treesearch

    Eric B. Henderson

    2008-01-01

    State and transition models are being utilized in forest management analysis processes to evaluate assumptions about disturbances and succession. These models assume valid information about seral class successional pathways and timing. The Forest Vegetation Simulator (FVS) was used to evaluate seral class succession assumptions for the Hiawatha National Forest in...

  12. Questionable Validity of Poisson Assumptions in a Combined Loglinear/MDS Mapping Model.

    ERIC Educational Resources Information Center

    Gleason, John M.

    1993-01-01

    This response to an earlier article on a combined log-linear/MDS model for mapping journals by citation analysis discusses the underlying assumptions of the Poisson model with respect to characteristics of the citation process. The importance of empirical data analysis is also addressed. (nine references) (LRW)

  13. Causal Mediation Analysis: Warning! Assumptions Ahead

    ERIC Educational Resources Information Center

    Keele, Luke

    2015-01-01

    In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…

  14. Developing a cardiopulmonary exercise testing laboratory.

    PubMed

    Diamond, Edward

    2007-12-01

    Cardiopulmonary exercise testing is a noninvasive and cost-effective technique that adds significant value to the assessment and management of a variety of symptoms and diseases. The penetration of this testing in medical practice may be limited by perceived operational and financial barriers. This article reviews coding and supervision requirements related to both simple and complex pulmonary stress testing. A program evaluation and review technique diagram is used to describe the work flow process. Data from our laboratory are used to generate an income statement that separates fixed and variable costs and calculates the contribution margin. A cost-volume-profit (break-even) analysis is then performed. Using data from our laboratory including fixed and variable costs, payer mix, reimbursements by payer, and the assumption that the studies are divided evenly between simple and complex pulmonary stress tests, the break-even number is calculated to be 300 tests per year. A calculator with embedded formulas has been designed by the author and is available on request. Developing a cardiopulmonary exercise laboratory is challenging but achievable and potentially profitable. It should be considered by a practice that seeks to distinguish itself as a quality leader. Providing this clinically valuable service may yield indirect benefits such as increased patient volume and increased utilization of other services provided by the practice. The decision for a medical practice to commit resources to managerial accounting support requires a cost-benefit analysis, but may be a worthwhile investment in our challenging economic environment.
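
    The break-even arithmetic itself is simple: fixed costs divided by the contribution margin per test. In the sketch below the input figures are placeholders chosen so that the result matches the reported 300 tests per year; the article's actual inputs are not given in the abstract.

    fixed_costs = 60_000            # annual equipment, space, and maintenance
    reimbursement_per_test = 400    # average across the payer mix
    variable_cost_per_test = 200    # staff time and disposables

    contribution_margin = reimbursement_per_test - variable_cost_per_test
    print(fixed_costs / contribution_margin)   # 300.0 tests per year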

  15. A reconstruction algorithm for three-dimensional object-space data using spatial-spectral multiplexing

    NASA Astrophysics Data System (ADS)

    Wu, Zhejun; Kudenov, Michael W.

    2017-05-01

    This paper presents a reconstruction algorithm for the Spatial-Spectral Multiplexing (SSM) optical system. The goal of this algorithm is to recover the three-dimensional spatial and spectral information of a scene, given that a one-dimensional spectrometer array is used to sample the pupil of the spatial-spectral modulator. The challenge of the reconstruction is that the non-parametric representation of the three-dimensional spatial and spectral object requires a large number of variables, thus leading to an underdetermined linear system that is hard to uniquely recover. We propose to reparameterize the spectrum using B-spline functions to reduce the number of unknown variables. Our reconstruction algorithm then solves the improved linear system via a least-squares optimization of such B-spline coefficients with additional spatial smoothness regularization. The ground truth object and the optical model for the measurement matrix are simulated with both spatial and spectral assumptions according to a realistic field of view. In order to test the robustness of the algorithm, we add Poisson noise to the measurement and test on both two-dimensional and three-dimensional spatial and spectral scenes. Our analysis shows that the root mean square error of the recovered results can be achieved within 5.15%.
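
    The reparameterization-plus-regularization idea can be illustrated independently of the SSM optics. The sketch below (with a random stand-in system matrix and Gaussian bumps standing in for the B-spline basis) solves min ||ABc - m||^2 + lambda ||Dc||^2 by stacked least squares, where D is a second-difference roughness penalty.

    import numpy as np

    rng = np.random.default_rng(4)
    n_meas, n_unk = 120, 200                      # fewer measurements than unknowns
    A = rng.normal(size=(n_meas, n_unk))          # stand-in for the optical model

    grid = np.linspace(0, 1, n_unk)
    centers = np.linspace(0, 1, 20)               # 20 basis functions, not 200 unknowns
    B = np.exp(-0.5 * ((grid[:, None] - centers[None, :]) / 0.05) ** 2)

    x_true = np.sin(3 * np.pi * grid) ** 2        # smooth synthetic object
    m = A @ x_true + rng.normal(0, 0.05, n_meas)  # noisy measurement

    D = np.diff(np.eye(centers.size), 2, axis=0)  # second-difference penalty matrix
    lam = 1.0
    lhs = np.vstack([A @ B, np.sqrt(lam) * D])    # stack data-fit and penalty rows
    rhs = np.concatenate([m, np.zeros(D.shape[0])])
    c, *_ = np.linalg.lstsq(lhs, rhs, rcond=None)

    x_hat = B @ c
    print(f"relative error: {np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true):.3f}")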

  16. Prowess - A Software Model for the Ooty Wide Field Array

    NASA Astrophysics Data System (ADS)

    Marthi, Visweshwar Ram

    2017-03-01

    One of the scientific objectives of the Ooty Wide Field Array (OWFA) is to observe the redshifted H i emission from z ~ 3.35. Although predictions spell out optimistic outcomes in reasonable integration times, these studies were based purely on analytical assumptions, without accounting for limiting systematics. A complete software model for OWFA has been developed with a view to understanding instrument-induced systematics. This model has been implemented through a suite of programs, together called Prowess, which has been conceived in the dual role of an emulator and observatory data analysis software. The programming philosophy followed in building Prowess enables a general user to define their own set of functions and add new functionality. This paper describes a co-ordinate system suitable for OWFA, in which the baselines are defined. The foregrounds are simulated from their angular power spectra. The visibilities are then computed from the foregrounds. These visibilities are then used for further processing, such as calibration and power spectrum estimation. The package allows for rich visualization features in multiple output formats in an interactive fashion, giving the user an intuitive feel for the data. Prowess has been extensively used for numerical predictions of the foregrounds for the OWFA H i experiment.

  17. [Estimation of individual breast cancer risk: relevance and limits of risk estimation models].

    PubMed

    De Pauw, A; Stoppa-Lyonnet, D; Andrieu, N; Asselain, B

    2009-10-01

    Several risk estimation models for breast or ovarian cancer have been developed over the last few decades. All these models take into account family history, with different levels of sophistication. The Gail model, developed in 1989, takes into account family history (0, 1 or ≥ 2 affected relatives) and several environmental factors. In 1990, the Claus model was the first to integrate explicit assumptions about genetic effects, assuming a single dominantly inherited gene occurring at low frequency in the population. The BRCAPRO model, developed after the identification of BRCA1 and BRCA2, assumes transmission restricted to these two dominantly inherited genes. The BOADICEA model adds the effect of a polygenic component to the effects of BRCA1 and BRCA2 to explain the residual clustering of breast cancer. Finally, the IBIS model assumes a third dominantly inherited gene to explain this residual clustering; moreover, this model incorporates environmental factors. We applied the Claus, BRCAPRO, BOADICEA and IBIS models to four clinical situations, corresponding to more or less extensive family histories, in order to study the consistency of the risk estimates. The three more recent models (BRCAPRO, BOADICEA and IBIS) gave the closest estimates. These estimates could be useful in clinical practice when interpreting complex breast and/or ovarian cancer family histories.

  18. Increasing operating room efficiency through electronic medical record analysis.

    PubMed

    Attaallah, A F; Elzamzamy, O M; Phelps, A L; Ranganthan, P; Vallejo, M C

    2016-05-01

    We used electronic medical record (EMR) analysis to determine errors in operating room (OR) time utilisation. Over a two-year period, EMR data on 44,503 surgical procedures were analysed for OR duration, on-time, first-case, and add-on time performance within 19 surgical specialties. Maximal OR time utilisation at our institution could have saved over 302,620 min (over 5,044 hours) of OR time over the two-year period. Most specialties (78.95%) had inaccurately scheduled procedure times and therefore used the OR for longer than their scheduled allotment. Significant differences occurred between the mean scheduled surgical durations (101.38 ± 87.11 min) and actual durations (108.18 ± 102.27 min; P < 0.001). Significant differences also occurred between the mean scheduled add-on durations (111.4 ± 75.5 min) and the actual add-on durations (118.6 ± 90.1 min; P < 0.001). EMR quality improvement analysis can be used to determine scheduling error and bias, in order to improve efficiency and increase OR time utilisation.

  19. Health and labour force participation of older people in Europe: what do objective health indicators add to the analysis?

    PubMed

    Kalwij, Adriaan; Vermeulen, Frederic

    2008-05-01

    This paper studies labour force participation of older individuals in 11 European countries. The data are drawn from the new Survey of Health, Ageing and Retirement in Europe (SHARE). We examine the value added of objective health indicators in relation to potentially endogenous self-reported health. We approach the endogeneity of self-reported health as an omitted variables problem. In line with the literature on the reliability of self-reported health, ambiguous results are obtained. In some countries self-reported health does a fairly good job and controlling for objective health indicators does not add much to the analysis. In other countries, however, the results show that objective health indicators add significantly to the analysis and that self-reported health is endogenous due to omitted objective health indicators. These latter results illustrate the multi-dimensional nature of health and the need to control for objective health indicators when analysing the relation between health status and labour force participation. This makes an instrumental variables approach to dealing with the endogeneity of self-reported health less appropriate.
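
    The omitted-variables mechanism is easy to reproduce in a toy simulation (invented variables, not SHARE data): when the objective indicator is left out, its effect loads onto noisy self-reported health, biasing that coefficient.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 5000
    objective = rng.normal(size=n)                        # e.g. grip strength
    self_rep = objective + rng.normal(size=n)             # noisy self-report
    work = 0.8 * objective + rng.normal(size=n)           # driven by true health

    short = sm.OLS(work, sm.add_constant(self_rep)).fit()
    full = sm.OLS(work, sm.add_constant(np.c_[self_rep, objective])).fit()
    print(short.params[1])    # around 0.4: absorbs the omitted indicator's effect
    print(full.params[1:])    # near [0, 0.8] once the indicator is included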

  20. Design data needs modular high-temperature gas-cooled reactor. Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1987-03-01

    The Design Data Needs (DDNs) provide summary statements, for program management, of the designer's need for experimental data to confirm or validate assumptions made in the design. These assumptions were developed using the Integrated Approach and are tabulated in the Functional Analysis Report. These assumptions were also necessary in the analyses or trade studies (A/TS) used to develop selections of hardware design or design requirements. Each DDN includes statements providing traceability to the function and the associated assumption that gives rise to the need.

  1. The Impact of Multiple Endpoint Dependency on "Q" and "I"[superscript 2] in Meta-Analysis

    ERIC Educational Resources Information Center

    Thompson, Christopher Glen; Becker, Betsy Jane

    2014-01-01

    A common assumption in meta-analysis is that effect sizes are independent. When correlated effect sizes are analyzed using traditional univariate techniques, this assumption is violated. This research assesses the impact of dependence arising from treatment-control studies with multiple endpoints on homogeneity measures "Q" and…
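
    For reference, both statistics come from the fixed-effect weighted mean: Q = sum_i w_i (y_i - y_bar)^2 with w_i = 1/v_i, and I^2 = max(0, (Q - df)/Q). The toy values below are invented; dependence among endpoints violates the independence these formulas assume, which is exactly the distortion the study quantifies.

    import numpy as np

    y = np.array([0.30, 0.45, 0.12, 0.50, 0.28])   # study effect sizes (toy)
    v = np.array([0.02, 0.03, 0.02, 0.05, 0.04])   # their sampling variances (toy)

    w = 1 / v
    y_bar = np.sum(w * y) / np.sum(w)              # fixed-effect weighted mean
    Q = np.sum(w * (y - y_bar) ** 2)
    df = len(y) - 1
    I2 = max(0.0, (Q - df) / Q) * 100              # percent variation beyond chance
    print(f"Q = {Q:.2f} on {df} df, I^2 = {I2:.1f}%")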

  2. Rationality as the Basic Assumption in Explaining Japanese (or Any Other) Business Culture.

    ERIC Educational Resources Information Center

    Koike, Shohei

    Economic analysis, with its explicit assumption that people are rational, is applied to the Japanese and American business cultures to illustrate how the approach is useful for understanding cultural differences. Specifically, differences in cooperative behavior among Japanese and American workers are examined. Economic analysis goes beyond simple…

  3. Formalization and Analysis of Reasoning by Assumption

    ERIC Educational Resources Information Center

    Bosse, Tibor; Jonker, Catholijn M.; Treur, Jan

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically…

  4. How biological background assumptions influence scientific risk evaluation of stacked genetically modified plants: an analysis of research hypotheses and argumentations.

    PubMed

    Rocca, Elena; Andersen, Fredrik

    2017-08-14

    Scientific risk evaluations are constructed by specific evidence, value judgements and biological background assumptions. The latter are the framework-setting suppositions we apply in order to understand some new phenomenon. That background assumptions co-determine choice of methodology, data interpretation, and choice of relevant evidence is an uncontroversial claim in modern basic science. Furthermore, it is commonly accepted that, unless explicated, disagreements in background assumptions can lead to misunderstanding as well as miscommunication. Here, we extend the discussion on background assumptions from basic science to the debate over genetically modified (GM) plants risk assessment. In this realm, while the different political, social and economic values are often mentioned, the identity and role of background assumptions at play are rarely examined. We use an example from the debate over risk assessment of stacked genetically modified plants (GM stacks), obtained by applying conventional breeding techniques to GM plants. There are two main regulatory practices of GM stacks: (i) regulate as conventional hybrids and (ii) regulate as new GM plants. We analyzed eight papers representative of these positions and found that, in all cases, additional premises are needed to reach the stated conclusions. We suggest that these premises play the role of biological background assumptions and argue that the most effective way toward a unified framework for risk analysis and regulation of GM stacks is by explicating and examining the biological background assumptions of each position. Once explicated, it is possible to either evaluate which background assumptions best reflect contemporary biological knowledge, or to apply Douglas' 'inductive risk' argument.

  5. Chance of reimbursement for ADD-ON therapies in Poland and in the world - review of the reimbursement recommendations

    PubMed

    Borowiack, Ewa; Marzec, Magdalena; Nowotarska, Anna; Jarosz, Joanna; Orkisz, Agata; Prząda-Machno, Patrycja

    2018-01-01

    Oncology drugs combined with standard therapies (so-called add-on therapies, e.g. bevacizumab, palbociclib) often receive negative recommendations regarding the legitimacy of public financing, issued by the government agencies responsible for their assessment, i.e. health technology assessment agencies. The aim of the study was to estimate the scale of the problem related to the reimbursement of add-on therapies used in the treatment of breast and genitourinary cancers in Poland and in the world. A multimodal approach was used to select add-on therapies. The reimbursement routes were analysed in 8 reference countries (Poland, Canada, England, Wales, France, Scotland, Australia, New Zealand). Based on a systematic search, data for breast and genitourinary cancers were included. A total of 68 reimbursement documents for add-on therapies were identified. The analysis showed that in Poland 20% of innovative schemes including add-on therapies received positive reimbursement recommendations, while worldwide the percentage of positive recommendations reaches 56%. Globally (including the data for Poland), the chance of a favourable reimbursement recommendation for add-on therapies is 53%, with 29% being positive recommendations with limitations. In Poland, negative recommendations more often concern genitourinary cancers than breast cancer (83% vs 75%). Poland leads the reference countries in the number of negative reimbursement recommendations. Bearing in mind the worldwide need to modify the criteria for the evaluation of oncological therapies in the context of the possibility of their reimbursement, a change in the approach to assessing the legitimacy of financing innovative add-on therapies should also be expected in Poland.

  6. Evaluation of add-on devices for the prevention of phlebitis and other complications associated with the use of peripheral catheters in hospitalised adults: a randomised controlled study.

    PubMed

    Martínez, J A; Piazuelo, M; Almela, M; Blecua, P; Gallardo, R; Rodríguez, S; Escalante, Z; Robau, M; Trilla, A

    2009-10-01

    The aim of this study was to assess the role of add-on devices for the prevention of phlebitis and other complications associated with the use of peripheral catheters. Patients admitted to an infectious diseases ward and requiring the insertion of a peripheral catheter for at least 24h were randomly allocated to be managed with or without add-on devices. Incidence of phlebitis and all complications were the primary outcomes. Extravasation, inadvertent withdrawal, obstruction and rupture were considered to be mechanical complications, and analysis was performed using survival methods. Of 683 evaluated catheters, 351 were allocated to the add-on device arm and 332 to the control arm. Despite randomisation, patients in the add-on device group were older (P=0.048), less likely to have human immunodeficiency virus (P=0.02) and more likely to have received antibiotics (P=0.05). After adjustment for these variables, the hazard ratio for phlebitis remained non-significant (hazard ratio: 0.95; 95% confidence interval: 0.7-1.3), but the risk of mechanical complications became lower in the add-on device arm (0.68; 0.5-0.94). This translated into a trend towards a lower risk of any complication (0.83; 0.67-1.01). The beneficial effect on mechanical or all complications was noticeable after six days of catheterisation. Add-on devices do not reduce the incidence of phlebitis but may prevent mechanical complications. However, the impact of add-on devices on the incidence of all complications is at most small and only apparent after the sixth day of catheter use.

  7. Methods of separation of variables in turbulence theory

    NASA Technical Reports Server (NTRS)

    Tsuge, S.

    1978-01-01

    Two schemes for closing the turbulent moment equations are proposed, both of which allow the double-correlation equations to be separated into single-point equations. The first is based on neglecting triple correlation, and leads to an equation that differs from the small-perturbation gasdynamic equations in that the separation constant appears as the frequency. Grid-produced turbulence is described in this light as time-independent, cylindrically isotropic turbulence. Application to wall turbulence, guided by a new asymptotic method for the Orr-Sommerfeld equation, reveals a neutrally stable mode of essentially three-dimensional nature. The second closure scheme is based on an assumption of identity of the separated variables through which triple and quadruple correlations are formed. The resulting equation adds, to its counterpart from the first scheme, a nonlinear convolution integral in frequency that describes the role of triple correlation in direct energy cascading.

  8. Reply [to “Comments on “Which one is correct, 2000 or 2001? How about 1995?’”

    NASA Astrophysics Data System (ADS)

    Veronis, George

    Reply to Randall's comment: The millennium argument occurs mostly because people ask different questions, one based on numerology (start from 1, add 2000, and get 2001) and the other based on measuring years from the time that Christ was born. The people who base their arguments on numerology invariably refer to the absence of a year zero. That suffices to get 2001. If that is the issue, then there is no arguing against 2001. But in my mind, that is not the issue. So John Randall and I are answering different questions. The arguments used by the numerologists are based on the assumption that the rest of us don't know how to count. They really ought to consider the fact that a different question may be more to the point.

  9. Weighing In: The "Evidence of Experience" and Canadian Fat Women's Activism.

    PubMed

    Ellison, Jenny

    2013-01-01

This article adds a historical dimension to the developing literature on "obesity stigma": negative treatment and discrimination experienced as a consequence of the belief that overweight people are lazy and lack willpower and basic knowledge about nutrition. Interviews with women who identified as fat suggest that medical and cultural concerns about weight were conflated in their interactions with doctors, peers, and family. Stigma was a cause of frustration and despair for those deemed obese, who felt that unfair assumptions were made about their lifestyle and their abilities. In response, the women interviewed formed organizations, exercise classes, and social activities for "fat women only." Fat activists offer unique insight, because their work sheds light not only on the impact of obesity stigma but also on how some women responded to and resisted the medicalization and objectification of their bodies.

  10. The Individual and Family Self-Management Theory: background and perspectives on context, process, and outcomes.

    PubMed

    Ryan, Polly; Sawin, Kathleen J

    2009-01-01

    Current evidence indicates that individuals and families who engage in self-management (SM) behaviors improve their health outcomes. While the results of these studies are promising, there is little agreement as to the critical components of SM or directions for future study. This article offers an organized perspective of similar and divergent ideas related to SM. Unique contributions of prior work are highlighted and findings from studies are summarized. A new descriptive mid-range theory, Individual and Family Self-management Theory, is presented; assumptions are identified, concepts defined, and proposed relationships are outlined. This theory adds to the literature on SM by focusing on individuals, dyads within the family, or the family unit as a whole; explicating process components of SM; and proposing use of proximal and distal outcomes.

  11. The Individual and Family Self-management Theory: Background and Perspectives on Context, Process, and Outcomes

    PubMed Central

    Ryan, Polly; Sawin, Kathleen J.

    2009-01-01

    Current evidence indicates that individuals and families who engage in self-management (SM) behaviors improve their health outcomes. While the results of these studies are promising, there is little agreement as to the critical components of SM or directions for future study. This paper offers an organized perspective of similar and divergent ideas related to SM. Unique contributions of prior work are highlighted and findings from studies are summarized. A new descriptive mid-range theory, Individual and Family Self-management Theory, is presented; assumptions identified, concepts defined, and proposed relationships outlined. This theory adds to the literature on self-management by focusing on individual, dyads within the family, or the family unit as a whole; explicating process components of self-management; and proposing use of proximal and distal outcomes. PMID:19631064

  12. Advanced Small Modular Reactor Economics Model Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Thomas J.

    2014-10-01

The US Department of Energy Office of Nuclear Energy’s Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels, and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation, controls, and human-machine interfaces. This report focuses on development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, this report describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo–based methods are commonly used to handle uncertainty, especially when implemented by a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler and an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing the model in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo–based method. Using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses, or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires simplifying assumptions. These assumptions do not necessarily bring the analytical results into question. In fact, the analysis shows that the propagation-of-error method introduces essentially negligible error, especially when compared to the uncertainty associated with some of the estimates themselves. The results of these uncertainty analyses generally quantify and identify the sources of uncertainty in the overall cost estimation. The obvious generalization, that capital cost uncertainty is the main driver, can be shown to be accurate for the current state of reactor cost analysis. However, the detailed component-by-component analysis helps to demonstrate which components would benefit most from research and development to decrease the uncertainty, as well as which components would benefit from research and development to decrease the absolute cost.
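
    As a hedged illustration of the comparison described above, the sketch below contrasts a Monte Carlo estimate of total-cost uncertainty with a first-order propagation-of-error estimate for a simple sum of independent component costs; the component names and dollar figures are hypothetical, not ORNL estimates.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical component cost estimates: (mean, standard deviation) in $M.
        components = {"capital": (3000.0, 600.0),
                      "o_and_m": (400.0, 80.0),
                      "fuel":    (250.0, 30.0)}
        means = np.array([m for m, s in components.values()])
        sds = np.array([s for m, s in components.values()])

        # Monte Carlo: sample each component cost and sum across components.
        samples = rng.normal(means, sds, size=(100_000, len(means))).sum(axis=1)
        mc_mean, mc_sd = samples.mean(), samples.std(ddof=1)

        # Propagation of error: for a sum of independent terms, the variance
        # of the total is the sum of the component variances.
        poe_mean = means.sum()
        poe_sd = np.sqrt((sds ** 2).sum())

        print(f"Monte Carlo:          {mc_mean:8.1f} +/- {mc_sd:.1f}")
        print(f"Propagation of error: {poe_mean:8.1f} +/- {poe_sd:.1f}")

    For a purely additive cost model the two approaches agree closely, consistent with the report's finding that the propagation-of-error simplification introduces negligible error.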

  13. Technical Communications in Aeronautics: Results of an Exploratory Study. An Analysis of Managers' and Nonmanagers' Responses. NASA Technical Memorandum 101625.

    ERIC Educational Resources Information Center

    Pinelli, Thomas E.; And Others

    Data collected from an exploratory study concerned with the technical communications practices of aerospace engineers and scientists were analyzed to test the primary assumption that aerospace managers and nonmanagers have different technical communications practices. Five secondary assumptions were established for the analysis: (1) that the…

  14. An Evaluation of Normal versus Lognormal Distribution in Data Description and Empirical Analysis

    ERIC Educational Resources Information Center

    Diwakar, Rekha

    2017-01-01

Many existing methods of statistical inference and analysis rely heavily on the assumption that the data are normally distributed. However, the normality assumption is not fulfilled for data that cannot take negative values or are otherwise skewed--a common occurrence in diverse disciplines such as finance, economics, political…
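
    To make the point concrete, here is a minimal sketch (not from the article) showing how a strictly positive, skewed sample fails a normality test on the raw scale but passes after a log transform:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Skewed, strictly positive data (e.g., incomes), as described above.
        x = rng.lognormal(mean=10.0, sigma=1.0, size=500)

        # Shapiro-Wilk typically rejects normality on the raw scale...
        print("raw p-value:", stats.shapiro(x).pvalue)        # usually << 0.05

        # ...but not after a log transform, consistent with a lognormal model.
        print("log p-value:", stats.shapiro(np.log(x)).pvalue)  # usually > 0.05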

  15. Application of random survival forests in understanding the determinants of under-five child mortality in Uganda in the presence of covariates that satisfy the proportional and non-proportional hazards assumption.

    PubMed

    Nasejje, Justine B; Mwambi, Henry

    2017-09-07

Uganda, like many other Sub-Saharan African countries, has a high under-five child mortality rate. To inform policy on intervention strategies, sound statistical methods are required to critically identify the factors strongly associated with under-five child mortality rates. The Cox proportional hazards model has been a common choice for analysing such data, taking age as the time-to-event variable. However, because of its restrictive proportional hazards (PH) assumption, covariates of interest that do not satisfy the assumption are often excluded from the analysis to avoid mis-specifying the model, since using covariates that clearly violate the assumption would yield invalid results. Survival trees and random survival forests are increasingly popular for analysing survival data, particularly large survey data, and are attractive alternatives to models with the restrictive PH assumption. In this article, we adopt random survival forests, which have not previously been used to study the factors affecting under-five child mortality rates in Uganda, using Demographic and Health Survey data. The first part of the analysis is based on the classical Cox PH model; the second part is based on random survival forests in the presence of covariates that do not necessarily satisfy the PH assumption. Random survival forests and the Cox PH model agree that the sex of the household head, the sex of the child, and the number of births in the past year are strongly associated with under-five child mortality in Uganda; all three covariates satisfy the PH assumption. Random survival forests further demonstrated that covariates originally excluded from the earlier analysis because they violate the PH assumption are important in explaining under-five child mortality rates: the number of children under the age of five in a household, the number of births in the past 5 years, wealth index, total number of children ever born, and the child's birth order. The results further indicated that the predictive performance of random survival forests built using all covariates, including those that violate the PH assumption, was higher than that of random survival forests built using only the covariates that satisfy the PH assumption. Random survival forests are appealing methods for analysing public health data to understand the factors strongly associated with under-five child mortality rates, especially in the presence of covariates that violate the proportional hazards assumption.
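
    The article's analysis is not reproduced here, but the following sketch shows the kind of PH-assumption check that motivates it, assuming the lifelines package is available; the data frame and column names are hypothetical stand-ins for DHS variables.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical extract: age at death or censoring in months, an event
        # indicator, and two covariates. Not actual DHS data.
        df = pd.DataFrame({
            "age_months": [12, 34, 59, 8, 60, 45, 22, 60, 5, 41, 60, 17],
            "died":       [1,  1,  1,  1, 0,  1,  1,  0,  1, 1,  0,  1],
            "sex_child":  [0,  1,  0,  1, 0,  1,  1,  0,  0, 1,  0,  1],
            "wealth_idx": [1,  3,  5,  1, 4,  2,  2,  5,  1, 3,  5,  2],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="age_months", event_col="died")

        # Schoenfeld-residual-based check of the PH assumption; covariates that
        # fail could be retained in a random survival forest instead of dropped.
        cph.check_assumptions(df, p_value_threshold=0.05)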

  16. Memantine add-on to antipsychotic treatment for residual negative and cognitive symptoms of schizophrenia: a meta-analysis.

    PubMed

    Kishi, Taro; Matsuda, Yuki; Iwata, Nakao

    2017-07-01

We examined whether memantine add-on to antipsychotic treatment is beneficial in schizophrenia treatment. This systematic review and meta-analysis aimed to provide stronger evidence on the efficacy and safety of memantine add-on for treating schizophrenia. We analyzed double-blind, randomized, placebo-controlled trials of memantine add-on treatment in schizophrenia patients receiving antipsychotics. The primary outcomes were amelioration of negative symptoms and all-cause discontinuation. Dichotomous outcomes are presented as risk ratios (RRs), and continuous outcomes as mean differences (MDs) or standardized mean differences (SMDs). Eight studies (n = 448) were included. Although memantine add-on treatment was superior to placebo for ameliorating negative symptoms (SMD = -0.96, p = 0.006, I² = 88%; N = 7, n = 367), the Positive and Negative Syndrome Scale general subscale score (MD = -1.62, p = 0.002, I² = 0%; N = 4, n = 151), and the Mini-Mental State Examination score (MD = -3.07, p < 0.0001, I² = 21%; N = 3, n = 83), there were no statistically significant differences in the amelioration of overall (SMD = -0.75, p = 0.06, I² = 86%; N = 5, n = 271), positive (SMD = -0.46, p = 0.07, I² = 80%; N = 7, n = 367), or depressive symptoms (SMD = -0.127, p = 0.326, I² = 0%; N = 4, n = 201); all-cause discontinuation (RR = 1.34, p = 0.31, I² = 0%; N = 8, n = 448); or individual adverse events (fatigue, dizziness, headache, nausea, constipation) between the groups. For negative symptoms, the significant heterogeneity disappeared when risperidone studies alone were considered (I² = 0%); memantine add-on treatment remained superior to placebo (SMD = -1.29, p = 0.00001). Meta-regression analysis showed that patient age was associated with memantine-associated amelioration of negative symptoms (slope = 0.171, p = 0.0206). Memantine add-on treatment may be beneficial for treating psychopathological symptoms (especially negative symptoms) in schizophrenia patients, and the negative-symptom effect size may be greater in younger adult schizophrenia patients.
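
    The pooling itself follows standard random-effects meta-analysis; below is a hand-coded DerSimonian-Laird sketch with hypothetical per-study SMDs and variances, not the trial data analyzed above.

        import numpy as np

        # Hypothetical per-study effects: SMD and its sampling variance.
        smd = np.array([-1.2, -0.4, -1.5, -0.8])
        var = np.array([0.10, 0.08, 0.15, 0.12])

        w = 1.0 / var                                     # fixed-effect weights
        fe_mean = np.sum(w * smd) / w.sum()
        q = np.sum(w * (smd - fe_mean) ** 2)              # Cochran's Q
        df = len(smd) - 1
        c = w.sum() - np.sum(w ** 2) / w.sum()
        tau2 = max(0.0, (q - df) / c)                     # between-study variance
        i2 = max(0.0, (q - df) / q) * 100                 # heterogeneity I^2

        w_re = 1.0 / (var + tau2)                         # random-effects weights
        pooled = np.sum(w_re * smd) / w_re.sum()
        se = np.sqrt(1.0 / w_re.sum())
        print(f"pooled SMD = {pooled:.2f} "
              f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f}), "
              f"I^2 = {i2:.0f}%")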

  17. Measurement of Gamma-Irradiated Corneal Patch Graft Thickness After Aqueous Drainage Device Surgery.

    PubMed

    de Luna, Regina A; Moledina, Ameera; Wang, Jiangxia; Jampel, Henry D

    2017-09-01

Exposure of the tube of an aqueous drainage device (ADD) through the conjunctiva is a serious complication of ADD surgery. Although placement of gamma-irradiated sterile cornea (GISC) as a patch graft over the tube is commonly performed, exposures still occur. To measure GISC patch graft thickness as a function of time after surgery, estimate the rate of graft thinning, and determine risk factors for graft thinning, a cross-sectional study of graft thickness using anterior segment optical coherence tomography (AS-OCT) was conducted at the Wilmer Eye Institute at Johns Hopkins Hospital. A total of 107 patients (120 eyes, 120 ADDs) 18 years or older who underwent ADD surgery at Johns Hopkins with a GISC patch graft between July 1, 2010, and October 31, 2016, were enrolled. The intervention was implantation of an ADD with placement of a GISC patch graft over the tube; the outcomes were graft thickness versus time after ADD surgery and risk factors for an undetectable graft. Of the 107 patients included in the analysis, the mean (SD) age of the cohort was 64 (16.2) years, 49 (45.8%) were male, and 43 (40.2%) were African American. The mean time of measurement after surgery was 1.7 years (range, 1 day to 6 years). Thinner grafts were observed as the time after surgery lengthened (β regression coefficient, -60 µm per year since surgery; 95% CI, -80 µm to -40 µm). The odds ratio of an undetectable graft per year after ADD surgery was 2.1 (95% CI, 1.5-3.0; P < .001). Age, sex, race, type of ADD, quadrant of ADD placement, diagnosis of uveitis or dry eye, and prior conjunctival surgery were not correlated with the presence or absence of the graft. Gamma-irradiated sterile corneal patch grafts do not always retain their integrity after ADD surgery. Data from this cross-sectional study showed that, on average, the longer the time after surgery, the thinner the graft. These findings suggest that placement of a GISC patch graft is no guarantee against tube exposure and that better strategies are needed for preventing this complication.

  18. Diagnosis checking of statistical analysis in RCTs indexed in PubMed.

    PubMed

    Lee, Paul H; Tse, Andy C Y

    2017-11-01

Statistical analysis is essential for reporting the results of randomized controlled trials (RCTs), as well as for evaluating their effectiveness. However, the validity of a statistical analysis also depends on whether the assumptions of that analysis are valid. Our aim was to review all RCTs published in journals indexed in PubMed during December 2014 to provide a complete picture of how RCTs handle the assumptions of statistical analysis. We reviewed all RCTs published in December 2014 that appeared in journals indexed in PubMed, using the Cochrane highly sensitive search strategy. The 2014 impact factors of the journals were used as proxies for their quality. The type of statistical analysis used and whether the assumptions of the analysis were tested were reviewed. In total, 451 papers were included. Of the 278 papers that reported a crude analysis for the primary outcomes, 31 (27·2%) reported whether the outcome was normally distributed. Of the 172 papers that reported an adjusted analysis for the primary outcomes, diagnosis checking was rarely conducted, with only 20%, 8·6%, and 7% of papers reporting checks for the generalized linear model, Cox proportional hazards model, and multilevel model, respectively. Study characteristics (study type, drug trial, funding sources, journal type, and endorsement of the CONSORT guidelines) were not associated with the reporting of diagnosis checking. Diagnosis checking of statistical analyses in RCTs published in PubMed-indexed journals was usually absent. Journals should provide guidelines about the reporting of assumption diagnostics. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.

  19. Mortality of Dandy-Walker syndrome in the United States: Analysis by race, gender, and insurance status

    PubMed Central

    McClelland, Shearwood; Ukwuoma, Onyinyechi I.; Lunos, Scott; Okuyemi, Kolawole S.

    2015-01-01

Background: Dandy-Walker syndrome (DWS) is a congenital disorder often diagnosed in early childhood. Typically manifesting with signs/symptoms of increased intracranial pressure, DWS is catastrophic unless timely neurosurgical care can be administered via cerebrospinal fluid (CSF) drainage. The rates of mortality, adverse discharge disposition (ADD), and CSF drainage in DWS may not be uniform across race, gender, or insurance status; such differences could reflect disparities in access to neurosurgical care. This study examines these issues on a nationwide level. Materials and Methods: The Kids’ Inpatient Database spanning 1997-2003 was used for analysis. Only patients admitted for DWS (ICD-9-CM = 742.3) were included. Multivariate analysis was adjusted for several variables, including patient age, race, sex, admission type, primary payer, income, and hospital volume. Results: More than 14,000 DWS patients were included. Increasing age predicted reduced mortality (OR = 0.87; P < 0.05), reduced ADD (OR = 0.96; P < 0.05), and decreased likelihood of receiving CSF drainage (OR = 0.86; P < 0.0001). Elective admission type predicted reduced mortality (OR = 0.29; P = 0.0008), reduced ADD (OR = 0.68; P < 0.05), and increased CSF drainage (OR = 2.02; P < 0.0001). African-American race (OR = 1.20; P < 0.05) and private insurance (OR = 1.18; P < 0.05) each predicted increased likelihood of receiving CSF drainage but were not predictors of mortality or ADD. Gender, income, and hospital volume were not significant predictors of DWS outcome. Conclusion: Increasing age and elective admission each decrease the mortality and ADD associated with DWS. African-American race and private insurance status increase access to CSF drainage. These findings contradict previous literature citing African-American race as a risk factor for mortality in DWS and emphasize the role of private insurance in obtaining access to potentially lifesaving operative care. PMID:25883477

  20. Dynamic Network-Based Epistasis Analysis: Boolean Examples

    PubMed Central

    Azpeitia, Eugenio; Benítez, Mariana; Padilla-Longoria, Pablo; Espinosa-Soto, Carlos; Alvarez-Buylla, Elena R.

    2011-01-01

In this article we focus on how the hierarchical and single-path assumptions of epistasis analysis can bias the inference of gene regulatory networks. Here we emphasize the critical importance of dynamic analyses, and specifically illustrate the use of Boolean network models. Epistasis in a broad sense refers to gene interactions; however, as originally proposed by Bateson, epistasis is defined as the blocking of a particular allelic effect due to the effect of another allele at a different locus (herein, classical epistasis). Classical epistasis analysis has proven powerful and useful, allowing researchers to infer and assign directionality to gene interactions. As larger data sets become available, the analysis of classical epistasis is being complemented with computer science tools and systems biology approaches. We show that when the hierarchical and single-path assumptions are not met in classical epistasis analysis, access to relevant information and the correct inference of gene interaction topologies are hindered, and it becomes necessary to consider the temporal dynamics of gene interactions. The use of dynamical networks can overcome these limitations. We particularly focus on the use of Boolean networks, which, like classical epistasis analysis, rely on logical formalisms and hence can complement classical epistasis analysis and relax its assumptions. We develop a couple of theoretical examples and analyze them from a dynamic Boolean network model perspective. Boolean networks could help to guide additional experiments and discern among alternative regulatory schemes that would be impossible or difficult to infer without the elimination of these assumptions from classical epistasis analysis. We also use examples from the literature to show how a Boolean network-based approach has resolved ambiguities and guided epistasis analysis. Our article complements previous accounts, not only by focusing on the implications of the hierarchical and single-path assumptions, but also by demonstrating the importance of considering temporal dynamics, specifically introducing the usefulness of Boolean network models and reviewing some key properties of network approaches. PMID:22645556
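
    As a concrete illustration of the dynamic view the authors advocate, here is a toy Boolean network whose attractors are enumerated by exhaustive iteration; the three-gene update rules are hypothetical, chosen only to illustrate non-hierarchical, multi-path regulation, and are not from the article.

        from itertools import product

        def step(state):
            a, b, c = state
            return (b and not c,   # A is activated by B, blocked by C
                    a or c,        # B responds to either A or C (two paths)
                    a and b)       # C needs both A and B

        # Enumerate attractors by iterating every initial condition to a cycle.
        attractors = set()
        for init in product([0, 1], repeat=3):
            seen, s = [], tuple(init)
            while s not in seen:
                seen.append(s)
                s = tuple(int(v) for v in step(s))
            cycle = tuple(seen[seen.index(s):])      # the recurring part
            # Canonicalize the cycle by its minimal rotation before storing.
            attractors.add(min(cycle[i:] + cycle[:i] for i in range(len(cycle))))

        print("attractors:", attractors)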

  1. Project Air Force, Annual Report 2003

    DTIC Science & Technology

    2003-01-01

to Simulate Personnel Retention The CAPM system is based on a simple assumption about employee retention: A rational individual faced with the...analysis to certain parts of the force. CAPM keeps a complete record of the assumptions, policies, and data used for each scenario. Thus decisionmakers...premises and assumptions. Instead, the Commission concluded that space is a separate operating arena equivalent to the air, land, and maritime

  2. Local linear discriminant analysis framework using sample neighbors.

    PubMed

    Fan, Zizhu; Xu, Yong; Zhang, David

    2011-07-01

Linear discriminant analysis (LDA) is a very popular linear feature extraction approach. LDA algorithms usually perform well under the following two assumptions. The first is that the global data structure is consistent with the local data structure. The second is that the input data classes follow Gaussian distributions. However, in real-world applications these assumptions are not always satisfied. In this paper, we propose an improved LDA framework, the local LDA (LLDA), which can perform well without needing to satisfy the above two assumptions. The LLDA framework can effectively capture the local structure of samples. According to the type of local data structure, the framework incorporates several different forms of linear feature extraction, such as classical LDA and principal component analysis. The proposed framework includes two LLDA algorithms: a vector-based LLDA algorithm and a matrix-based LLDA (MLLDA) algorithm. MLLDA is directly applicable to image recognition, such as face recognition. Our algorithms need to train only a small portion of the whole training set before testing a sample; they are suitable for learning large-scale databases, especially when the input data dimensions are very high, and can achieve high classification accuracy. Extensive experiments show that the proposed algorithms obtain good classification results.
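
    The published LLDA algorithms are not reproduced here; the sketch below only illustrates the underlying intuition of training an ordinary LDA on each test sample's nearest training neighbors. The dataset and the neighborhood size are arbitrary choices, not the authors' settings.

        import numpy as np
        from sklearn.datasets import load_iris
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import train_test_split
        from sklearn.neighbors import NearestNeighbors

        X, y = load_iris(return_X_y=True)
        Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

        nn = NearestNeighbors(n_neighbors=40).fit(Xtr)

        preds = []
        for x in Xte:
            idx = nn.kneighbors([x], return_distance=False)[0]
            labels = ytr[idx]
            if len(set(labels)) == 1:
                preds.append(labels[0])   # neighborhood is pure: trivial case
                continue
            # Train LDA on the local neighborhood only, then classify the sample.
            local = LinearDiscriminantAnalysis().fit(Xtr[idx], labels)
            preds.append(local.predict([x])[0])

        print("local-LDA accuracy:", np.mean(np.array(preds) == yte))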

  3. In the Opponent's Shoes: Increasing the Behavioral Validity of Attackers' Judgments in Counterterrorism Models.

    PubMed

    Sri Bhashyam, Sumitra; Montibeller, Gilberto

    2016-04-01

A key objective for policymakers and analysts dealing with terrorist threats is trying to predict the actions that malicious agents may take. A recent trend in counterterrorism risk analysis is to model the terrorists' judgments, as these will guide their choices of such actions. The standard assumptions in most of these models are that terrorists are fully rational, following all the normative desiderata required for rational choices, such as having a set of constant and ordered preferences, being able to perform a cost-benefit analysis of their alternatives, among many others. However, are such assumptions reasonable from a behavioral perspective? In this article, we analyze the types of assumptions made across various counterterrorism analytical models that represent malicious agents' judgments and discuss their suitability from a descriptive point of view. We then suggest how some of these assumptions could be modified to describe terrorists' preferences more accurately, by drawing knowledge from the fields of behavioral decision research, politics, philosophy of choice, public choice, and conflict management in terrorism. Such insight, we hope, might help make the assumptions of these models more behaviorally valid for counterterrorism risk analysis.

  4. Stereovision Imaging in Smart Mobile Phone Using Add on Prisms

    NASA Astrophysics Data System (ADS)

    Bar-Magen Numhauser, Jonathan; Zalevsky, Zeev

    2014-03-01

In this work we present the use of a prism-based add-on component installed on top of a smartphone to achieve stereovision capabilities under the iPhone mobile operating system. Combining this component with the appropriate application programming interface and mathematical algorithms, the results obtained permit analysis of possible enhancements and new uses for such a system in a variety of areas, including medicine and communications.

  5. Transient thermal analysis of fluid systems

    NASA Technical Reports Server (NTRS)

    Chandler, G. D.; Trust, R. D.

    1977-01-01

A computer program performs transient thermal analysis of any thermal network of 2 to 200 nodes that transports heat by fluid-flow convection. The program can be modified to add conduction along tubes and radiation.
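
    A minimal sketch of what such a program computes, using a hypothetical two-node network heated by a fluid stream and integrated by explicit Euler; all parameter values are illustrative, not from the NASA program.

        import numpy as np

        # Two-node transient thermal network: a fluid stream heats node 1,
        # then carries the remaining heat on to node 2.
        C = np.array([500.0, 800.0])      # thermal capacitance, J/K
        T = np.array([300.0, 300.0])      # initial node temperatures, K
        T_fluid_in = 350.0                # fluid inlet temperature, K
        m_dot_cp = 20.0                   # mass flow * specific heat, W/K
        UA = np.array([15.0, 10.0])       # node-to-fluid conductances, W/K

        dt, t_end = 0.1, 600.0
        for _ in range(int(t_end / dt)):
            T_f = T_fluid_in
            for i in range(2):            # fluid passes node 1, then node 2
                q = UA[i] * (T_f - T[i])  # convective heat into node i, W
                T[i] += q * dt / C[i]
                T_f -= q / m_dot_cp       # fluid cools as it gives up heat

        print("final node temperatures:", T.round(2))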

  6. Sensitivity analysis of the add-on price estimate for the silicon web growth process

    NASA Technical Reports Server (NTRS)

    Mokashi, A. R.

    1981-01-01

The web growth process, a silicon-sheet technology option developed for the flat-plate solar array (FSA) project, was examined. Base-case data for the technical and cost parameters are projected for the technical- and commercial-readiness phases of the FSA project. The process add-on price is analyzed using the base-case data for cost parameters such as equipment, space, direct labor, materials, and utilities, and for production parameters such as growth rate and run length, with a computer program developed specifically to perform the sensitivity analysis with improved price estimation. Sensitivities to silicon price, sheet thickness, and cell efficiency are also discussed.
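
    The JPL program itself is not available here; the following one-at-a-time sensitivity sketch uses a deliberately simplified, hypothetical cost model just to show the mechanics of varying growth rate and run length.

        # One-at-a-time sensitivity of a sheet add-on price to its drivers.
        # The cost model and every number below are hypothetical illustrations.
        def addon_price(growth_rate_cm2_min, run_length_h,
                        equip_cost_per_run=1000.0, labor_rate=25.0,
                        yield_frac=0.9):
            area_per_run = growth_rate_cm2_min * 60 * run_length_h * yield_frac
            cost_per_run = equip_cost_per_run + labor_rate * run_length_h
            return cost_per_run / area_per_run          # $/cm^2

        base = addon_price(25.0, 100.0)
        cases = [("growth rate +10%", dict(growth_rate_cm2_min=27.5,
                                           run_length_h=100.0)),
                 ("run length +10%",  dict(growth_rate_cm2_min=25.0,
                                           run_length_h=110.0))]
        for name, kwargs in cases:
            delta = (addon_price(**kwargs) - base) / base * 100
            print(f"{name}: {delta:+.1f}% change in add-on price")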

  7. Quantification of material state using reflectance FTIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Criner, Amanda K.; Henry, Christine; Imel, Megan; King, Derek

    2018-04-01

A common, and frequently violated, assumption implicit in many data analysis techniques is that the data are of the same quality across observations. The effect of this assumption is discussed and demonstrated with the example of FTIR of CMCs. An alternative analysis, which incorporates the variation in the quality of the data, is presented. A comparison between the analyses is used to demonstrate the difference.
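
    One standard way to incorporate variation in data quality (not necessarily the authors' method) is weighted least squares, sketched below with statsmodels on simulated heteroscedastic data:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)

        # Observations with unequal noise: half the points are measured with
        # three times the standard deviation of the others (hypothetical).
        x = np.linspace(0, 10, 100)
        sigma = np.where(x < 5, 1.0, 3.0)
        y = 2.0 + 0.5 * x + rng.normal(0, sigma)

        X = sm.add_constant(x)
        ols = sm.OLS(y, X).fit()                          # assumes equal quality
        wls = sm.WLS(y, X, weights=1.0 / sigma**2).fit()  # downweights noisy data

        print("OLS slope:", ols.params[1], "+/-", ols.bse[1])
        print("WLS slope:", wls.params[1], "+/-", wls.bse[1])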

  8. Consequences of Assumption Violations Revisited: A Quantitative Review of Alternatives to the One-Way Analysis of Variance "F" Test.

    ERIC Educational Resources Information Center

    Lix, Lisa M.; And Others

    1996-01-01

    Meta-analytic techniques were used to summarize the statistical robustness literature on Type I error properties of alternatives to the one-way analysis of variance "F" test. The James (1951) and Welch (1951) tests performed best under violations of the variance homogeneity assumption, although their use is not always appropriate. (SLD)
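
    For reference, here is a hand-coded sketch of the Welch (1951) heteroscedastic one-way test, using the commonly cited formulas; the group data are hypothetical.

        import numpy as np
        from scipy import stats

        def welch_anova(*groups):
            """Welch's (1951) one-way ANOVA for unequal variances."""
            k = len(groups)
            n = np.array([len(g) for g in groups], float)
            m = np.array([np.mean(g) for g in groups])
            v = np.array([np.var(g, ddof=1) for g in groups])
            w = n / v                                  # precision weights
            mw = np.sum(w * m) / w.sum()               # weighted grand mean
            a = np.sum(w * (m - mw) ** 2) / (k - 1)
            tmp = np.sum((1 - w / w.sum()) ** 2 / (n - 1))
            b = 1 + 2 * (k - 2) / (k ** 2 - 1) * tmp
            f = a / b
            df2 = (k ** 2 - 1) / (3 * tmp)
            return f, k - 1, df2, stats.f.sf(f, k - 1, df2)

        g1 = [23, 25, 21, 30, 28]
        g2 = [31, 33, 29, 40, 42, 38]
        g3 = [22, 20, 19, 18, 25]
        print(welch_anova(g1, g2, g3))   # (F, df1, df2, p-value)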

  9. PIV Measurements of Supersonic Internally-Mixed Dual-Stream Jets

    NASA Technical Reports Server (NTRS)

    Bridges, James E.; Wernet, Mark P.

    2012-01-01

While externally mixed, or separate-flow, nozzle systems are most common in high-bypass-ratio aircraft, they are not as attractive for use in lower-bypass-ratio systems and on aircraft that will fly supersonically. The noise of such propulsion systems is also dominated by jet noise, making the study and noise reduction of these exhaust systems very important, both for military aircraft and for future civilian supersonic aircraft. This paper presents particle image velocimetry measurements of internally mixed nozzles with different area ratios between the core and bypass streams, and of nozzles that are ideally expanded and convergent. Such configurations independently control the geometry of the internal mixing layer and of the external shock structure. They allow exploration of the impact of shocks on the turbulent mixing layers, the impact of bypass ratio on broadband shock noise and mixing noise, and the impact of temperature on the turbulent flow field. At the 2009 AIAA/CEAS Aeroacoustics Conference the authors presented data and analysis from a series of tests that looked at the acoustics of supersonic jets from internally mixed nozzles. In that paper the broadband shock and mixing noise components of the jet noise were independently manipulated by holding Mach number constant while varying bypass ratio and jet temperature. Significant portions of that analysis were predicated on assumptions regarding the flow fields of these jets, both the shock structure and the turbulence. In this paper we add to that analysis by presenting particle image velocimetry measurements of the flow fields of many of those jets. In addition, the turbulent velocity data documented here will be very useful for validation of the computational flow codes that are being developed to design advanced nozzles for future aircraft.

  10. Parameter-expanded data augmentation for Bayesian analysis of capture-recapture models

    USGS Publications Warehouse

    Royle, J. Andrew; Dorazio, Robert M.

    2012-01-01

Data augmentation (DA) is a flexible tool for analyzing closed and open population models of capture-recapture data, especially models which include sources of heterogeneity among individuals. The essential concept underlying DA, as we use the term, is based on adding "observations" to create a dataset composed of a known number of individuals. This new (augmented) dataset, which includes the unknown number of individuals N in the population, is then analyzed using a new model that includes a reformulation of the parameter N in the conventional model of the observed (unaugmented) data. In the context of capture-recapture models, we add a set of "all-zero" encounter histories which are not, in practice, observable. The model of the augmented dataset is a zero-inflated version of either a binomial or a multinomial base model. Thus, our use of DA provides a general approach for analyzing both closed and open population models of all types. In doing so, this approach provides a unified framework for the analysis of a huge range of models that are treated as unrelated "black boxes" and named procedures in the classical literature. As a practical matter, analysis of the augmented dataset by MCMC is greatly simplified compared to other methods that require specialized algorithms. For example, complex capture-recapture models of an augmented dataset can be fitted with popular MCMC software packages (WinBUGS or JAGS) by providing a concise statement of the model's assumptions that usually involves only a few lines of pseudocode. In this paper, we review the basic technical concepts of data augmentation, and we provide examples of analyses of closed-population models (M0, Mh, distance sampling, and spatial capture-recapture models) and open-population models (Jolly-Seber) with individual effects.
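
    A minimal numeric sketch of the DA construction for model M0, simplified to capture counts and fitted by maximum likelihood rather than MCMC; the counts and the augmented size M are hypothetical.

        import numpy as np
        from scipy import stats, optimize
        from scipy.special import expit

        # Observed capture counts (captures out of J occasions) for n individuals.
        J = 5
        y_obs = np.array([1, 2, 1, 3, 1, 1, 2, 4, 1, 2])

        # Data augmentation: add M - n "all-zero" histories so the augmented
        # dataset has a known size M, chosen safely larger than N.
        M = 100
        y_aug = np.concatenate([y_obs, np.zeros(M - len(y_obs), int)])

        def neg_log_lik(params):
            psi, p = expit(params)               # map logits into (0, 1)
            # Zero-inflated binomial: each augmented row is a real individual
            # with probability psi (then Binomial(J, p)) or a structural zero.
            lik = psi * stats.binom.pmf(y_aug, J, p) + (1 - psi) * (y_aug == 0)
            return -np.sum(np.log(lik))

        res = optimize.minimize(neg_log_lik, x0=[0.0, 0.0])
        psi_hat, p_hat = expit(res.x)
        print("psi =", round(psi_hat, 3), " implied E[N] ~", round(M * psi_hat, 1))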

  11. A Longitudinal Study on Human Outdoor Decomposition in Central Texas.

    PubMed

    Suckling, Joanna K; Spradley, M Katherine; Godde, Kanya

    2016-01-01

The development of a methodology that estimates the postmortem interval (PMI) from stages of decomposition is a goal for which forensic practitioners strive. A proposed equation (Megyesi et al. 2005) that utilizes total body score (TBS) and accumulated degree days (ADD) was tested using longitudinal data collected from human remains donated to the Forensic Anthropology Research Facility (FARF) at Texas State University-San Marcos. Exact binomial tests examined the rate at which the equation successfully predicted ADD. Statistically significant differences were found between the ADD estimated by the equation and the observed value for each decomposition stage. The differences remained significant after carnivore-scavenged donations were removed from the analysis. The low success rates of the equation in predicting ADD from TBS, together with the wide standard errors, demonstrate the need to re-evaluate the use of this equation and methodology for PMI estimation in different environments; rather, multivariate methods and equations should be derived that are environmentally specific. © 2015 American Academy of Forensic Sciences.
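
    For orientation, the Megyesi et al. (2005) regression is commonly cited in the form log10(ADD) = 0.002·TBS² + 1.81; the sketch below simply evaluates that point estimate. Treat the coefficients as quoted from the literature, not verified here, and note that the published equation also carries a wide standard error that this sketch omits.

        def predicted_add(tbs):
            # Commonly cited form: log10(ADD) = 0.002 * TBS^2 + 1.81
            return 10 ** (0.002 * tbs ** 2 + 1.81)

        for tbs in (5, 15, 25, 35):
            print(f"TBS {tbs:2d} -> predicted ADD ~ {predicted_add(tbs):8.0f}")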

  12. [NASA/DOD Aerospace Knowledge Diffusion Research Project. Report 3:] Technical communications in aeronautics: Results of an exploratory study. An analysis of profit managers' and nonprofit managers' responses

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Glassman, Myron; Barclay, Rebecca O.; Oliu, Walter E.

    1989-01-01

Data collected from an exploratory study concerned with the technical communications practices of aerospace engineers and scientists were analyzed to test the primary assumption that profit and nonprofit managers in the aerospace community have different technical communications practices. Five assumptions were established for the analysis. Profit and nonprofit managers in the aerospace community were found to have different technical communications practices for only one of the five assumptions tested. It was, therefore, concluded that profit and nonprofit managers in the aerospace community do not have different technical communications practices.

  13. Is mom in charge? Implications of resource provisioning on the evolution of the placenta.

    PubMed

    Banet, Amanda I; Au, Arthur G; Reznick, David N

    2010-11-01

    The Trexler-DeAngelis model shows that placentas are most likely to evolve in environments with consistent, high levels of resource availability. An assumption imperative to the model is that placental species abort embryos in low food conditions. However, a previous experimental test of this assumption using the northern clade of Poeciliopsis showed no evidence for abortion. To distinguish between the alternatives that placental species either sacrifice body condition to maintain reproduction when resources are restricted, or that the previously documented pattern of resource allocation is a function of other life-history correlates of placentation rather than placentation alone, we perform a similar experiment on the southern clade of Poeciliopsis. The southern clade has the opposite relationship between life-history traits and placentation as seen in the northern clade. Our results mirror those from the northern clade, indicating that reproductive mode, rather than life history, dictates the pattern of resource allocation. These results add to the difficulties of explaining placental evolution within the constraints of the Trexler-DeAngelis model by restricting the range of resource conditions in which placental species can outcompete nonplacental species. They also lend support to hypotheses that suggest parent-offspring conflict in utero drives the evolution of the placenta. © 2010 The Author(s). Evolution© 2010 The Society for the Study of Evolution.

  14. How many people can China support?

    PubMed

    Mu, G

    1999-10-01

Dr. Mu Guangzong, associate professor at the People's University of China, disagrees with the assumption that China can only sustain up to 1.6 billion people. That estimate was reached by a group of researchers from the Chinese Academy of Sciences and 70 other institutions in a study conducted in the late 1980s: based on the hypothesis that China can produce at most 830 million tons of grain, the researchers concluded that the country is able to support 1.66 billion people (assuming 500-550 kg/person/year). However, Dr. Mu argues that this assumption seriously underestimates China's capabilities. He says that the country can support up to 2.075 billion people under the same assumption of a maximum of 830 million tons of grain. By his reckoning, in order to live a person needs 213 kg of grain, 25 kg of meat, 10 kg of eggs, 6 kg of vegetables, and 8 kg of vegetable oil and sugar each year, which together are equivalent to 390-400 kg of grain. In addition, both per capita consumption figures and land productivity are variables subject to technological advances, and there are sources of food other than land resources. However, economic development is not just about feeding the population; it is also about providing decent living standards. Thus, control of population growth remains important for the country.
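
    The arithmetic behind the 2.075 billion figure checks out directly from the numbers quoted above:

        # 830 million tons of grain at 400 kg (grain equivalent) per person-year.
        grain_supply_kg = 830e6 * 1000          # tons -> kg
        per_capita_kg = 400                     # upper end of the 390-400 kg range
        print(grain_supply_kg / per_capita_kg)  # -> 2.075e9 people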

  15. Recent advances in statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Heron, K. H.

    1992-01-01

Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a function of the modal formulation than a necessary part of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-to-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  16. Analysis of the value of battery storage with wind and photovoltaic generation to the Sacramento Municipal Utility District

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaininger, H.W.

    1998-08-01

This report describes the results of an analysis to determine the economic and operational value of battery storage for wind and photovoltaic (PV) generation technologies in the Sacramento Municipal Utility District (SMUD) system. The analysis approach consisted of performing a benefit-cost economic assessment using established SMUD financial parameters, system expansion plans, and current system operating procedures. This report presents the results of the analysis. Section 2 describes expected wind and PV plant performance. Section 3 describes expected benefits to SMUD associated with employing battery storage. Section 4 presents preliminary benefit-cost results for battery storage added at the Solano wind plant and the Hedge PV plant. Section 5 presents conclusions and recommendations resulting from this analysis. The results of this analysis should be reviewed subject to the following caveat: the assumptions and data used in developing these results were based on reports available from, and interaction with, appropriate SMUD operating, planning, and design personnel in 1994 and early 1995, and are compatible with financial assumptions and system expansion plans as of that time. Assumptions and SMUD expansion plans have changed since then; in particular, SMUD did not install the additional 45 MW of wind that was planned for 1996. Current SMUD expansion plans and assumptions should be obtained from appropriate SMUD personnel.

  17. Assessing Gaussian Assumption of PMU Measurement Error Using Field Data

    DOE PAGES

    Wang, Shaobu; Zhao, Junbo; Huang, Zhenyu; ...

    2017-10-13

Gaussian PMU measurement error has been assumed in many power system applications, such as state estimation, oscillatory mode monitoring, and voltage stability analysis, to cite a few. This letter proposes a simple yet effective approach to assess this assumption by using the stability property of a probability distribution and the concept of redundant measurement. Extensive results using field PMU data from the WECC system reveal that the Gaussian assumption is questionable.
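
    The letter's exact procedure is not reproduced here, but the idea of using redundant measurements can be sketched as follows: difference two channels observing the same quantity so the signal cancels, then test the residual for normality. The signal, the noise levels, and the deliberately non-Gaussian channel are all simulated assumptions.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)

        # Simulated "redundant" PMU readings of one voltage magnitude: the true
        # signal is common to both channels, and each channel adds its own error.
        # One channel's error is Laplace rather than Gaussian, for illustration.
        t = np.linspace(0, 1, 2000)
        signal = 1.0 + 0.01 * np.sin(2 * np.pi * 0.5 * t)
        ch1 = signal + rng.normal(0, 1e-3, t.size)
        ch2 = signal + rng.laplace(0, 1e-3, t.size)

        # Differencing redundant channels cancels the signal, leaving error only.
        resid = ch1 - ch2
        print("normality p-value:", stats.normaltest(resid).pvalue)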

  18. Current economic and sensitivity analysis for ID slicing of 4 inch and 6 inch diameter silicon ingots for photovoltaic applications

    NASA Technical Reports Server (NTRS)

    Roberts, E. G.; Johnson, C. M.

    1982-01-01

The economics and sensitivities of slicing large-diameter silicon ingots for photovoltaic applications were examined. Current economics and slicing add-on cost sensitivities are calculated using variable parameters for blade life, slicing yield, and slice cutting speed. The results indicate that cutting speed has the biggest impact on slicing add-on cost, followed by slicing yield and then blade life, whose influence diminishes as blade life increases.

  19. Children Writing "Hard Times": Lived Experiences of Poverty and the Class-Privileged Assumptions of a Mandated Curriculum

    ERIC Educational Resources Information Center

    Dutro, Elizabeth

    2009-01-01

    Dutro discusses an analysis of the disconnect between the material realities of the lives of a group of third-grade children living in poverty and the middle-class assumptions of a district-mandated unit within a literacy curriculum. The analysis arose in the context of an ethnographic study of identity and classroom literacy practices; it was…

  20. The impact of management science on political decision making

    NASA Technical Reports Server (NTRS)

    White, M. J.

    1971-01-01

    The possible impact on public policy and organizational decision making of operations research/management science (OR/MS) is discussed. Criticisms based on the assumption that OR/MS will have influence on decision making and criticisms based on the assumption that it will have no influence are described. New directions in the analysis of analysis and in thinking about policy making are also considered.

  1. Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, M.; Penev, M.

    2012-09-01

    NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.
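
    A minimal discounted-cash-flow sketch of the kind of business-case calculation described above; the station cost, revenues, and discount rate are hypothetical, not NREL model inputs.

        # Net present value of a hypothetical fueling station investment.
        capex = 1.5e6                            # year-0 station cost, $
        cash_flows = [-capex] + [250_000] * 10   # net revenue, years 1-10
        discount_rate = 0.08

        npv = sum(cf / (1 + discount_rate) ** yr
                  for yr, cf in enumerate(cash_flows))
        print(f"NPV over 10 years: ${npv:,.0f}")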

  2. Influence of Meibomian Gland Dysfunction and Friction-Related Disease on the Severity of Dry Eye.

    PubMed

    Vu, Chi Hoang Viet; Kawashima, Motoko; Yamada, Masakazu; Suwaki, Kazuhisa; Uchino, Miki; Shigeyasu, Chika; Hiratsuka, Yoshimune; Yokoi, Norihiko; Tsubota, Kazuo

    2018-02-16

    To evaluate the effect of meibomian gland dysfunction (MGD) and friction-related disease (FRD) on the severity of dry eye disease (DED). Cross-sectional observational study. This study enrolled 449 patients with DED (63 men and 386 women; mean age, 62.6±15.7 years [range, 21-90 years]) for analysis. Subjective symptoms, the ocular surface, tear function, and the presence of MGD and FRD (superior limbic keratoconjunctivitis, conjunctivochalasis, and lid wiper epitheliopathy) were investigated. Schirmer value, tear film breakup time (TBUT), and keratoconjunctival score. We classified the participants into aqueous-deficient dry eye (ADDE; n = 231 [51.4%]) and short TBUT dry eye subtype (TBUT-DE; n = 109 [24.3%]) subgroups. The TBUT was shorter in patients with MGD than in those without MGD, whereas other ocular signs showed no difference (TBUT: MGD present, 1.97±1.02 seconds; MGD absent, 2.94±1.63 seconds [P < 0.001]; ADDE/MGD present, 1.94±1.08 seconds; ADDE/MGD absent, 2.77±1.61 seconds [P < 0.001]; short TBUT-DE/MGD present, 2.07±0.97 seconds; short TBUT-DE/MGD absent, 2.94±1.23 seconds [P = 0.01]). The ADDE patients with FRD showed a worse TBUT than ADDE patients without FRD (TBUT: ADDE/FRD present, 2.08±1.39 seconds; ADDE/FRD absent, 2.92±1.54 seconds; P < 0.001). This study showed associations between MGD, FRD, or both and ocular signs in DED. In the presence of MGD, FRD, or both, TBUT was significantly shortened regardless of the dry eye status or subtype. Copyright © 2018 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  3. Levels of Simplification. The Use of Assumptions, Restrictions, and Constraints in Engineering Analysis.

    ERIC Educational Resources Information Center

    Whitaker, Stephen

    1988-01-01

    Describes the use of assumptions, restrictions, and constraints in solving difficult analytical problems in engineering. Uses the Navier-Stokes equations as examples to demonstrate use, derivations, advantages, and disadvantages of the technique. (RT)

  4. Theta and Alpha Oscillations in Attentional Interaction during Distracted Driving

    PubMed Central

    Wang, Yu-Kai; Jung, Tzyy-Ping; Lin, Chin-Teng

    2018-01-01

Performing multiple tasks simultaneously usually degrades behavioral performance relative to executing a single task, and processing multiple tasks simultaneously often involves greater cognitive demands. Two visual tasks, a lane-keeping task and mental calculation, were used to assess brain dynamics through 32-channel electroencephalogram (EEG) recordings from 14 participants. A 400-ms stimulus onset asynchrony (SOA) factor was used to induce distinct levels of attentional requirement. In the dual-task conditions, the deteriorated behavior reflected divided attention and the overlapping brain resources used. Frontal, parietal, and occipital components were decomposed by an independent component analysis (ICA) algorithm. The event- and response-related theta and alpha oscillations in the selected brain regions were investigated first. Increased theta oscillation in the frontal component and decreased alpha oscillations in the parietal and occipital components reflect the cognitive demands and attentional requirements of executing the designed tasks. Furthermore, time-varying interactive over-additive (O-Add), additive (Add), and under-additive (U-Add) activations were explored and summarized by comparing the sum of the spectral perturbations elicited in the two single-task conditions with the spectral perturbations in the dual task. Add and U-Add activations were observed during dual-task execution, and U-Add theta and alpha activations dominated the posterior region. Our results show that both deteriorated behavior and interactive brain activations should be considered comprehensively to evaluate workload or attentional interaction precisely. PMID:29479310

  5. A near-optimum procedure for selecting stations in a streamgaging network

    USGS Publications Warehouse

    Lanfear, Kenneth J.

    2005-01-01

Two questions are fundamental to Federal government goals for the network of streamgages operated by the U.S. Geological Survey: (1) how well does the present network of streamgaging stations meet defined Federal goals, and (2) what is the optimum set of stations to add or reactivate to support the remaining goals? The solution involves an incremental-stepping procedure based on Basic Feasible Incremental Solutions (BFISs), where each BFIS satisfies at least one Federal streamgaging goal. A set of minimum Federal goals for streamgaging is defined to include water measurements for legal compacts and decrees, flooding, water budgets, regionalization of streamflow characteristics, and water quality. Fully satisfying all these goals under the assumptions outlined in this paper would require adding 887 new streamgaging stations to the U.S. Geological Survey network and reactivating an additional 857 stations that are currently inactive.
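
    The BFIS incremental-stepping procedure is not spelled out in this abstract; the sketch below shows only the general greedy idea of repeatedly adding the candidate station that satisfies the most unmet goals. The station names and goal sets are invented.

        # Greedy sketch of incremental station selection. Data hypothetical.
        goals = {"compact_A", "flood_B", "budget_C", "regional_D", "wq_E"}
        candidates = {
            "gage1": {"compact_A", "flood_B"},
            "gage2": {"flood_B", "budget_C", "wq_E"},
            "gage3": {"regional_D"},
            "gage4": {"budget_C", "regional_D", "wq_E"},
        }

        selected, unmet = [], set(goals)
        while unmet:
            # Pick the candidate covering the most still-unmet goals.
            best = max(candidates, key=lambda g: len(candidates[g] & unmet))
            if not candidates[best] & unmet:
                break                     # remaining goals cannot be met
            selected.append(best)
            unmet -= candidates[best]

        print("selected:", selected, "unmet:", unmet)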

  6. Fleet retrofit report

    NASA Technical Reports Server (NTRS)

    1973-01-01

Flight tests of an avionics system that aids the pilot in making two-segment approaches for noise abatement are evaluated. The implications of equipping United's fleet of Boeing 727-200 aircraft with two-segment avionics for use down to Category 2 weather operating minima are discussed, along with the experience of incorporating two-segment approach avionics systems on two different aircraft. The cost of installing dual two-segment approach systems is estimated to be $37,015 per aircraft, including parts, labor, and spares. This is based on the assumption that incremental out-of-service and training costs could be minimized by incorporating the system at the airframe overhaul cycle and including the training in regular recurrent training. Accelerating the modification schedule could add up to 50 percent to the modification costs. Recurring maintenance costs for the installation are estimated to be of about the same magnitude as the potential recurring financial benefits from fuel savings.

  7. Parallel Proximity Detection for Computer Simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1997-01-01

The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.

  8. Parallel Proximity Detection for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1998-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.
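
    The two patent records above describe the same grid-based scheme; here is a much-simplified sketch of the check-in/check-out idea. The cell size, sensor data, and function names are invented, and the fuzzy margins and lookahead logic are omitted.

        from collections import defaultdict

        CELL = 10.0   # grid spacing; a fuzzy margin could widen each extent

        def cells_covering(x, y, radius):
            """Grid cells that a sensor's circular coverage touches."""
            x0, x1 = int((x - radius) // CELL), int((x + radius) // CELL)
            y0, y1 = int((y - radius) // CELL), int((y + radius) // CELL)
            return {(i, j) for i in range(x0, x1 + 1) for j in range(y0, y1 + 1)}

        grid = defaultdict(set)            # cell -> sensors checked into it

        # Sensors check their coverage into the grid.
        sensors = {"s1": (12.0, 7.0, 15.0), "s2": (55.0, 40.0, 8.0)}
        for name, (x, y, r) in sensors.items():
            for cell in cells_covering(x, y, r):
                grid[cell].add(name)

        # A mover checks into its cell and learns only the local sensor set,
        # avoiding exact crossing computations against every sensor.
        mover_x, mover_y = 14.0, 3.0
        cell = (int(mover_x // CELL), int(mover_y // CELL))
        print("sensors that may detect mover:", grid[cell])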

  9. Childhood leukemia and cancers near German nuclear reactors: significance, context, and ramifications of recent studies.

    PubMed

    Nussbaum, Rudi H

    2009-01-01

    A government-sponsored study of childhood cancer in the proximity of German nuclear power plants (German acronym KiKK) found that children < 5 years living < 5 km from plant exhaust stacks had twice the risk for contracting leukemia as those residing > 5 km. The researchers concluded that since "this result was not to be expected under current radiation-epidemiological knowledge" and confounders could not be identified, the observed association of leukemia incidence with residential proximity to nuclear plants "remains unexplained." This unjustified conclusion illustrates the dissonance between evidence and assumptions. There exist serious flaws and gaps in the knowledge on which accepted models for population exposure and radiation risk are based. Studies with results contradictory to those of KiKK lack statistical power to invalidate its findings. The KiKK study's ramifications add to the urgency for a public policy debate regarding the health impact of nuclear power generation.

  10. Forensic-paternity effectiveness and genetics population analysis of six non-CODIS mini-STR loci (D1S1656, D2S441, D6S1043, D10S1248, D12S391, D22S1045) and SE33 in Mestizo and Amerindian populations from Mexico.

    PubMed

    Burguete-Argueta, Nelsi; Martínez De la Cruz, Braulio; Camacho-Mejorado, Rafael; Santana, Carla; Noris, Gino; López-Bayghen, Esther; Arellano-Galindo, José; Majluf-Cruz, Abraham; Antonio Meraz-Ríos, Marco; Gómez, Rocío

    2016-11-01

STRs are powerful tools used intensively in forensic and kinship studies. In order to assess the effectiveness of non-CODIS genetic markers in forensic and paternity tests, the genetic composition of six mini short tandem repeats (mini-STRs: D1S1656, D2S441, D6S1043, D10S1248, D12S391, D22S1045) and the microsatellite SE33 was studied in Mestizo and Amerindian populations from Mexico. Using multiplex polymerase chain reactions and capillary electrophoresis, this study genotyped all loci from 870 chromosomes and evaluated the statistical genetic parameters. All the mini-STRs studied were in agreement with Hardy-Weinberg (HW) and linkage equilibrium; however, an important HW departure for SE33 was found in the Mestizo population (p ≤ 0.0001). Regarding paternity and forensic statistical parameters, high values of combined power of discrimination and mean power of exclusion were found using these seven markers. A principal co-ordinate analysis based on the allele frequencies of three mini-STRs showed the complex genetic architecture of the Mestizo population. The results indicate that this set of loci is suitable for genetically identifying individuals in the Mexican population, supporting its effectiveness in human-identification casework. In addition, these findings add new statistical values and emphasise the importance of using non-CODIS markers in complex populations in order to avoid erroneous assumptions.
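
    As a simplified illustration of the Hardy-Weinberg check reported above, here is a chi-square sketch for a single biallelic-style marker; real STR loci are multi-allelic, where exact tests are usually preferred, and the genotype counts below are hypothetical.

        import numpy as np
        from scipy import stats

        # Hypothetical genotype counts at a biallelic marker: AA, Aa, aa.
        obs = np.array([298, 489, 213])
        n = obs.sum()
        p = (2 * obs[0] + obs[1]) / (2 * n)          # allele frequency of A

        expected = n * np.array([p**2, 2 * p * (1 - p), (1 - p)**2])
        chi2 = np.sum((obs - expected) ** 2 / expected)
        pval = stats.chi2.sf(chi2, df=1)             # df = genotypes - alleles
        print(f"chi2 = {chi2:.2f}, p = {pval:.3f}")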

  11. More on the alleged 1970 geomagnetic jerk

    USGS Publications Warehouse

    Alldredge, L.R.

    1985-01-01

French and United Kingdom workers have published reports describing a sudden change in the secular acceleration, called an impulse or a jerk, which took place around 1970. They claim that this change took place over a period of a year or two and that the sources of the alleged jerk are internal. An earlier paper by this author questioned their method of analysis, pointing out that piecemeal fitting of parabolas to the data will always create a discontinuity in the secular acceleration where the parabolas join, and that the place where the parabolas join is an a priori assumption, not a result of the analysis. This paper gives a very brief summary of that first paper and then adds further reasons for questioning the allegation that there was a worldwide sudden jerk of internal origin in the magnetic field around 1970. These new reasons are based largely on new field models that give cubic approximations of the field right through the 1970 timeframe and therefore have no discontinuities in the second derivative (jerk) around 1970. Some recent Japanese work shows several sudden changes in the secular variation pattern that cover limited areas and do not seem to be closely related to each other or to the irregularity noted in the European area near 1970. The secular variation picture that seems to be emerging is one with many local or limited-regional secular variation changes that appear to be almost unrelated to each other in time or space. A worldwide spherical harmonic model including coefficients up to degree 13 could never properly depict such a situation. © 1985.

  12. Old and New Ideas for Data Screening and Assumption Testing for Exploratory and Confirmatory Factor Analysis

    PubMed Central

    Flora, David B.; LaBrish, Cathy; Chalmers, R. Philip

    2011-01-01

    We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables. PMID:22403561
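
    One of the abstract's central claims, that product-moment correlations understate relationships among dichotomized items, can be demonstrated with a small simulation (assumed data; polychoric estimation itself is not shown):

    ```python
    # Sketch: dichotomizing continuous responses attenuates the Pearson
    # correlation relative to the latent correlation that a polychoric
    # estimate targets.
    import numpy as np

    rng = np.random.default_rng(0)
    rho = 0.6                                   # latent correlation (assumed)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    x = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)

    items = (x > 0.8).astype(float)             # harsh threshold, as for rarely endorsed items
    r_continuous = np.corrcoef(x.T)[0, 1]
    r_dichotomized = np.corrcoef(items.T)[0, 1]
    print(f"continuous r = {r_continuous:.3f}, after dichotomizing r = {r_dichotomized:.3f}")
    # Linear factor analysis of the dichotomized items would "see" the smaller value.
    ```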

  13. 40 CFR 264.99 - Compliance monitoring program.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the availability of laboratory facilities to perform the analysis of ground-water samples. (e) The... Administrator, and repeat the analysis. If the second analysis confirms the presence of new constituents, the... Administrator within seven days after the completion of the second analysis and add them to the monitoring list...

  14. Causality links among renewable energy consumption, CO2 emissions, and economic growth in Africa: evidence from a panel ARDL-PMG approach.

    PubMed

    Attiaoui, Imed; Toumi, Hassen; Ammouri, Bilel; Gargouri, Ilhem

    2017-05-01

    This research examines the causality links (throughout, causality refers to Granger causality) among renewable energy consumption (REC), CO2 emissions (CE), non-renewable energy consumption (NREC), and economic growth (GDP) using an autoregressive distributed lag model based on the pooled mean group estimation (ARDL-PMG) and applying Granger causality tests to a panel of 22 African countries for the period between 1990 and 2011. There is unidirectional and irreversible short-run causality from CE to GDP. The causal direction between CE and REC is unobservable over the short term. Moreover, we find unidirectional, short-run causality from REC to GDP. When testing per pair of variables, there are short-run bidirectional causalities among REC, CE, and GDP. However, if we add CE to the variables REC and NREC, the causality to GDP is observable, and causality from the pair REC and NREC to economic growth is neutral. Likewise, if we add NREC to the variables GDP and REC, there is causality. There are bidirectional long-run causalities among REC, CE, and GDP, which supports the feedback assumption. Causality from GDP to REC is not strong for the panel. If we test per pair of variables, the strong causality from GDP and CE to REC is neutral. The long-run PMG estimates show that NREC and gross domestic product increase CE, whereas REC decreases CE.
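
    For readers unfamiliar with the mechanics, a minimal pairwise Granger-causality test looks like the sketch below, on simulated series; the paper's panel ARDL-PMG estimation is considerably more involved. The `grangercausalitytests` call is from statsmodels and tests whether the second column Granger-causes the first.

    ```python
    # Minimal pairwise Granger-causality sketch (assumed data, not the
    # study's panel).
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(1)
    T = 200
    ce = rng.normal(size=T).cumsum()            # stand-in for CO2 emissions
    gdp = np.zeros(T)
    for t in range(1, T):                       # GDP responds to lagged CE
        gdp[t] = 0.5 * gdp[t - 1] + 0.3 * ce[t - 1] + rng.normal()

    # Difference both series toward stationarity; column 2 is the candidate cause.
    data = np.column_stack([np.diff(gdp), np.diff(ce)])
    results = grangercausalitytests(data, maxlag=2, verbose=False)
    for lag, (tests, _) in results.items():
        print(f"lag {lag}: p = {tests['ssr_ftest'][1]:.4f}")
    ```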

  15. Documentation of Helicopter Aeroelastic Stability Analysis Computer Program (HASTA)

    DTIC Science & Technology

    1977-12-01

    of the blade phasing assumption for which all blades of the rotor are identical and equally spaced azimuthally allows the size of the T. matrices...to be significantly reduced by the removal of the submatrices associated with blades other than the first blade. With the use of this assumption ...different program representational options such as the type of rotor system, the type of blades, and the use of the blade phasing assumption , the

  16. Interpreting findings from Mendelian randomization using the MR-Egger method.

    PubMed

    Burgess, Stephen; Thompson, Simon G

    2017-05-01

    Mendelian randomization-Egger (MR-Egger) is an analysis method for Mendelian randomization using summarized genetic data. MR-Egger consists of three parts: (1) a test for directional pleiotropy, (2) a test for a causal effect, and (3) an estimate of the causal effect. While conventional analysis methods for Mendelian randomization assume that all genetic variants satisfy the instrumental variable assumptions, the MR-Egger method is able to assess whether genetic variants have pleiotropic effects on the outcome that differ on average from zero (directional pleiotropy), as well as to provide a consistent estimate of the causal effect, under a weaker assumption, the InSIDE (INstrument Strength Independent of Direct Effect) assumption. In this paper, we provide a critical assessment of the MR-Egger method with regard to its implementation and interpretation. While the MR-Egger method is a worthwhile sensitivity analysis for detecting violations of the instrumental variable assumptions, there are several reasons why causal estimates from the MR-Egger method may be biased and have inflated Type 1 error rates in practice, including violations of the InSIDE assumption and the influence of outlying variants. The issues raised in this paper have potentially serious consequences for causal inferences from the MR-Egger approach. We give examples of scenarios in which the estimates from conventional Mendelian randomization methods and MR-Egger differ, and discuss how to interpret findings in such cases.
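
    The MR-Egger regression itself is compact: a weighted linear regression of variant-outcome associations on variant-exposure associations in which, unlike conventional inverse-variance weighting, the intercept is left free. The sketch below uses simulated summary statistics; all parameter values are assumptions for illustration.

    ```python
    # Sketch of the MR-Egger regression on simulated summary statistics.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    J = 30
    beta_x = rng.uniform(0.05, 0.3, J)      # variant-exposure associations
    alpha = 0.02                            # directional pleiotropy (assumed)
    theta = 0.4                             # true causal effect (assumed)
    se_y = np.full(J, 0.02)
    beta_y = alpha + theta * beta_x + rng.normal(0, se_y)

    X = sm.add_constant(beta_x)             # intercept NOT forced through zero
    fit = sm.WLS(beta_y, X, weights=1.0 / se_y**2).fit()
    intercept, slope = fit.params
    print(f"intercept (pleiotropy test) = {intercept:.3f}, slope (causal est.) = {slope:.3f}")
    # A nonzero intercept flags directional pleiotropy; the slope is consistent
    # for the causal effect only under the InSIDE assumption.
    ```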

  17. [NASA/DOD Aerospace Knowledge Diffusion Research Project. Report 2:] Technical communications in aeronautics: Results of an exploratory study. An analysis of managers' and nonmanagers' responses

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Glassman, Myron; Barclay, Rebecca O.; Oliu, Walter E.

    1989-01-01

    Data collected from an exploratory study of the technical communications practices of aerospace engineers and scientists were analyzed to test the primary assumption that aerospace managers and nonmanagers have different technical communications practices. Five assumptions were established for the analysis. Aerospace managers and nonmanagers were found to have different technical communications practices for three of the five assumptions tested. Although the two groups were found to have different practices, the evidence was neither conclusive nor compelling that these differences could be attributed to the duties performed by aerospace managers and nonmanagers.

  18. Some observations on the use of discriminant analysis in ecology

    USGS Publications Warehouse

    Williams, B.K.

    1983-01-01

    The application of discriminant analysis in ecological investigations is discussed. The appropriate statistical assumptions for discriminant analysis are illustrated, and both classification and group separation approaches are outlined. Three assumptions that are crucial in ecological studies are discussed at length, and the consequences of their violation are developed. These assumptions are: equality of dispersions, identifiability of prior probabilities, and precise and accurate estimation of means and dispersions. The use of discriminant functions for purposes of interpreting ecological relationships is also discussed. It is suggested that the common practice of imputing ecological 'meaning' to the signs and magnitudes of coefficients be replaced by an assessment of 'structure coefficients.' Finally, the potential and limitations of representation of data in canonical space are considered, and some cautionary points are made concerning ecological interpretation of patterns in canonical space.
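
    The suggested shift from coefficient signs to structure coefficients is straightforward to compute: structure coefficients are the correlations between each original variable and the discriminant scores. A minimal sketch on simulated two-group data (sklearn's LDA is an assumed stand-in for whatever software an analyst uses):

    ```python
    # Sketch: raw discriminant coefficients vs. structure coefficients.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(3)
    X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(1, 1, (50, 4))])  # two groups
    y = np.repeat([0, 1], 50)

    lda = LinearDiscriminantAnalysis(n_components=1).fit(X, y)
    scores = lda.transform(X).ravel()

    # Structure coefficients: correlation of each variable with the scores.
    structure = [np.corrcoef(X[:, j], scores)[0, 1] for j in range(X.shape[1])]
    print("raw coefficients:      ", np.round(lda.scalings_.ravel(), 3))
    print("structure coefficients:", np.round(structure, 3))
    ```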

  19. Power, Revisited

    ERIC Educational Resources Information Center

    Roscigno, Vincent J.

    2011-01-01

    Power is a core theoretical construct in the field with amazing utility across substantive areas, levels of analysis and methodologies. Yet, its use along with associated assumptions--assumptions surrounding constraint vs. action and specifically organizational structure and rationality--remain problematic. In this article, and following an…

  20. Measuring Belief in Conspiracy Theories: The Generic Conspiracist Beliefs Scale

    PubMed Central

    Brotherton, Robert; French, Christopher C.; Pickering, Alan D.

    2013-01-01

    The psychology of conspiracy theory beliefs is not yet well understood, although research indicates that there are stable individual differences in conspiracist ideation – individuals’ general tendency to engage with conspiracy theories. Researchers have created several short self-report measures of conspiracist ideation. These measures largely consist of items referring to an assortment of prominent conspiracy theories regarding specific real-world events. However, these instruments have not been psychometrically validated, and this assessment approach suffers from practical and theoretical limitations. Therefore, we present the Generic Conspiracist Beliefs (GCB) scale: a novel measure of individual differences in generic conspiracist ideation. The scale was developed and validated across four studies. In Study 1, exploratory factor analysis of a novel 75-item measure of non-event-based conspiracist beliefs identified five conspiracist facets. The 15-item GCB scale was developed to sample from each of these themes. Studies 2, 3, and 4 examined the structure and validity of the GCB, demonstrating internal reliability, content, criterion-related, convergent and discriminant validity, and good test-retest reliability. In sum, this research indicates that the GCB is a psychometrically sound and practically useful measure of conspiracist ideation, and the findings add to our theoretical understanding of conspiracist ideation as a monological belief system underpinned by a relatively small number of generic assumptions about the typicality of conspiratorial activity in the world. PMID:23734136

  1. Sample Collection, Analysis, and Respirator Use With Isocyanate Paints

    DTIC Science & Technology

    1990-02-01

    [Only a garbled fragment of the method's equipment and reagent list is preserved in this record: acetonitrile, deionized and distilled water, pentane, sodium acetate, 20-mL and 4-mL glass vials with screw caps, a 500-mL filtration flask, a funnel, and a six-port mini-vap evaporator; the fragment also notes that extracts in methanol solution may be stored at -21 °C in the dark for at least four weeks.]

  2. Sunspot activity and influenza pandemics: a statistical assessment of the purported association.

    PubMed

    Towers, S

    2017-10-01

    Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses, and attempts to recreate the three most recent statistical analyses by Ertel (1994), Tapping et al. (2001), and Yeung (2006), which all have purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, in each analysis arbitrary selections or assumptions were made, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods, rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. However, while the focus in this particular analysis was on the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls; inattention to analysis reproducibility and robustness assessment are common problems in the sciences that are unfortunately not noted often enough in review.
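
    The un-binned, robustness-first spirit of the paper can be illustrated with a permutation test: compare the mean sunspot number at the event years against a null distribution built by re-drawing event years at random, with no binning choices to tune. Both series below are simulated stand-ins, not the historical records.

    ```python
    # Sketch: un-binned permutation test of an event-years/sunspot association.
    import numpy as np

    rng = np.random.default_rng(4)
    years = np.arange(1700, 2017)
    sunspots = 80 + 60 * np.sin(2 * np.pi * (years - 1700) / 11.0)  # toy 11-yr cycle
    event_years = rng.choice(years, size=10, replace=False)         # toy pandemic years

    obs = sunspots[np.isin(years, event_years)].mean()
    null = np.array([
        sunspots[rng.choice(len(years), size=10, replace=False)].mean()
        for _ in range(10_000)
    ])
    p = (np.abs(null - null.mean()) >= abs(obs - null.mean())).mean()
    print(f"observed mean = {obs:.1f}, two-sided permutation p = {p:.3f}")
    ```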

  3. Plant ecosystem responses to rising atmospheric CO2: applying a "two-timing" approach to assess alternative hypotheses for mechanisms of nutrient limitation

    NASA Astrophysics Data System (ADS)

    Medlyn, B.; Jiang, M.; Zaehle, S.

    2017-12-01

    There is now ample experimental evidence that the response of terrestrial vegetation to rising atmospheric CO2 concentration is modified by soil nutrient availability. How to represent nutrient cycling processes is thus a key consideration for vegetation models. We have previously used model intercomparison to demonstrate that models incorporating different assumptions predict very different responses at Free-Air CO2 Enrichment experiments. Careful examination of model outputs has provided some insight into the reasons for the different model outcomes, but it is difficult to attribute outcomes to specific assumptions. Here we investigate the impact of individual assumptions in a generic plant carbon-nutrient cycling model. The G'DAY (Generic Decomposition And Yield) model is modified to incorporate alternative hypotheses for nutrient cycling. We analyse the impact of these assumptions in the model using a simple analytical approach known as "two-timing". This analysis identifies the quasi-equilibrium behaviour of the model at the time scales of the component pools. The analysis provides a useful mathematical framework for probing model behaviour and identifying the most critical assumptions for experimental study.

  4. Assumptions made when preparing drug exposure data for analysis have an impact on results: An unreported step in pharmacoepidemiology studies.

    PubMed

    Pye, Stephen R; Sheppard, Thérèse; Joseph, Rebecca M; Lunt, Mark; Girard, Nadyne; Haas, Jennifer S; Bates, David W; Buckeridge, David L; van Staa, Tjeerd P; Tamblyn, Robyn; Dixon, William G

    2018-04-17

    Real-world data for observational research commonly require formatting and cleaning prior to analysis. Data preparation steps are rarely reported adequately and are likely to vary between research groups. Variation in methodology could potentially affect study outcomes. This study aimed to develop a framework to define and document drug data preparation and to examine the impact of different assumptions on results. An algorithm for processing prescription data was developed and tested using data from the Clinical Practice Research Datalink (CPRD). The impact of varying assumptions was examined by estimating the association between 2 exemplar medications (oral hypoglycaemic drugs and glucocorticoids) and cardiovascular events after preparing multiple datasets derived from the same source prescription data. Each dataset was analysed using Cox proportional hazards modelling. The algorithm included 10 decision nodes and 54 possible unique assumptions. Over 11 000 possible pathways through the algorithm were identified. In both exemplar studies, similar hazard ratios and standard errors were found for the majority of pathways; however, certain assumptions had a greater influence on results. For example, in the hypoglycaemic analysis, choosing a different variable to define prescription end date altered the hazard ratios (95% confidence intervals) from 1.77 (1.56-2.00) to 2.83 (1.59-5.04). The framework offers a transparent and efficient way to perform and report drug data preparation steps. Assumptions made during data preparation can impact the results of analyses. Improving transparency regarding drug data preparation would increase the repeatability, reproducibility, and comparability of published results. © 2018 The Authors. Pharmacoepidemiology & Drug Safety Published by John Wiley & Sons Ltd.
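
    A minimal sketch of the paper's point, on assumed (simulated) data rather than CPRD records: the same outcomes analysed twice, with the exposure coded under two different prescription-duration assumptions, each feeding a Cox model (here via the lifelines package).

    ```python
    # Sketch: one decision node (how long a prescription lasts) changes the
    # exposure coding and hence the fitted hazard ratio. Data are simulated.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(5)
    n = 500
    qty = rng.integers(28, 113, n)                 # tablets dispensed
    daily_dose = rng.choice([1, 2, 4], n)

    # One 'true' exposure duration drives the simulated outcomes...
    true_days = qty / daily_dose
    t_event = rng.exponential(365 / (1 + 0.002 * true_days))
    df_base = pd.DataFrame({
        "duration": np.minimum(t_event, 730),
        "event": (t_event <= 730).astype(int),
    })

    # ...but the analyst's coding of duration depends on the decision node.
    rules = {"quantity/dose": true_days, "fixed 30 days": np.full(n, 30.0)}
    for name, days in rules.items():
        df = df_base.assign(long_exposure=(days > 45).astype(float))
        cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
        print(f"{name:>14}: HR = {np.exp(cph.params_['long_exposure']):.2f}")
    ```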

  5. The influence of computational assumptions on analysing abdominal aortic aneurysm haemodynamics.

    PubMed

    Ene, Florentina; Delassus, Patrick; Morris, Liam

    2014-08-01

    The variation in computational assumptions for analysing abdominal aortic aneurysm haemodynamics can influence the desired output results and computational cost. Such assumptions for abdominal aortic aneurysm modelling include static/transient pressures, steady/transient flows and rigid/compliant walls. Six computational methods and these various assumptions were simulated and compared within a realistic abdominal aortic aneurysm model with and without intraluminal thrombus. A full transient fluid-structure interaction was required to analyse the flow patterns within the compliant abdominal aortic aneurysms models. Rigid wall computational fluid dynamics overestimates the velocity magnitude by as much as 40%-65% and the wall shear stress by 30%-50%. These differences were attributed to the deforming walls which reduced the outlet volumetric flow rate for the transient fluid-structure interaction during the majority of the systolic phase. Static finite element analysis accurately approximates the deformations and von Mises stresses when compared with transient fluid-structure interaction. Simplifying the modelling complexity reduces the computational cost significantly. In conclusion, the deformation and von Mises stress can be approximately found by static finite element analysis, while for compliant models a full transient fluid-structure interaction analysis is required for acquiring the fluid flow phenomenon. © IMechE 2014.

  6. Flood return level analysis of Peaks over Threshold series under changing climate

    NASA Astrophysics Data System (ADS)

    Li, L.; Xiong, L.; Hu, T.; Xu, C. Y.; Guo, S.

    2016-12-01

    Obtaining insights into future flood estimation is of great significance for water planning and management. Traditional flood return level analysis under the stationarity assumption has been challenged by changing environments. A method that takes the nonstationary context into consideration has been extended to derive flood return levels for Peaks over Threshold (POT) series. For POT series, a Poisson distribution is normally assumed to describe the arrival rate of exceedance events, but this distributional assumption has at times been reported as invalid. The Negative Binomial (NB) distribution is therefore proposed as an alternative to the Poisson assumption. Flood return levels were extrapolated in a nonstationary context for the POT series of the Weihe basin, China under future climate scenarios. The results show that the flood return levels estimated under nonstationarity can differ depending on whether a Poisson or an NB distribution is assumed. The difference is found to be related to the threshold value of the POT series. The study indicates the importance of distribution selection in flood return level analysis under nonstationarity and provides a reference on the impact of climate change on flood estimation in the Weihe basin in the future.
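
    For reference, a stationary POT return-level calculation looks like the sketch below (toy data): a generalized Pareto fit to threshold excesses combined with an annual exceedance rate. The Poisson-versus-NB question concerns the yearly counts of exceedances; an index of dispersion well above 1 is the usual warning sign against the Poisson assumption.

    ```python
    # Sketch: stationary POT return level from a GPD fit plus an arrival rate.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    u = 50.0                                        # threshold (assumed)
    excesses = stats.genpareto.rvs(c=0.1, scale=12.0, size=300, random_state=rng)
    yearly_counts = rng.poisson(6.0, size=50)       # exceedances per year (toy)

    c, loc, scale = stats.genpareto.fit(excesses, floc=0.0)
    lam = yearly_counts.mean()                      # mean annual exceedance rate
    T = 100.0                                       # return period, years
    x_T = u + (scale / c) * ((lam * T) ** c - 1.0)  # standard POT return level
    print(f"{T:.0f}-yr return level: {x_T:.1f}")

    dispersion = yearly_counts.var(ddof=1) / yearly_counts.mean()
    print(f"count dispersion (var/mean) = {dispersion:.2f}  (>1 suggests NB over Poisson)")
    ```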

  7. Comparison of 2D Finite Element Modeling Assumptions with Results From 3D Analysis for Composite Skin-Stiffener Debonding

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; Paris, Isbelle L.; OBrien, T. Kevin; Minguet, Pierre J.

    2004-01-01

    The influence of two-dimensional finite element modeling assumptions on the debonding prediction for skin-stiffener specimens was investigated. Geometrically nonlinear finite element analyses using two-dimensional plane-stress and plane-strain elements as well as three different generalized plane strain type approaches were performed. The computed skin and flange strains, transverse tensile stresses and energy release rates were compared to results obtained from three-dimensional simulations. The study showed that for strains and energy release rate computations the generalized plane strain assumptions yielded results closest to the full three-dimensional analysis. For computed transverse tensile stresses the plane stress assumption gave the best agreement. Based on this study it is recommended that results from plane stress and plane strain models be used as upper and lower bounds. The results from generalized plane strain models fall between the results obtained from plane stress and plane strain models. Two-dimensional models may also be used to qualitatively evaluate the stress distribution in a ply and the variation of energy release rates and mixed mode ratios with delamination length. For more accurate predictions, however, a three-dimensional analysis is required.

  8. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
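
    A minimal sketch of fuzzy likelihood propagation through fault-tree gates, with triangular fuzzy numbers (low, mode, high) in place of crisp probabilities. Independence is assumed at each gate; the article's dependency-coefficient treatment is not reproduced here, and all numbers are illustrative.

    ```python
    # Sketch: triangular fuzzy probabilities through AND/OR gates. The gate
    # formulas are monotone in each input, so endpoints propagate directly.
    from math import prod

    pump_fails  = (0.010, 0.020, 0.040)   # (low, mode, high), assumed values
    valve_fails = (0.005, 0.010, 0.030)
    alarm_fails = (0.020, 0.050, 0.100)

    def and_gate(*events):
        # P(AND) = product of event probabilities (independence assumed).
        return tuple(prod(e[k] for e in events) for k in range(3))

    def or_gate(*events):
        # P(OR) = 1 - product of complements (independence assumed).
        return tuple(1 - prod(1 - e[k] for e in events) for k in range(3))

    # Top event: (pump OR valve fails) AND alarm fails.
    top = and_gate(or_gate(pump_fails, valve_fails), alarm_fails)
    print("top event (low, mode, high):", tuple(round(p, 5) for p in top))
    ```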

  9. The contributions of interpersonal trauma exposure and world assumptions to predicting dissociation in undergraduates.

    PubMed

    Lilly, Michelle M

    2011-01-01

    This study examines the relationship between world assumptions and trauma history in predicting symptoms of dissociation. It was proposed that cognitions related to the safety and benevolence of the world, as well as self-worth, would be related to the presence of dissociative symptoms, the latter of which were theorized to defend against threats to one's sense of safety, meaningfulness, and self-worth. Undergraduates from a midwestern university completed the Multiscale Dissociation Inventory, World Assumptions Scale, and Traumatic Life Events Questionnaire. Consistent with the hypotheses, world assumptions were related to the extent of trauma exposure and interpersonal trauma exposure in the sample but were not significantly related to non-interpersonal trauma exposure. World assumptions acted as a significant partial mediator of the relationship between trauma exposure and dissociation, and this relationship held when interpersonal trauma exposure specifically was considered. The factor structures of dissociation and world assumptions were also examined using principal component analysis, with the benevolence and self-worth factors of the World Assumptions Scale showing the strongest relationships with trauma exposure and dissociation. Clinical implications are discussed.
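
    The partial-mediation logic reported here follows the usual comparison of total and direct effects. A sketch on simulated data (not the study's), with trauma exposure X, world assumptions M, and dissociation Y:

    ```python
    # Sketch: total vs. direct effect of X on Y, with M as candidate mediator.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 400
    x = rng.poisson(2.0, n).astype(float)          # trauma exposure count
    m = -0.5 * x + rng.normal(0, 1, n)             # world assumptions
    y = 0.3 * x - 0.4 * m + rng.normal(0, 1, n)    # dissociation

    total = sm.OLS(y, sm.add_constant(x)).fit()
    direct = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()
    print(f"total effect c  = {total.params[1]:.3f}")
    print(f"direct effect c' = {direct.params[1]:.3f}  (the drop indicates mediation)")
    ```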

  10. ADJECTIVES AS NOUN PHRASES.

    ERIC Educational Resources Information Center

    ROSS, JOHN ROBERT

    THIS ANALYSIS OF UNDERLYING SYNTACTIC STRUCTURE IS BASED ON THE ASSUMPTION THAT THE PARTS OF SPEECH CALLED "VERBS" AND "ADJECTIVES" ARE TWO SUBCATEGORIES OF ONE MAJOR LEXICAL CATEGORY, "PREDICATE." FROM THIS ASSUMPTION, THE HYPOTHESIS IS ADVANCED THAT, IN LANGUAGES EXHIBITING THE COPULA, THE DEEP STRUCTURE OF SENTENCES CONTAINING PREDICATE…

  11. Model specification in oral health-related quality of life research.

    PubMed

    Kieffer, Jacobien M; Verrips, Erik; Hoogstraten, Johan

    2009-10-01

    The aim of this study was to analyze conventional wisdom regarding the construction and analysis of oral health-related quality of life (OHRQoL) questionnaires and to outline statistical complications. Most methods used for developing and analyzing questionnaires, such as factor analysis and Cronbach's alpha, presume psychological constructs to be latent, inferring a reflective measurement model with the underlying assumption of local independence. Local independence implies that the latent variable explains why the variables observed are related. Many OHRQoL questionnaires are analyzed as if they were based on a reflective measurement model; local independence is thus assumed. This assumption requires these questionnaires to consist solely of items that reflect, instead of determine, OHRQoL. The tenability of this assumption is the main topic of the present study. It is argued that OHRQoL questionnaires are a mix of both a formative measurement model and a reflective measurement model, thus violating the assumption of local independence. The implications are discussed.

  12. Formalization and analysis of reasoning by assumption.

    PubMed

    Bosse, Tibor; Jonker, Catholijn M; Treur, Jan

    2006-01-02

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against dynamic properties they fulfill. To this end, for the pattern of reasoning by assumption a variety of dynamic properties have been specified, some of which are considered characteristic for the reasoning pattern, whereas some other properties can be used to discriminate among different approaches to the reasoning. These properties have been automatically checked for the traces acquired in experiments undertaken. The approach turned out to be beneficial from two perspectives. First, checking characteristic properties contributes to the empirical validation of a theory on reasoning by assumption. Second, checking discriminating properties allows the analyst to identify different classes of human reasoners. © 2006 Lawrence Erlbaum Associates, Inc.

  13. Mission Command in the Age of Network-Enabled Operations: Social Network Analysis of Information Sharing and Situation Awareness

    DTIC Science & Technology

    2016-06-22

    this assumption in a large-scale, 2-week military training exercise. We conducted a social network analysis of email communications among the multi...exponential random graph models challenge the aforementioned assumption, as increased email output was associated with lower individual situation... email links were more commonly formed among members of the command staff with both similar functions and levels of situation awareness, than between

  14. Comparison of the clinical outcomes between antiviral-naïve patients treated with entecavir and lamivudine-resistant patients receiving adefovir add-on lamivudine combination treatment

    PubMed Central

    Kim, Hong Joo; Park, Soo Kyung; Yang, Hyo Joon; Jung, Yoon Suk; Park, Jung Ho; Park, Dong Il; Cho, Yong Kyun; Sohn, Chong Il; Jeon, Woo Kyu; Kim, Byung Ik; Choi, Kyu Yong

    2016-01-01

    Background/Aims To analyze the effects of preexisting lamivudine (LAM) resistance and of the applied antiviral treatment (adefovir [ADV] add-on LAM combination treatment) on long-term treatment outcomes, and to compare these with the clinical outcomes of antiviral-naïve chronic hepatitis B patients receiving entecavir (ETV) monotherapy. Methods This study enrolled 73 antiviral-naïve patients who received 0.5-mg ETV as an initial therapy and 54 patients who received ADV add-on LAM combination treatment as a rescue therapy from July 2006 to July 2010. Results During the 24-month treatments, the decreases in serum log10 HBV-DNA values (copies/mL) were significantly greater in the antiviral-naïve patients treated with ETV than in the patients receiving ADV add-on LAM combination treatment. The biochemical response rates for alanine aminotransferase normalization at 6 months (ETV) and 12 months (ADV add-on LAM) were 90.4% (66/73) and 77.8% (42/54), respectively (P=0.048). A Kaplan-Meier analysis indicated that the rates of serologic response, viral breakthrough, and emergence of genotypic resistance did not differ significantly between the two patient groups. There were also no significant intergroup differences in the rates of disease progression and new development of hepatocellular carcinoma (HCC). Conclusion The long-term clinical outcomes of antiviral-naïve patients treated with ETV and LAM-resistant patients receiving ADV add-on LAM combination treatment were comparable in terms of the emergence of HCC and disease progression. PMID:27729626

  15. Cost-effectiveness analysis of exenatide twice daily (BID) vs insulin glargine once daily (QD) as add-on therapy in Chinese patients with Type 2 diabetes mellitus inadequately controlled by oral therapies.

    PubMed

    Deng, Jing; Gu, Shuyan; Shao, Hui; Dong, Hengjin; Zou, Dajin; Shi, Lizheng

    2015-01-01

    To estimate the cost-effectiveness of exenatide twice daily (BID) vs insulin glargine once daily (QD) as add-on therapy in Chinese type 2 diabetes patients not well controlled by oral anti-diabetic (OAD) agents. The Cardiff model was populated with data synthesized from three head-to-head randomized clinical trials of up to 30 weeks in China comparing exenatide BID vs insulin glargine as add-on therapies to oral therapies in the Chinese population. The Cardiff model generated outputs including macrovascular and microvascular complications, diabetes-specific mortality, costs, and quality-adjusted life years (QALYs). Costs and QALYs were estimated over a time horizon of 40 years at a discount rate of 3% from a societal perspective. Compared with insulin glargine plus OAD treatments, patients on exenatide BID plus OAD gained 1.88 QALYs, at an incremental cost saving of Chinese Renminbi (RMB) 114,593 (i.e., a saving of RMB 61,078 per QALY). The cost-effectiveness results were robust to various sensitivity analyses, including probabilistic sensitivity analysis. The variables with the most impact on the incremental cost-effectiveness ratio included baseline HbA1c level, health utility decrements, and baseline BMI. Compared with insulin glargine QD, exenatide BID as add-on therapy to OAD is a cost-effective treatment in Chinese patients inadequately controlled by OAD treatments.
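
    The headline numbers combine into the incremental cost-effectiveness arithmetic directly; the small discrepancy from the reported RMB 61,078/QALY presumably reflects rounding of the published deltas.

    ```python
    # Incremental cost-effectiveness from the abstract's reported deltas.
    delta_qalys = 1.88          # QALYs gained with exenatide BID
    delta_cost = -114_593       # RMB; negative = cost saving

    icer = delta_cost / delta_qalys
    print(f"ICER = {icer:,.0f} RMB per QALY gained")
    # Negative cost with positive QALY gain means exenatide BID dominates:
    # cheaper and more effective, so no willingness-to-pay threshold is needed.
    ```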

  16. Housing flexibility effects on rotor stability

    NASA Technical Reports Server (NTRS)

    Davis, L. B.; Wolfe, E. A.; Beatty, R. F.

    1985-01-01

    Preliminary rotordynamic evaluations are performed with a housing stiffness assumption that is typically determined only after the hardware is built. In addressing rotor stability, a rigid housing assumption was shown to predict an instability at a lower spin speed than a comparable flexible housing analysis. This rigid housing assumption therefore provides a conservative estimate of the stability threshold speed. A flexible housing appears to act as an energy absorber and to dissipate some of the destabilizing force. The fact that a flexible housing is usually asymmetric and considerably heavier than the rotor was related to this apparent increase in rotor stability. Rigid housing analysis is proposed as a valuable screening criterion and may save time and money in the construction of elaborate housing finite element models for linear stability analyses.

  17. Fourier's law of heat conduction: quantum mechanical master equation analysis.

    PubMed

    Wu, Lian-Ao; Segal, Dvira

    2008-06-01

    We derive the macroscopic Fourier's Law of heat conduction from the exact gain-loss time convolutionless quantum master equation under three assumptions for the interaction kernel. To second order in the interaction, we show that the first two assumptions are natural results of the long time limit. The third assumption can be satisfied by a family of interactions consisting of an exchange effect. The pure exchange model directly leads to energy diffusion in a weakly coupled spin-1/2 chain.
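
    For reference, the macroscopic statement being derived is the standard one; the notation below is generic, not the paper's:

    ```latex
    % Fourier's law: the steady-state heat current is proportional to the
    % temperature gradient. For a chain of N sites with boundary temperatures
    % T_L and T_R, this implies a linear profile and diffusive 1/N scaling:
    \[
    J \;=\; -\,\kappa\,\nabla T,
    \qquad
    T_j \;\approx\; T_{\mathrm{L}} + \frac{j}{N}\,\bigl(T_{\mathrm{R}} - T_{\mathrm{L}}\bigr),
    \qquad
    J \;\propto\; \frac{T_{\mathrm{R}} - T_{\mathrm{L}}}{N}.
    \]
    ```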

  18. Effects of real fluid properties on axial turbine meanline design and off-design analysis

    NASA Astrophysics Data System (ADS)

    MacLean, Cameron

    The effects of real fluid properties on axial turbine meanline analysis have been investigated employing two meanline analysis codes, namely Turbine Meanline Design (TMLD) and Turbine Meanline Off-Design (TMLO). The previously developed TMLD code assumed the working fluid was an ideal gas; it was therefore modified to use real fluid properties. TMLO was then developed from TMLD. Both codes can be run using either the ideal gas assumption or real fluid properties. TMLD was employed for the meanline design of several axial turbines for a range of inlet conditions, using both the ideal gas assumption and real fluid properties. The resulting designs were compared to see the effects of real fluid properties. Meanline designs generated using the ideal gas assumption were then analysed with TMLO using real fluid properties. This was done over a range of inlet conditions that correspond to varying degrees of departure from ideal gas conditions. The goal was to show how machines designed with the ideal gas assumption would perform with the real working fluid. The working fluid used in both investigations was supercritical carbon dioxide. Results from the investigation show that real fluid properties had a strong effect on the gas path areas of the turbine designs as well as on the performance of turbines designed using the ideal gas assumption. Specifically, power output and the velocities of the working fluid were affected. It was found that accounting for losses tended to lessen the effects of the real fluid properties.
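
    The size of the ideal-gas error near the CO2 critical point is easy to check. The sketch below compares ideal-gas and real-fluid density via the CoolProp package (assumed available); the state-point values are illustrative, not taken from the thesis.

    ```python
    # Sketch: ideal-gas vs. real-fluid density for supercritical CO2.
    from CoolProp.CoolProp import PropsSI

    T, P = 320.0, 10e6                 # K, Pa: just above the critical point
    R = 8.314462618 / 0.04401          # specific gas constant for CO2, J/(kg K)

    rho_ideal = P / (R * T)
    rho_real = PropsSI("D", "T", T, "P", P, "CO2")
    print(f"ideal gas: {rho_ideal:6.1f} kg/m^3")
    print(f"real CO2 : {rho_real:6.1f} kg/m^3   (ratio {rho_real / rho_ideal:.2f})")
    # Near the critical point the real density is several times the ideal-gas
    # value, which is why gas-path areas and velocities shift so strongly.
    ```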

  19. Acupuncture and related therapies used as add-on or alternative to prokinetics for functional dyspepsia: overview of systematic reviews and network meta-analysis.

    PubMed

    Ho, Robin S T; Chung, Vincent C H; Wong, Charlene H L; Wu, Justin C Y; Wong, Samuel Y S; Wu, Irene X Y

    2017-09-04

    Prokinetics for functional dyspepsia (FD) have relatively high number-needed-to-treat values. Acupuncture and related therapies could be used as add-on or alternative treatments. An overview of systematic reviews (SRs) and a network meta-analysis (NMA) were performed to evaluate the comparative effectiveness of different acupuncture and related therapies. We conducted a comprehensive literature search for SRs of randomized controlled trials (RCTs) in eight international and Chinese databases. Data from eligible RCTs were extracted for random-effects pairwise meta-analyses. NMA was used to explore the most effective treatment among acupuncture and related therapies used alone or as add-ons to prokinetics, compared with prokinetics alone. From five SRs, 22 RCTs assessing various acupuncture and related therapies were included. No serious adverse events were reported. Two pairwise meta-analyses showed that manual acupuncture has a marginally stronger effect in alleviating global FD symptoms, compared with domperidone or itopride. Results from the NMA showed that the combination of manual acupuncture and clebopride has the highest probability of alleviating patient-reported global FD symptoms and thus of being the most effective treatment for FD symptoms. Patients who are contraindicated for prokinetics may use manual acupuncture or moxibustion as alternatives. Future confirmatory comparative effectiveness trials should compare clebopride add-on manual acupuncture with domperidone add-on manual acupuncture and with moxibustion.

  20. Fiber optic interconnect and optoelectronic packaging challenges for future generation avionics

    NASA Astrophysics Data System (ADS)

    Beranek, Mark W.

    2007-02-01

    Forecasting avionics industry fiber optic interconnect and optoelectronic packaging challenges that lie ahead first requires an assumption that military avionics architectures will evolve from today's centralized/unified concept based on gigabit laser, optical-to-electrical-to-optical switching and optical backplane technology, to a future federated/distributed or centralized/unified concept based on gigabit tunable laser, electro-optical switch and add-drop wavelength division multiplexing (WDM) technology. The requirement to incorporate avionics optical built-in test (BIT) in military avionics fiber optic systems is also assumed to be correct. Taking these assumptions further indicates that future avionics systems engineering will use WDM technology combined with photonic circuit integration and advanced packaging to form the technical basis of the next generation military avionics onboard local area network (LAN). Following this theme, fiber optic cable plants will evolve from today's multimode interconnect solution to a single mode interconnect solution that is highly installable, maintainable, reliable and supportable. Ultimately optical BIT for fiber optic fault detection and isolation will be incorporated as an integral part of a total WDM-based avionics LAN solution. Cost-efficient single mode active and passive photonic component integration and packaging integration is needed to enable reliable operation in the harsh military avionics application environment. Rugged multimode fiber-based transmitters and receivers (transceivers) with in-package optical BIT capability are also needed to enable fully BIT capable single-wavelength fiber optic links on both legacy and future aerospace platforms.

  1. Direct Position Determination of Unknown Signals in the Presence of Multipath Propagation

    PubMed Central

    Yu, Hongyi

    2018-01-01

    A novel geolocation architecture, termed “Multiple Transponders and Multiple Receivers for Multiple Emitters Positioning System (MTRE)”, is proposed in this paper. Existing Direct Position Determination (DPD) methods take advantage of a rather simple channel assumption (line-of-sight channels with complex path attenuations) and a simplified MUltiple SIgnal Classification (MUSIC) cost function to avoid a high-dimensional search. We point out that the simplified assumption and cost function reduce positioning accuracy because of the singularity of the array manifold in a multipath environment. We present a DPD model for unknown signals in the presence of Multi-path Propagation (MP-DPD) in this paper. MP-DPD adds non-negative real path attenuation constraints to avoid the error caused by the singularity of the array manifold. The Multi-path Propagation MUSIC (MP-MUSIC) method and the Active Set Algorithm (ASA) are designed to reduce the dimension of the search. A Multi-path Propagation Maximum Likelihood (MP-ML) method is proposed in addition, to overcome the limitation of MP-MUSIC in time-sensitive applications. An iterative algorithm and an approach to initial value setting are given to make the MP-ML time consumption acceptable. Numerical results validate the performance improvements of MP-MUSIC and MP-ML. A closed form of the Cramér–Rao Lower Bound (CRLB) is derived as a benchmark to evaluate the performance of MP-MUSIC and MP-ML. PMID:29562601

  2. Uncertainties and Systematic Effects on the estimate of stellar masses in high z galaxies

    NASA Astrophysics Data System (ADS)

    Salimbeni, S.; Fontana, A.; Giallongo, E.; Grazian, A.; Menci, N.; Pentericci, L.; Santini, P.

    2009-05-01

    We discuss the uncertainties and systematic effects that exist in estimates of the stellar masses of high-redshift galaxies obtained from broad-band photometry, and how they affect the deduced galaxy stellar mass function. We use for this purpose the latest version of the GOODS-MUSIC catalog. In particular, we discuss the impact of different synthetic models, of the assumed initial mass function, and of the selection band. Using Charlot & Bruzual 2007 and Maraston 2005 models, we find masses lower than those obtained from Bruzual & Charlot 2003 models. In addition, we find a slight trend as a function of the mass itself when comparing these two mass determinations with that from Bruzual & Charlot 2003 models. As a consequence, the derived galaxy stellar mass functions show diverse shapes, and their slope depends on the assumed models. Despite these differences, the same overall scenario is observed in all cases. The masses obtained under the Chabrier initial mass function are on average 0.24 dex lower than those from the Salpeter assumption, at all redshifts, shifting the galaxy stellar mass function by the same amount. Finally, using a 4.5 μm-selected sample instead of a Ks-selected one, we add a new population of highly absorbed, dusty galaxies at z ≈ 2-3 of relatively low masses, yielding stronger constraints on the slope of the galaxy stellar mass function at lower masses.

  3. Direct Position Determination of Unknown Signals in the Presence of Multipath Propagation.

    PubMed

    Du, Jianping; Wang, Ding; Yu, Wanting; Yu, Hongyi

    2018-03-17

    A novel geolocation architecture, termed "Multiple Transponders and Multiple Receivers for Multiple Emitters Positioning System (MTRE)", is proposed in this paper. Existing Direct Position Determination (DPD) methods take advantage of a rather simple channel assumption (line-of-sight channels with complex path attenuations) and a simplified MUltiple SIgnal Classification (MUSIC) cost function to avoid a high-dimensional search. We point out that the simplified assumption and cost function reduce positioning accuracy because of the singularity of the array manifold in a multipath environment. We present a DPD model for unknown signals in the presence of Multi-path Propagation (MP-DPD) in this paper. MP-DPD adds non-negative real path attenuation constraints to avoid the error caused by the singularity of the array manifold. The Multi-path Propagation MUSIC (MP-MUSIC) method and the Active Set Algorithm (ASA) are designed to reduce the dimension of the search. A Multi-path Propagation Maximum Likelihood (MP-ML) method is proposed in addition, to overcome the limitation of MP-MUSIC in time-sensitive applications. An iterative algorithm and an approach to initial value setting are given to make the MP-ML time consumption acceptable. Numerical results validate the performance improvements of MP-MUSIC and MP-ML. A closed form of the Cramér-Rao Lower Bound (CRLB) is derived as a benchmark to evaluate the performance of MP-MUSIC and MP-ML.
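
    The subspace machinery underneath any MUSIC-style estimator can be sketched compactly. The code below is classical narrowband MUSIC on a simulated uniform linear array, not the paper's MP-MUSIC, which adds the non-negative multipath attenuation constraints and transponder geometry on top of this basic structure.

    ```python
    # Sketch: classical narrowband MUSIC direction finding (simulated array).
    import numpy as np

    rng = np.random.default_rng(8)
    M, N, d = 8, 200, 0.5                  # sensors, snapshots, spacing (wavelengths)
    true_deg = [-20.0, 35.0]

    def steering(theta_deg):
        k = 2 * np.pi * d * np.sin(np.deg2rad(theta_deg))
        return np.exp(1j * k * np.arange(M))

    A = np.column_stack([steering(t) for t in true_deg])
    S = (rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))) / np.sqrt(2)
    X = A @ S + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))

    R = X @ X.conj().T / N                 # sample covariance
    w, V = np.linalg.eigh(R)               # eigenvalues in ascending order
    En = V[:, : M - 2]                     # noise subspace (2 sources assumed known)

    grid = np.linspace(-90, 90, 721)
    spec = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(t)) ** 2 for t in grid])
    is_peak = (spec[1:-1] > spec[:-2]) & (spec[1:-1] > spec[2:])
    cand = np.where(is_peak)[0] + 1        # local maxima; keep the two largest
    doas = np.sort(grid[cand[np.argsort(spec[cand])[-2:]]])
    print("estimated DOAs (deg):", np.round(doas, 1))
    ```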

  4. Modeling Nonlinear Errors in Surface Electromyography Due To Baseline Noise: A New Methodology

    PubMed Central

    Law, Laura Frey; Krishnan, Chandramouli; Avin, Keith

    2010-01-01

    The surface electromyographic (EMG) signal is often contaminated by some degree of baseline noise. It is customary for scientists to subtract baseline noise from the measured EMG signal prior to further analyses based on the assumption that baseline noise adds linearly to the observed EMG signal. The stochastic nature of both the baseline and EMG signal, however, may invalidate this assumption. Alternately, “true” EMG signals may be either minimally or nonlinearly affected by baseline noise. This information is particularly relevant at low contraction intensities when signal-to-noise ratios (SNR) may be lowest. Thus, the purpose of this simulation study was to investigate the influence of varying levels of baseline noise (approximately 2 – 40 % maximum EMG amplitude) on mean EMG burst amplitude and to assess the best means to account for signal noise. The simulations indicated baseline noise had minimal effects on mean EMG activity for maximum contractions, but increased nonlinearly with increasing noise levels and decreasing signal amplitudes. Thus, the simple baseline noise subtraction resulted in substantial error when estimating mean activity during low intensity EMG bursts. Conversely, correcting EMG signal as a nonlinear function of both baseline and measured signal amplitude provided highly accurate estimates of EMG amplitude. This novel nonlinear error modeling approach has potential implications for EMG signal processing, particularly when assessing co-activation of antagonist muscles or small amplitude contractions where the SNR can be low. PMID:20869716
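
    The direction of the bias is easy to see with a simulation. The sketch below contrasts simple linear subtraction of the baseline RMS with a power-based (RMS-squared) correction; the power form is an assumed stand-in for the authors' fitted nonlinear model, and the signal levels are illustrative.

    ```python
    # Sketch: linear vs. power subtraction of baseline noise from EMG RMS.
    # For independent signals, powers add: measured^2 ~ signal^2 + noise^2.
    import numpy as np

    rng = np.random.default_rng(9)
    n = 20_000
    noise = rng.normal(0, 0.05, n)                  # baseline noise
    for true_level in [0.02, 0.05, 0.2, 1.0]:       # low to high contraction
        emg = rng.normal(0, true_level, n)
        measured_rms = np.sqrt(np.mean((emg + noise) ** 2))
        noise_rms = np.sqrt(np.mean(noise ** 2))
        linear = measured_rms - noise_rms            # common, but biased at low SNR
        power = np.sqrt(max(measured_rms**2 - noise_rms**2, 0.0))
        print(f"true {true_level:.2f}: linear {linear:.3f}, power {power:.3f}")
    ```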

  5. An Analysis of the Economic Assumptions Underlying Fiscal Plans FY1981 - FY1984.

    DTIC Science & Technology

    1986-06-01

    An Analysis of the Economic Assumptions Underlying Fiscal Plans FY1981 - FY1984, by Robert Welch Beck, June 1986. Thesis Advisor: P. M. Carrick. Approved for public release; distribution is unlimited.

  6. Investigation of parabolic computational techniques for internal high-speed viscous flows

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Power, G. D.

    1985-01-01

    A feasibility study was conducted to assess the applicability of an existing parabolic analysis (ADD-Axisymmetric Diffuser Duct), developed previously for subsonic viscous internal flows, to mixed supersonic/subsonic flows with heat addition simulating a SCRAMJET combustor. A study was conducted with the ADD code modified to include additional convection effects in the normal momentum equation when supersonic expansion and compression waves were present. It is concluded from the present study that for the class of problems where strong viscous/inviscid interactions are present a global iteration procedure is required.

  7. Hybrid Approaches and Industrial Applications of Pattern Recognition,

    DTIC Science & Technology

    1980-10-01

    emphasized that the probability distribution in (9) is correct only under the assumption that P(w|x) is known exactly. In practice this assumption will...sufficient precision. The alternative would be to take the probability distribution of estimates of P(w|x) into account in the analysis. However, from the

  8. Estimating Causal Effects in Mediation Analysis Using Propensity Scores

    ERIC Educational Resources Information Center

    Coffman, Donna L.

    2011-01-01

    Mediation is usually assessed by a regression-based or structural equation modeling (SEM) approach that we refer to as the classical approach. This approach relies on the assumption that there are no confounders that influence both the mediator, "M", and the outcome, "Y". This assumption holds if individuals are randomly…

  9. School, Cultural Diversity, Multiculturalism, and Contact

    ERIC Educational Resources Information Center

    Pagani, Camilla; Robustelli, Francesco; Martinelli, Cristina

    2011-01-01

    The basic assumption of this paper is that school's potential to improve cross-cultural relations, as well as interpersonal relations in general, is enormous. This assumption is supported by a number of theoretical considerations and by the analysis of data we obtained from a study we conducted on the attitudes toward diversity and…

  10. Ontological, Epistemological and Methodological Assumptions: Qualitative versus Quantitative

    ERIC Educational Resources Information Center

    Ahmed, Abdelhamid

    2008-01-01

    The review to follow is a comparative analysis of two studies conducted in the field of TESOL in Education published in "TESOL QUARTERLY." The aspects to be compared are as follows. First, a brief description of each study will be presented. Second, the ontological, epistemological and methodological assumptions underlying each study…

  11. Stirling Engine External Heat System Design with Heat Pipe Heater.

    DTIC Science & Technology

    1986-07-01

    Figure 10. However, the evaporator analysis is greatly simplified by making the conservative assumption of constant heat flux. This assumption results in...Cold Start Data: ROM, density of the metal (g/cm3); CAPM, specific heat of the metal (cal/(g·K)); ETHG, effective gauze thickness: the

  12. Application of the Auto-Tuned Land Assimilation System (ATLAS) to ASCAT and SMOS soil moisture retrieval products

    USDA-ARS?s Scientific Manuscript database

    Land data assimilations are typically based on highly uncertain assumptions regarding the statistical structure of observation and modeling errors. Left uncorrected, poor assumptions can degrade the quality of analysis products generated by land data assimilation systems. Recently, Crow and van de...

  13. Timber value—a matter of choice: a study of how end use assumptions affect timber values.

    Treesearch

    John H. Beuter

    1971-01-01

    The relationship between estimated timber values and actual timber prices is discussed. Timber values are related to how, where, and when the timber is used. An analysis demonstrates the relative values of a typical Douglas-fir stand under assumptions about timber use.

  14. 76 FR 32241 - Civil Service Retirement System; Present Value Factors

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-03

    ... in the economic assumptions adopted by the Board of Actuaries of the Civil Service Retirement System... data to the Board of Actuaries, care of Gregory Kissel, Actuary, Office of Planning and Policy Analysis...- 335, based on changed economic assumptions adopted by the Board of Actuaries of the CSRS. Those...

  15. PC-ECONPACK Version 3.0 Users Manual.

    DTIC Science & Technology

    1991-11-01

    for inclusion in cost sensitivity analysis. Upper Limit of the Change (%): PC-ECONPACK varies all the stipulated expense items both in an upward...Assumptions as stated in the Assumptions text information block. This section is used to discuss why certain alternatives were selected for inclusion in the

  16. Evaluation of the adhesive properties of the cornea by means of optical coherence tomography in patients with meibomian gland dysfunction and lacrimal tear deficiency.

    PubMed

    Napoli, Pietro Emanuele; Coronella, Franco; Satta, Giovanni Maria; Galantuomo, Maria Silvana; Fossarello, Maurizio

    2014-01-01

    The aim was to determine the influence of meibomian gland dysfunction (MGD) and aqueous tear deficiency dry eye (ADDE) on the adhesive properties of the central cornea by means of optical coherence tomography (OCT), and to investigate the relationship between corneal adhesiveness and classical tear tests, as well as the reliability of results, in these lacrimal functional unit disorders. Prospective, case-control study. Twenty-eight patients with MGD and 27 patients with ADDE were studied. A group of 32 healthy subjects of similar age and gender distribution served as a control group. The adhesive properties of the anterior corneal surface were measured by OCT, based on the retention time of an adhesion marker above it, in all participants. Excellent (≥5 minutes), borderline (3-5 minutes), fair (1-3 minutes), and poor (<1 minute) values of corneal adhesiveness were found, respectively, in 0%, 7.1%, 64.3% and 28.6% of MGD patients; in 0%, 7.4%, 63% and 29.6% of ADDE patients; and in 31.3%, 65.6%, 3.1% and 0% of healthy subjects. The differences in time of corneal adhesiveness between MGD patients and healthy subjects, as well as between ADDE patients and healthy subjects, were statistically significant (p<0.001 for both). Conversely, no statistically significant difference between MGD and ADDE was found (p = 0.952). Data analysis revealed a statistically significant correlation between corneal adhesiveness and clinical tests of dry eye, as well as an excellent degree of inter-rater reliability and reproducibility for OCT measurements (p<0.001). ADDE and MGD share similar abnormalities on OCT imaging. Decreased adhesive properties of the anterior cornea were identified as a common feature of MGD and ADDE. This simple OCT approach may provide new clues into the mechanism and evaluation of dry eye syndrome.

  17. Association between genetic variants of the ADD1 and GNB3 genes and blood pressure response to the cold pressor test in a Chinese Han population: the GenSalt Study.

    PubMed

    Wang, Laiyuan; Chen, Shufeng; Zhao, Qi; Hixson, James E; Rao, Dabeeru C; Jaquish, Cashell E; Huang, Jianfeng; Lu, Xiangfeng; Chen, Jichun; Cao, Jie; Li, Jianxin; Li, Hongfan; He, Jiang; Liu, De-Pei; Gu, Dongfeng

    2012-08-01

    Genetic factors influence blood pressure (BP) response to the cold pressor test (CPT), which is a phenotype related to hypertension risk. We examined the association between variants of the α-adducin (ADD1) and guanine nucleotide binding protein (G protein) β-polypeptide 3 (GNB3) genes and BP response to the CPT. A total of 1998 Han Chinese participants from the Genetic Epidemiology Network of Salt Sensitivity completed the CPT. The area under the curve (AUC) above the baseline BP during the CPT was used to measure the BP response. Twelve single-nucleotide polymorphisms (SNPs) of the ADD1 and GNB3 genes were selected and genotyped. Both single-marker and haplotype association analyses were conducted using linear mixed models. The rs17833172 and rs3775067 SNPs of the ADD1 gene and the rs4963516 SNP of the GNB3 gene were significantly associated with the BP response to CPT, even after adjusting for multiple testing. For the ADD1 gene, the AA genotype of SNP rs17833172 was associated with lower systolic BP (SBP) reactivity (P<0.0001) and faster BP recovery (P=0.0003). The TT genotype of rs3775067 was associated with slower SBP recovery (P=0.004). For the GNB3 gene, the C allele of SNP rs4963516 was associated with faster diastolic BP recovery (P=0.002) and smaller overall AUC (P=0.003). Haplotype analysis indicated that the CCGC haplotype of ADD1 constructed by rs1263359, rs3775067, rs4961 and rs4963 was significantly associated with the BP response to CPT. These data suggest that genetic variants of the ADD1 and GNB3 genes may have important roles in BP response to the CPT. Future studies aimed at replicating these novel findings are warranted.

  18. Synthesis of High-Speed Digital Systems.

    DTIC Science & Technology

    1985-11-08

  19. Sensitivity analysis of the add-on price estimate for the edge-defined film-fed growth process

    NASA Technical Reports Server (NTRS)

    Mokashi, A. R.; Kachare, A. H.

    1981-01-01

    The analysis is in terms of cost parameters and production parameters. The cost parameters include equipment, space, direct labor, materials, and utilities. The production parameters include growth rate, process yield, and duty cycle. A computer program was developed specifically to do the sensitivity analysis.

  20. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  1. Empirical Analysis of Retirement Pension and IFRS Adoption Effects on Accounting Information: Glance at IT Industry

    PubMed Central

    2014-01-01

    This study reviews new pension accounting with K-IFRS and provides empirical changes in liability for retirement allowances with adoption of K-IFRS. It will help to understand the effect of pension accounting on individual firms' financial reports and the importance of public announcement of actuarial assumptions. Firms that adopted K-IFRS had various changes in retirement liability compared to the previous financial report not based on K-IFRS. Their actuarial assumptions for pension accounting should be announced, but only a few of them were published. Data analysis shows that small differences in the actuarial assumptions may result in a large change in retirement-related liability. Firms within the IT industry also show similar behaviors, which means that additional financial regulations for pension accounting are recommended. PMID:25013868

  2. Empirical analysis of retirement pension and IFRS adoption effects on accounting information: glance at IT industry.

    PubMed

    Kim, JeongYeon

    2014-01-01

    This study reviews new pension accounting with K-IFRS and provides empirical changes in liability for retirement allowances with adoption of K-IFRS. It will help to understand the effect of pension accounting on individual firms' financial reports and the importance of public announcement of actuarial assumptions. Firms that adopted K-IFRS had various changes in retirement liability compared to the previous financial report not based on K-IFRS. Their actuarial assumptions for pension accounting should be announced, but only a few of them were published. Data analysis shows that small differences in the actuarial assumptions may result in a large change in retirement-related liability. Firms within the IT industry also show similar behaviors, which means that additional financial regulations for pension accounting are recommended.

  3. Impact of add-on laboratory testing at an academic medical center: a five year retrospective study.

    PubMed

    Nelson, Louis S; Davis, Scott R; Humble, Robert M; Kulhavy, Jeff; Aman, Dean R; Krasowski, Matthew D

    2015-01-01

    Clinical laboratories frequently receive orders to perform additional tests on existing specimens ('add-ons'). Previous studies have examined add-on ordering patterns over short periods of time. The objective of this study was to analyze add-on ordering patterns over an extended time period. We also analyzed the impact of a robotic specimen archival/retrieval system on add-on testing procedure and manual effort. In this retrospective study at an academic medical center, electronic health records were searched to obtain all add-on orders placed between May 2, 2009 and December 31, 2014. During this period, 880,359 add-on tests were ordered on 96,244 different patients. Add-on testing comprised 3.3 % of total test volumes. There were 443,411 unique ordering instances, leading to an average of 1.99 add-on tests per instance. Some patients had multiple episodes of add-on test orders at different points in time, leading to an average of 9.15 add-on tests per patient. The majority of add-on orders were for chemistry tests (78.8 % of total add-ons), with the next most frequent being hematology and coagulation tests (11.2 % of total add-ons). Inpatient orders accounted for 66.8 % of total add-on orders, while the emergency department and outpatient clinics had 14.8 % and 18.4 % of total add-on orders, respectively. The majority of add-ons were placed within 8 hours (87.3 %) and nearly all by 24 hours (96.8 %). Nearly 100 % of add-on orders within the emergency department were placed within 8 hours. The introduction of a robotic specimen archival/retrieval unit saved an average of 2.75 minutes of laboratory staff manual time per unique add-on order. This translates to 24.1 hours/day less manual effort in dealing with add-on orders. Our study reflects the previous literature in showing that add-on orders significantly impact the workload of the clinical laboratory. The majority of add-on orders are clinical chemistry tests, and most add-on orders occur within 24 hours of original specimen collection. Robotic specimen archival/retrieval units can reduce manual effort in the clinical laboratory associated with add-on orders.
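
    The per-instance and per-patient averages quoted above follow directly from the reported totals, as a quick check shows:

```python
# Recomputing the averages reported in the abstract from its own totals.
total_tests = 880_359
instances = 443_411
patients = 96_244

print(total_tests / instances)  # ~1.99 add-on tests per ordering instance
print(total_tests / patients)   # ~9.15 add-on tests per patient
```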

  4. A control-volume method for analysis of unsteady thrust augmenting ejector flows

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.

    1988-01-01

    A method for predicting transient thrust augmenting ejector characteristics is presented. The analysis blends classic self-similar turbulent jet descriptions with a control volume mixing region discretization to elicit transient effects in a new way. Division of the ejector into an inlet, diffuser, and mixing region corresponds with the assumption of viscous-dominated phenomena in the latter. The inlet and diffuser analyses are simplified by a quasi-steady treatment, justified by the assumption that pressure is the forcing function in those regions. Details of the theoretical foundation, the solution algorithm, and sample calculations are given.

  5. The New Nordic Diet: phosphorus content and absorption.

    PubMed

    Salomo, Louise; Poulsen, Sanne K; Rix, Marianne; Kamper, Anne-Lise; Larsen, Thomas M; Astrup, Arne

    2016-04-01

    High phosphorus content in the diet may have adverse effects on cardiovascular health. We investigated whether the New Nordic Diet (NND), based mainly on local, organic and less processed food and large amounts of fruit, vegetables, wholegrain and fish, versus an Average Danish Diet (ADD), would reduce the phosphorus load due to fewer phosphorus-containing food additives, less animal protein and more plant-based protein. Phosphorus and creatinine were measured in plasma and urine at baseline, week 12 and week 26 in 132 centrally obese subjects with normal renal function as part of a post hoc analysis of data acquired from a 26-week controlled trial. We used the fractional phosphorus excretion as a measurement of phosphorus absorption. Mean baseline fractional phosphorus excretion was 20.9 ± 6.6 % in the NND group (n = 82) and 20.8 ± 5.5 % in the ADD group (n = 50) and was decreased by 2.8 ± 5.1 and 3.1 ± 5.4 %, respectively, (p = 0.6) at week 26. At week 26, the mean change in plasma phosphorus was 0.04 ± 0.12 mmol/L in the NND group and -0.03 ± 0.13 mmol/L in the ADD group (p = 0.001). Mean baseline phosphorus intake was 1950 ± 16 mg/10 MJ in the NND group and 1968 ± 22 mg/10 MJ in the ADD group and decreased less in the NND compared to the ADD (67 ± 36 mg/10 MJ and -266 ± 45 mg/day, respectively, p < 0.298). Contrary to expectations, the NND had a high phosphorus intake and did not decrease the fractional phosphorus excretion compared with ADD. Further modifications of the diet are needed in order to make this food concept beneficial regarding phosphorus absorption.

  6. Angiotensin-Converting Enzyme (ACE) I/D and Alpha-Adducin (ADD1) G460W Gene Polymorphisms in Turkish Patients with Severe Chronic Tinnitus.

    PubMed

    Yuce, Salim; Sancakdar, Enver; Bağcı, Gokhan; Koc, Sema; Kurtulgan, Hande Kucuk; Bağcı, Binnur; Doğan, Mansur; Uysal, İsmail Onder

    2016-04-01

    Tinnitus is described as a disturbing sound sensation in the absence of external stimulation. We aimed to investigate whether there is any relationship between severe chronic tinnitus and angiotensin-converting enzyme (ACE) I/D and α-adducin (ADD1) G460W gene polymorphisms. The patient group and control group consisted of 89 and 104 individuals, respectively. The evaluation of tinnitus was performed using the Strukturiertes Tinnitus-Interview (STI). The Tinnitus Handicap Inventory (THI) was used to evaluate the tinnitus severity. Polymerase chain reaction (PCR) and polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) techniques were used for genotyping. With regard to the ACE I/D polymorphism, there was no significant difference in genotype and allele frequencies between the patient group and control group. However, a statistically significant difference was found in genotype (p<0.01) and allele frequencies (p=0.021) of the ADD1 G460W gene polymorphism. Combined genotype analysis showed that the ACE II /ADD1 GW genotype was statistically significantly higher in the patient group than in the control group (χ² = 7.15, p=0.007). The odds ratio value of the GW genotype was 2.5 (95% CI=1.4-4.7) (p<0.01). Our results demonstrate an association between ADD1 G460W gene polymorphism and susceptibility to severe chronic tinnitus. It was found that the GW genotype increased the disease risk by 2.5-fold compared with other genotypes. This indicates that ADD1 G460W polymorphism could be an important factor in the pathophysiology of tinnitus.

  7. Credibility of Uncertainty Analyses for 131-I Pathway Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, F O.; Anspaugh, L. R.; Apostoaei, A. I.

    2004-05-01

    We would like to make your readers aware of numerous concerns we have with respect to the paper by A. A. Simpkins and D. M. Hamby on Uncertainty in transport factors used to calculate historic dose from 131I releases at the Savannah River Site. The paper by Simpkins and Hamby concludes by saying their uncertainty analysis would add credibility to current dose reconstruction efforts of public exposures to historic releases of 131I from the operations at the Savannah River Site, yet we have found their paper to be afflicted with numerous errors in assumptions and methodology, which in turn lead to grossly misleading conclusions. Perhaps the most egregious errors are their conclusions, which state that: a. the vegetable pathway, not the ingestion of fresh milk, was the main contributor to thyroid dose for exposure to 131I (even though dietary intake of vegetables was less in the past than at present), and b. the probability distribution assigned to the fraction of iodine released in the elemental form (Uniform 0, 0.6) is responsible for 64.6% of the total uncertainty in thyroid dose, given a unit release of 131I to the atmosphere. The assumptions used in the paper by Simpkins and Hamby lead to a large overestimate of the contamination of vegetables by airborne 131I. The interception by leafy and non-leafy vegetables of freshly deposited 131I is known to be highly dependent on the growth form of the crop and the standing crop biomass of leafy material. Unrealistic assumptions are made for losses of 131I from food processing, preparation, and storage prior to human consumption. These assumptions tend to bias their conclusions toward an overestimate of the amount of 131I retained by vegetation prior to consumption. For example, the generic assumption of a 6-d hold-up time is used for the loss from radioactive decay for the time period from harvest to human consumption of fruits, vegetables, and grains. We anticipate hold-up times of many weeks, if not months, between harvest and consumption for most grains and non-leafy forms of vegetation. The combined assumptions made by Simpkins and Hamby about the fraction of fresh deposition intercepted by vegetation, and the rather short hold-up time for most vegetables consumed, probably caused the authors to conclude that the consumption of 131I-contaminated vegetables was more important to dose than was the consumption of fresh sources of milk. This conclusion is surprising, given that the consumption rate assumed for whole milk was rather large and that the value of the milk transfer coefficient was also higher and more uncertain than most distributions reported in the literature. In our experience, the parameters contributing most to the uncertainty in dose for the 131I air-deposition-vegetation-milk-human-thyroid pathway are the deposition velocity for elemental iodine, the mass interception factor for pasture vegetation, the milk transfer coefficient, and the thyroid dose conversion factor. In none of our previous investigations has the consumption of fruits, vegetables, and grains been the dominant contributor to the thyroid dose (or the uncertainty in dose) when the individual also was engaged in the consumption of even moderate quantities of fresh milk. The results of the relative contribution of uncertain input parameters to the overall uncertainty in exposure are counterintuitive. We suspect that calculational errors may have occurred in their application of the software that was used to estimate the relative sensitivity for each uncertain input variable. Their claim that the milk transfer coefficient contributed only 4% to the total uncertainty in the aggregated transfer from release to dose, and that the uncertainty in the vegetation interception fraction contributed only 3.3%, despite relatively large uncertainties assigned to both of these variables, violates our sense of face validity.

  8. Beyond Anxious Predisposition: Do Padecer de Nervios and Ataque de Nervios Add Incremental Validity to Predictions of Current Distress among Mexican Mothers?

    PubMed Central

    Alcántara, Carmela; Abelson, James L.; Gone, Joseph P.

    2011-01-01

    Background Nervios (PNRV) and ataque de nervios (ATQ) are culture-bound syndromes with overlapping symptoms of anxiety, depression, and dissociation, shown to have inconsistent associations to psychiatric disorder. Few studies test the basic assumption that PNRV and ATQ are uniformly linked to distress outcomes across Latina/o immigrant groups. This study examined: (a) the extent to which acculturative stress, Latino/U.S. American acculturation, and anxious predisposition were associated with lifetime history of ATQ and PNRV, and (b) the extent to which ATQ and PNRV add incremental validity in explaining acculturative stress and psychological distress beyond measures of anxious predisposition. Method Participants (n = 82) included Mexican mothers who completed surveys on acculturation, trait anxiety, anxiety sensitivity, lifetime ATQ/PNRV, psychological distress, and acculturative stress. Results Lifetime PNRV, but not lifetime ATQ, was significantly predictive of psychological distress. PNRV was also linked to trait anxiety. Psychometric measures of anxious predisposition (trait anxiety, anxiety sensitivity) were more robust predictors of distress outcomes than lifetime history of ATQ/PNRV. Conclusions Inquiry into lifetime history of nervios may be a useful point of entry in talking to Mexican immigrant mothers about stress and distress. However, standard tools for assessing anxiety sensitivity and trait anxiety appear most useful in identifying and explaining presence of psychological distress. Further research is needed to determine the cross-cultural relevance of trait anxiety and anxiety sensitivity, and its implications for the development of anxiety treatments that are effective across cultures. PMID:21769996

  9. Beyond anxious predisposition: do padecer de nervios and ataque de nervios add incremental validity to predictions of current distress among Mexican mothers?

    PubMed

    Alcántara, Carmela; Abelson, James L; Gone, Joseph P

    2012-01-01

    Nervios (PNRV) and ataque de nervios (ATQ) are culture-bound syndromes with overlapping symptoms of anxiety, depression, and dissociation, shown to have inconsistent associations to psychiatric disorder. Few studies test the basic assumption that PNRV and ATQ are uniformly linked to distress outcomes across Latina/o immigrant groups. This study examined: (a) the extent to which acculturative stress, Latino/US American acculturation, and anxious predisposition were associated with lifetime history of ATQ and PNRV, and (b) the extent to which ATQ and PNRV add incremental validity in explaining acculturative stress and psychological distress beyond measures of anxious predisposition. Participants (n = 82) included Mexican mothers who completed surveys on acculturation, trait anxiety, anxiety sensitivity, lifetime ATQ/PNRV, psychological distress, and acculturative stress. Lifetime PNRV, but not lifetime ATQ, was significantly predictive of psychological distress. PNRV was also linked to trait anxiety. Psychometric measures of anxious predisposition (trait anxiety and anxiety sensitivity) were more robust predictors of distress outcomes than lifetime history of ATQ/PNRV. Inquiry into lifetime history of nervios may be a useful point of entry in talking to Mexican immigrant mothers about stress and distress. However, standard tools for assessing anxiety sensitivity and trait anxiety appear most useful in identifying and explaining the presence of psychological distress. Further research is needed to determine the cross-cultural relevance of trait anxiety and anxiety sensitivity, and its implications for the development of anxiety treatments that are effective across cultures. © 2011 Wiley-Liss, Inc.

  10. Thin-layer chromatography and colorimetric analysis of multi-component explosive mixtures

    DOEpatents

    Pagoria, Philip F.; Mitchell, Alexander R.; Whipple, Richard E.; Carman, M. Leslie

    2014-08-26

    A thin-layer chromatography method for detection and identification of common military and peroxide explosives in samples includes the steps of: providing a reverse-phase thin-layer chromatography plate; preparing the plate by marking spots on which to deposit the samples by touching the plate with a marker; spotting one microliter of a first standard onto one of the spots, one microliter of a second standard onto another of the spots, and the samples onto other spots, producing a spotted plate; adding eluent to a developing chamber; adding the spotted plate to the developing chamber; removing the spotted plate from the developing chamber, producing a developed plate; placing the developed plate in an ultraviolet light box; adding a visualization agent to a dip tank; dipping the developed plate in the dip tank and quickly removing it; and detecting explosives by viewing the developed plate.

  11. Coupling analysis of high Q resonators in add-drop configuration through cavity ringdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Frigenti, G.; Arjmand, M.; Barucci, A.; Baldini, F.; Berneschi, S.; Farnesi, D.; Gianfreda, M.; Pelli, S.; Soria, S.; Aray, A.; Dumeige, Y.; Féron, P.; Nunzi Conti, G.

    2018-06-01

    An original method able to fully characterize high-Q resonators in an add-drop configuration has been implemented. The method is based on the study of two cavity ringdown (CRD) signals, which are produced at the transmission and drop ports by wavelength sweeping a resonance in a time interval comparable with the photon cavity lifetime. All the resonator parameters can be assessed with a single set of simultaneous measurements. We first developed a model describing the two CRD output signals and a fitting program able to deduce the key parameters from the measured profiles. We successfully validated the model with an experiment based on a fiber ring resonator of known characteristics. Finally, we characterized a high-Q, home-made, MgF2 whispering gallery mode disk resonator in the add-drop configuration, assessing its intrinsic and coupling parameters.
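
    The parameter-extraction step described here is, at its core, a least-squares fit of a decay model to the recorded ringdown traces. A minimal sketch of such a fit on synthetic data, assuming a single-exponential decay; the authors' actual model for the swept-wavelength transmission- and drop-port signals is more elaborate:

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a single exponential to a (synthetic) ringdown trace to recover the
# photon lifetime tau. Illustrative only; not the paper's full CRD model.
def ringdown(t, a, tau, c):
    return a * np.exp(-t / tau) + c

t = np.linspace(0.0, 5e-6, 500)          # seconds
true_tau = 8e-7
signal = ringdown(t, 1.0, true_tau, 0.02)
signal += np.random.default_rng(0).normal(0.0, 0.01, t.size)  # detector noise

popt, _ = curve_fit(ringdown, t, signal, p0=(1.0, 1e-6, 0.0))
print(f"fitted tau = {popt[1]:.2e} s (true {true_tau:.0e} s)")
```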

  12. Performance of Hippocampus Volumetry with FSL-FIRST for Prediction of Alzheimer's Disease Dementia in at Risk Subjects with Amnestic Mild Cognitive Impairment.

    PubMed

    Suppa, Per; Hampel, Harald; Kepp, Timo; Lange, Catharina; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph

    2016-01-01

    MRI-based hippocampus volume, a core feasible biomarker of Alzheimer's disease (AD), is not yet widely used in clinical patient care, partly due to lack of validation of software tools for hippocampal volumetry that are compatible with routine workflow. Here, we evaluate fully-automated and computationally efficient hippocampal volumetry with FSL-FIRST for prediction of AD dementia (ADD) in subjects with amnestic mild cognitive impairment (aMCI) from phase 1 of the Alzheimer's Disease Neuroimaging Initiative. Receiver operating characteristic analysis of FSL-FIRST hippocampal volume (corrected for head size and age) revealed an area under the curve of 0.79, 0.70, and 0.70 for prediction of aMCI-to-ADD conversion within 12, 24, or 36 months, respectively. Thus, FSL-FIRST provides about the same power for prediction of progression to ADD in aMCI as other volumetry methods.
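
    The reported discrimination figures come from a receiver operating characteristic (ROC) analysis. A minimal sketch of that computation on synthetic stand-in data (not ADNI values), scoring the negative corrected volume since smaller hippocampi indicate higher conversion risk:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
# Hypothetical corrected hippocampal volumes (mL) for converters and
# non-converters; the group means and sizes are illustrative assumptions.
converted = rng.normal(2.9, 0.4, 60)
stable = rng.normal(3.3, 0.4, 140)

y = np.concatenate([np.ones_like(converted), np.zeros_like(stable)])
score = -np.concatenate([converted, stable])  # smaller volume -> higher risk
print(f"AUC = {roc_auc_score(y, score):.2f}")
```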

  13. Targeting brains, producing responsibilities: the use of neuroscience within British social policy.

    PubMed

    Broer, Tineke; Pickersgill, Martyn

    2015-05-01

    Concepts and findings 'translated' from neuroscientific research are finding their way into UK health and social policy discourse. Critical scholars have begun to analyse how policies tend to 'misuse' the neurosciences and, further, how these discourses produce unwarranted and individualizing effects, rooted in middle-class values and inducing guilt and anxiety. In this article, we extend such work while simultaneously departing from the normative assumptions implied in the concept of 'misuse'. Through a documentary analysis of UK policy reports focused on the early years, adolescence and older adults, we examine how these employ neuroscientific concepts and consequently (re)define responsibility. In the documents analysed, responsibility was produced in three different but intersecting ways: through a focus on optimisation, self-governance, and vulnerability. Our work thereby adds to social scientific examinations of neuroscience in society that show how neurobiological terms and concepts can be used to construct and support a particular imaginary of citizenship and the role of the state. Neuroscience may be leveraged by policy makers in ways that (potentially) reduce the target of their intervention to the soma, but do so in order to expand the outcome of the intervention to include the enhancement of society writ large. By attending as well to more critical engagements with neuroscience in policy documents, our analysis demonstrates the importance of being mindful of the limits to the deployment of a neurobiological idiom within policy settings. Accordingly, we contribute to increased empirical specificity concerning the impacts and translation of neuroscientific knowledge in contemporary society whilst refusing to take for granted the idea that the neurosciences necessarily have a dominant role (to play). Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. World Energy Projection System Plus (WEPS+): Global Activity Module

    EIA Publications

    2016-01-01

    The World Energy Projection System Plus (WEPS+) is a comprehensive, mid-term energy forecasting and policy analysis tool used by EIA. WEPS+ projects energy supply, demand, and prices by country or region, given assumptions about the state of various economies, international energy markets, and energy policies. The Global Activity Module (GLAM) provides projections of economic driver variables for use by the supply, demand, and conversion modules of WEPS+. GLAM’s baseline economic projection contains the economic assumptions used in WEPS+ to help determine energy demand and supply. GLAM can also provide WEPS+ with alternative economic assumptions representing a range of uncertainty about economic growth. The resulting economic impacts of such assumptions are inputs to the remaining supply and demand modules of WEPS+.

  15. Lagrangian methods for blood damage estimation in cardiovascular devices--How numerical implementation affects the results.

    PubMed

    Marom, Gil; Bluestein, Danny

    2016-01-01

    This paper evaluated the influence of various numerical implementation assumptions on predicting blood damage in cardiovascular devices using Lagrangian methods with Eulerian computational fluid dynamics. The implementation assumptions that were tested included various seeding patterns, a stochastic walk model, and simplified trajectory calculations with pathlines. Post-processing implementation options that were evaluated included single-passage and repeated-passage stress accumulation and time averaging. This study demonstrated that the implementation assumptions can significantly affect the resulting stress accumulation, i.e., the blood damage model predictions. Careful consideration should be taken in the use of Lagrangian models. Ultimately, the appropriate assumptions should be chosen based on the physics of the specific case, and sensitivity analyses similar to the ones presented here should be employed.

  16. Analysis of environmental regulatory proposals: Its your chance to influence policy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veil, J.A.

    1994-03-02

    As part of the regulatory development process, the US Environmental Protection Agency (EPA) collects data, makes various assumptions about the data, and analyzes the data. Although EPA acts in good faith, the agency cannot always be aware of all relevant data, make only appropriate assumptions, and use applicable analytical methods. Regulated industries must carefully review every component of the regulatory decision-making process to identify misunderstandings and errors and to supply additional data that are relevant to the regulatory action. This paper examines three examples of how EPA's data, assumptions, and analytical methods have been critiqued. The first two examples involve EPA's cost-effectiveness (CE) analyses prepared for the offshore oil and gas effluent limitations guidelines and as part of EPA Region 6's general permit for coastal waters of Texas and Louisiana. A CE analysis compares the cost of regulations to the incremental amount of pollutants that would be removed by the recommended treatment processes. The third example, although not involving a CE analysis, demonstrates how the use of non-representative data can influence the outcome of an analysis.

  17. Demystifying Welfare: Its Feminization and Its Effect on Stakeholders

    ERIC Educational Resources Information Center

    Hartlep, Nicholas D.

    2008-01-01

    Welfare is misunderstood, mystified, and feminized by many stakeholders (i.e. government, media, majoritarian culture, etc.). This text analysis will assess how well the text achieved the following: (1) articulate why the current U.S. welfare state is based upon myths or false assumptions, (2) analyze what these false assumptions mean for…

  18. On Cognitive Constraints and Learning Progressions: The Case of "Structure of Matter"

    ERIC Educational Resources Information Center

    Talanquer, Vicente

    2009-01-01

    Based on the analysis of available research on students' alternative conceptions about the particulate nature of matter, we identified basic implicit assumptions that seem to constrain students' ideas and reasoning on this topic at various learning stages. Although many of these assumptions are interrelated, some of them seem to change or…

  19. Estimating the Organizational Cost of Sexual Assault in the U.S. Military

    DTIC Science & Technology

    2013-12-01

    [Extraction residue: report documentation form text and table-of-contents fragment; recoverable headings are "Assumptions and Data Analysis," "Overall Project Assumptions," "Overall Project Calculations," and "Calculations Using Data from FY 2012 WGRA."]

  20. An identifiable model for informative censoring

    USGS Publications Warehouse

    Link, W.A.; Wegman, E.J.; Gantz, D.T.; Miller, J.J.

    1988-01-01

    The usual model for censored survival analysis requires the assumption that censoring of observations arises only due to causes unrelated to the lifetime under consideration. It is easy to envision situations in which this assumption is unwarranted, and in which use of the Kaplan-Meier estimator and associated techniques will lead to unreliable analyses.
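
    For reference, the Kaplan-Meier estimator discussed above is the product-limit formula S(t) = prod over event times t_i <= t of (1 - d_i/n_i), whose validity rests on the non-informative censoring assumption. A minimal sketch on toy data:

```python
import numpy as np

# Minimal Kaplan-Meier (product-limit) estimator.
def kaplan_meier(times, events):
    """times: observed times; events: 1 = event observed, 0 = censored."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.lexsort((1 - events, times))  # at ties, events before censorings
    at_risk, surv, curve = len(times), 1.0, []
    for t, d in zip(times[order], events[order]):
        if d == 1:                 # an observed event reduces S(t)
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1               # event or censoring removes one subject
        curve.append((t, surv))
    return curve

# Toy data: observed times 2, 3, 3+, 5, 8+ (+ marks a censored observation).
print(kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0]))
```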

  1. Assumptions Underlying Curriculum Decisions in Australia: An American Perspective.

    ERIC Educational Resources Information Center

    Willis, George

    An analysis of the cultural and historical context in which curriculum decisions are made in Australia and a comparison with educational assumptions in the United States is the purpose of this paper. Methodology is based on personal teaching experience and observation in Australia. Seven factors are identified upon which curricular decisions in…

  2. A general method for handling missing binary outcome data in randomized controlled trials

    PubMed Central

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-01-01

    Aims The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. Design We propose a sensitivity analysis where standard analyses, which could include ‘missing = smoking’ and ‘last observation carried forward’, are embedded in a wider class of models. Setting We apply our general method to data from two smoking cessation trials. Participants A total of 489 and 1758 participants from two smoking cessation trials. Measurements The abstinence outcomes were obtained using telephone interviews. Findings The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. Conclusions A new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In two smoking cessation trials the results were insensitive to all but extreme assumptions. PMID:25171441
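
    The 'wider class of models' can be pictured as a one-parameter family of assumptions about the unobserved outcomes. A minimal sketch, with hypothetical counts rather than the trial data, in which p_miss = 0 recovers the standard 'missing = smoking' analysis:

```python
# Sensitivity analysis for a binary abstinence outcome with missing data:
# vary the assumed abstinence probability among non-responders.
def arm_rate(n_abstinent, n_smoking, n_missing, p_miss):
    n = n_abstinent + n_smoking + n_missing
    return (n_abstinent + p_miss * n_missing) / n

for p_miss in (0.0, 0.1, 0.2, 0.3):          # assumed abstinence if missing
    treat = arm_rate(60, 120, 64, p_miss)    # hypothetical treatment arm
    control = arm_rate(40, 150, 55, p_miss)  # hypothetical control arm
    print(f"p_miss={p_miss:.1f}: risk difference = {treat - control:+.3f}")
```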

  3. Alternatives for discounting in the analysis of noninferiority trials.

    PubMed

    Snapinn, Steven M

    2004-05-01

    Determining the efficacy of an experimental therapy relative to placebo on the basis of an active-control noninferiority trial requires reference to historical placebo-controlled trials. The validity of the resulting comparison depends on two key assumptions: assay sensitivity and constancy. Since the truth of these assumptions cannot be verified, it seems logical to raise the standard of evidence required to declare efficacy; this concept is referred to as discounting. It is not often recognized that two common design and analysis approaches, setting a noninferiority margin and requiring preservation of a fraction of the standard therapy's effect, are forms of discounting. The noninferiority margin is a particularly poor approach, since its degree of discounting depends on an irrelevant factor. Preservation of effect is more reasonable, but it addresses only the constancy assumption, not the issue of assay sensitivity. Gaining consensus on the most appropriate approach to the design and analysis of noninferiority trials will require a common understanding of the concept of discounting.

  4. Relative Velocity as a Metric for Probability of Collision Calculations

    NASA Technical Reports Server (NTRS)

    Frigm, Ryan Clayton; Rohrbaugh, Dave

    2008-01-01

    Collision risk assessment metrics, such as the probability of collision calculation, are based largely on assumptions about the interaction of two objects during their close approach. Specifically, the approach to probabilistic risk assessment can be performed more easily if the relative trajectories of the two close approach objects are assumed to be linear during the encounter. It is shown in this analysis that one factor in determining linearity is the relative velocity of the two encountering bodies, in that the assumption of linearity breaks down at low relative approach velocities. The first part of this analysis is the determination of the relative velocity threshold below which the assumption of linearity becomes invalid. The second part is a statistical study of conjunction interactions between representative asset spacecraft and the associated debris field environment to determine the likelihood of encountering a low relative velocity close approach. This analysis is performed for both the LEO and GEO orbit regimes. Both parts comment on the resulting effects to collision risk assessment operations.
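
    For context, under the linear relative-motion assumption the probability of collision reduces to a two-dimensional Gaussian integral over the combined hard-body circle in the encounter plane. A standard textbook form (generic symbols, not taken from this paper) is:

```latex
P_c = \frac{1}{2\pi\sigma_x\sigma_z}
      \iint_{x^2 + z^2 \le R^2}
      \exp\!\left[-\frac{(x-\mu_x)^2}{2\sigma_x^2}
                  -\frac{(z-\mu_z)^2}{2\sigma_z^2}\right]
      \mathrm{d}x\,\mathrm{d}z
```

    where R is the combined object radius, (μx, μz) the predicted miss position and σx, σz the combined position uncertainties in the encounter plane; at low relative velocity the encounter can no longer be collapsed onto a single plane, which is the breakdown studied here.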

  5. Techno-economic analysis of concentrated solar power plants in terms of levelized cost of electricity

    NASA Astrophysics Data System (ADS)

    Musi, Richard; Grange, Benjamin; Sgouridis, Sgouris; Guedez, Rafael; Armstrong, Peter; Slocum, Alexander; Calvet, Nicolas

    2017-06-01

    Levelized Cost of Electricity (LCOE) is an important metric which provides one way to compare the economic competitiveness of different electricity generation systems, calculated simply by dividing lifetime costs by lifetime production. Hidden behind the simplicity of this formula are various assumptions which may significantly alter results. Different LCOE studies exist in the literature, although their assumptions are rarely explicitly stated. This analysis gives all formulas and assumptions, which allows for inter-study comparisons. The results of this analysis indicate that CSP LCOE is falling markedly over time and that, given the right location and market conditions, the SunShot 6¢/kWh 2020 target can be reached. Increased industrial cooperation is needed to advance the CSP market and continue to drive down LCOE. The results also indicate that there exists a country- and technology-level learning effect, either when installing an existing CSP technology in a new country or when using a new technology in an existing CSP country, which seems to impact market progress.
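
    The 'lifetime costs divided by lifetime production' definition is normally applied in discounted form. A common formulation (the paper's exact conventions may differ) is:

```latex
\mathrm{LCOE} =
  \frac{\sum_{t=0}^{T} (I_t + O_t + F_t)\,(1+r)^{-t}}
       {\sum_{t=0}^{T} E_t\,(1+r)^{-t}}
```

    where I_t, O_t and F_t are the investment, operation-and-maintenance and fuel costs in year t, E_t the electricity generated, r the discount rate and T the plant lifetime; the discount rate and lifetime are precisely the kind of hidden assumptions that make inter-study comparisons difficult.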

  6. Single-cell lineage tracking analysis reveals that an established cell line comprises putative cancer stem cells and their heterogeneous progeny

    PubMed Central

    Sato, Sachiko; Rancourt, Ann; Sato, Yukiko; Satoh, Masahiko S.

    2016-01-01

    Mammalian cell culture has been used in many biological studies on the assumption that a cell line comprises putatively homogeneous clonal cells, thereby sharing similar phenotypic features. This fundamental assumption has not yet been fully tested; therefore, we developed a method for the chronological analysis of individual HeLa cells. The analysis was performed by live cell imaging, tracking of every single cell recorded on imaging videos, and determining the fates of individual cells. We found that cell fate varied significantly, indicating that, in contrast to the assumption, the HeLa cell line is composed of highly heterogeneous cells. Furthermore, our results reveal that only a limited number of cells are immortal and renew themselves, giving rise to the remaining cells. These cells have reduced reproductive ability, creating a functionally heterogeneous cell population. Hence, the HeLa cell line is maintained by the limited number of immortal cells, which could be putative cancer stem cells. PMID:27003384

  7. Environmentally friendly and cost-efficient analysis of aflatoxins in corn

    USDA-ARS?s Scientific Manuscript database

    The extraction procedure adds a significant cost to the overall expense of aflatoxin analysis in agricultural commodities. An inexpensive and low-waste extraction method using a household espresso coffee maker was tested. This appliance was used for the high-temperature/high-pressure extraction of ...

  8. Data Transmission Signal Design and Analysis

    NASA Technical Reports Server (NTRS)

    Moore, J. D.

    1972-01-01

    The error performances of several digital signaling methods are determined as a function of a specified signal-to-noise ratio. Results are obtained for Gaussian noise and impulse noise. Performance of a receiver for differentially encoded biphase signaling is obtained by extending the results of differential phase shift keying. The analysis presented obtains a closed-form answer through the use of some simplifying assumptions. The results give an insight into the analysis problem; however, the actual error performance may show a degradation because of the assumptions made in the analysis. Bipolar signaling decision-threshold selection is investigated. The optimum threshold depends on the signal-to-noise ratio and requires the use of an adaptive receiver.
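
    For binary DPSK over an additive white Gaussian noise channel, the standard closed-form bit error probability is:

```latex
P_b = \tfrac{1}{2}\, e^{-E_b/N_0}
```

    where E_b is the energy per bit and N_0 the one-sided noise power spectral density; the report's own closed-form answers, derived under its stated simplifying assumptions, may differ from this textbook result.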

  9. Constructing normality: a discourse analysis of the DSM-IV.

    PubMed

    Crowe, M

    2000-01-01

    The purpose of this research was to explore how the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV; American Psychiatric Association, 1994) defines mental disorder and the theoretical assumptions upon which this is based. The analysis examines how the current definition has been constructed and what the criteria for specific mental disorders suggest about what is regarded as normal. The method employed for the research was a critical discourse analysis. This critical approach to research is primarily concerned with analysis of the use of language and the reproduction of dominant belief systems in discourse. It involves systematic and repeated readings of the DSM-IV to examine what evidence was employed by the text to substantiate its definition of mental disorder and how, in the process, some assumptions are made about what constitutes normality. This study challenges a central assumption in the DSM-IV's definition: that it is a pattern or syndrome 'that occurs in an individual'. The proposal that it occurs in an individual implies that it is a consequence of faulty individual functioning. This effectively excludes the social and cultural context in which experiences occur and ignores the role of discourse in shaping subjectivity and social relations. This study proposes that the definition and criteria for mental disorder are based on assumptions about normal behaviour that relate to productivity, unity, moderation and rationality. The influence of this authoritative image of normality pervades many areas of social life and pathologizes experiences that could be regarded as responses to life events.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Lijuan; Gonder, Jeff; Burton, Evan

    This study evaluates the costs and benefits associated with the use of a plug-in hybrid electric bus and determines the cost effectiveness relative to a conventional bus and a hybrid electric bus. A sensitivity sweep analysis was performed over a number of different battery sizes, charging powers, and charging stations. The net present value was calculated for each vehicle design and provided the basis for the design evaluation. In all cases, given present-day economic assumptions, the conventional bus achieved the lowest net present value, while the optimal plug-in hybrid electric bus scenario reached lower lifetime costs than the hybrid electric bus. The study also performed parameter sensitivity analysis under low market potential assumptions and high market potential assumptions. The net present value of the plug-in hybrid electric bus is close to that of the conventional bus.
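
    A minimal sketch of the net-present-value comparison used as the basis of the evaluation; the purchase prices, annual operating costs, discount rate, and horizon below are illustrative assumptions, not values from the study:

```python
# Compare lifetime cost NPV of three bus designs (all figures hypothetical).
def npv(cash_flows, rate):
    """Discount a list of yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

rate, years = 0.05, 12
buses = {
    "conventional": [450_000] + [80_000] * years,
    "hybrid":       [650_000] + [60_000] * years,
    "plug-in":      [750_000] + [45_000] * years,  # includes charging costs
}
for name, flows in buses.items():
    print(f"{name:12s} lifetime cost NPV = ${npv(flows, rate):,.0f}")
```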

  11. A closure test for time-specific capture-recapture data

    USGS Publications Warehouse

    Stanley, T.R.; Burnham, K.P.

    1999-01-01

    The assumption of demographic closure in the analysis of capture-recapture data under closed-population models is of fundamental importance. Yet, little progress has been made in the development of omnibus tests of the closure assumption. We present a closure test for time-specific data that, in principle, tests the null hypothesis of closed-population model M(t) against the open-population Jolly-Seber model as a specific alternative. This test is chi-square, and can be decomposed into informative components that can be interpreted to determine the nature of closure violations. The test is most sensitive to permanent emigration and least sensitive to temporary emigration, and is of intermediate sensitivity to permanent or temporary immigration. This test is a versatile tool for testing the assumption of demographic closure in the analysis of capture-recapture data.

  12. When Sex and Power Collide: An Argument for Critical Sexuality Studies.

    PubMed

    Fahs, Breanne; McClelland, Sara I

    2016-01-01

    Attentive to the collision of sex and power, we add momentum to the ongoing development of the subfield of critical sexuality studies. We argue that this body of work is defined by its critical orientation toward the study of sexuality, along with a clear allegiance to critical modalities of thought, particularly feminist thought. Critical sexuality studies takes its cues from several other critical moments in related fields, including critical psychology, critical race theory, critical public health, and critical youth studies. Across these varied critical stances is a shared investment in examining how power and privilege operate, understanding the role of historical and epistemological violence in research, and generating new models and paradigms to guide empirical and theoretical research. With this guiding framework, we propose three central characteristics of critical sexuality studies: (a) conceptual analysis, with particular attention to how we define key terms and conceptually organize our research (e.g., attraction, sexually active, consent, agency, embodiment, sexual subjectivity); (b) attention to the material qualities of abject bodies, particularly bodies that are ignored, overlooked, or pushed out of bounds (e.g., viscous bodies, fat bodies, bodies in pain); and (c) heteronormativity and heterosexual privilege, particularly how assumptions about heterosexuality and heteronormativity circulate in sexuality research. Through these three critical practices, we argue that critical sexuality studies showcases how sex and power collide and recognizes (and tries to subvert) the various power imbalances that are deployed and replicated in sex research.

  13. Qualitative research and the epidemiological imagination: a vital relationship.

    PubMed

    Popay, J

    2003-01-01

    This paper takes as its starting point the assumption that the 'Epidemiological Imagination' has a central role to play in the future development of policies and practice to improve population health and reduce health inequalities within and between states, but suggests that by neglecting the contribution that qualitative research can make, epidemiology is failing to deliver this potential. The paper briefly considers what qualitative research is, touching on epistemological questions (what type of "knowledge" is generated) and questions of method (what approaches to data collection, analysis and interpretation are involved). Following this, the paper presents two different models of the relationship between qualitative and quantitative research. The enhancement model (which assumes that qualitative research findings add something extra to the findings of quantitative research) suggests three related "roles" for qualitative research: generating hypotheses to be tested by quantitative research, helping to construct more sophisticated measures of social phenomena, and explaining unexpected findings from quantitative research. In contrast, the epistemological model suggests that qualitative research is equal but different from quantitative research, making a unique contribution through: researching parts other research approaches cannot reach, increasing understanding by adding conceptual and theoretical depth to knowledge, shifting the balance of power between researchers and researched, and challenging traditional epidemiological ways of "knowing" the social world. The paper illustrates these different types of contributions with examples of qualitative research and finally discusses ways in which the "trustworthiness" of qualitative research can be assessed.

  14. Analysis of silicon on insulator (SOI) optical microring add-drop filter based on waveguide intersections

    NASA Astrophysics Data System (ADS)

    Kaźmierczak, Andrzej; Bogaerts, Wim; Van Thourhout, Dries; Drouard, Emmanuel; Rojo-Romeo, Pedro; Giannone, Domenico; Gaffiot, Frederic

    2008-04-01

    We present a compact passive optical add-drop filter which incorporates two microring resonators and a waveguide intersection in silicon-on-insulator (SOI) technology. Such a filter is a key element for designing simple layouts of highly integrated complex optical networks-on-chip. The filter occupies an area smaller than 10μm×10μm and exhibits relatively high quality factors (up to 4000) and efficient signal dropping capabilities. In the present work, the influence of filter parameters such as the microring-resonator radii and the coupling section shape is analyzed theoretically and experimentally.

  15. A five-year model to assess the early cost-effectiveness of new diagnostic tests in the early diagnosis of rheumatoid arthritis.

    PubMed

    Buisman, Leander R; Luime, Jolanda J; Oppe, Mark; Hazes, Johanna M W; Rutten-van Mölken, Maureen P M H

    2016-06-10

    There is a lack of information about the sensitivity, specificity and costs new diagnostic tests should have to improve early diagnosis of rheumatoid arthritis (RA). Our objective was to explore the early cost-effectiveness of various new diagnostic test strategies in the workup of patients with inflammatory arthritis (IA) at risk of having RA. A decision tree followed by a patient-level state transition model, using data from published literature, cohorts and trials, was used to evaluate diagnostic test strategies. Alternative tests were assessed as add-on to or replacement of the ACR/EULAR 2010 RA classification criteria for all patients and for intermediate-risk patients. Tests included B-cell gene expression (sensitivity 0.60, specificity 0.90, costs €150), MRI (sensitivity 0.90, specificity 0.60, costs €756), IL-6 serum level (sensitivity 0.70, specificity 0.53, costs €50) and genetic assay (sensitivity 0.40, specificity 0.85, costs €750). Patients with IA at risk of RA were followed for 5 years using a societal perspective. Guideline treatment was assumed using tight controlled treatment based on DAS28; if patients had a DAS28 >3.2 at 12 months or later patients could be eligible for starting biological drugs. The outcome was expressed in incremental cost-effectiveness ratios (€2014 per quality-adjusted life year (QALY) gained) and headroom. The B-cell test was the least expensive strategy when used as an add-on and as replacement in intermediate-risk patients, making it the dominant strategy, as it has better health outcomes and lower costs. As add-on for all patients, the B-cell test was also the most cost-effective test strategy. When using a willingness-to-pay threshold of €20,000 per QALY gained, the IL-6 and MRI strategies were not cost-effective, except as replacement. A genetic assay was not cost-effective in any strategy. Probabilistic sensitivity analysis revealed that the B-cell test was consistently superior in all strategies. When performing univariate sensitivity analysis for intermediate-risk patients, specificity and DAS28 in the B-cell add-on strategy, and DAS28 and sensitivity in the MRI add-on strategy had the largest impact on the cost-effectiveness. This early cost-effectiveness analysis indicated that new tests to diagnose RA are most likely to be cost-effective when the tests are used as an add-on in intermediate-risk patients, and have high specificity, and the test costs should not be higher than €200-€300.
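
    The headline outcome, the incremental cost-effectiveness ratio (ICER), is a one-line computation; a sketch with hypothetical costs and QALYs (not the study's values) is given below. A dominant strategy, one with better health outcomes at lower cost as reported for the B-cell test, needs no ICER at all:

```python
# Incremental cost-effectiveness ratio (ICER) of a new test strategy versus
# the usual workup; all costs (EUR) and QALYs are hypothetical placeholders.
def icer(cost_new, qaly_new, cost_old, qaly_old):
    return (cost_new - cost_old) / (qaly_new - qaly_old)

ratio = icer(cost_new=9_800, qaly_new=3.62, cost_old=9_500, qaly_old=3.55)
print(f"ICER = {ratio:,.0f} EUR per QALY gained")  # vs 20,000 EUR/QALY threshold
```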

  16. The Tides--A Neglected Topic.

    ERIC Educational Resources Information Center

    Hartel, Hermann

    2000-01-01

    Finds that computer simulations can be used to visualize the processes involved with lunar tides. Technology adds value, thus opening new paths for a more distinct analysis and increased learning results. (Author/CCM)

  17. Natural gas availability and ambient air quality in the Baton Rouge/New Orleans industrial complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fieler, E.R.; Harrison, D.P.

    1978-02-26

    Three scenarios were modeled for the Baton Rouge/New Orleans area for 1985: one assumes the substitution of residual oil (0.7% sulfur) for gas to decrease gas-burning stationary sources from 80 to 8% and the use of properly designed stacks for large emitters; the second makes identical gas supply assumptions but adds proper stack dispersion for medium as well as large emitters; and the third is based on 16% gas-burning stationary sources. The Climatological Dispersion Model was used to translate (1974) emission rates into ambient air concentrations. Growth rates for residential, commercial, and transportation sources, but not industry, were considered. The results show that proper policies, which would require not only tall stacks for large oil-burning units (and for intermediate units also in the areas of high industrial concentration), but also the careful location of new plants, would permit continued industrial expansion without severe air pollution problems.

  18. Bringing Darwin into the social sciences and the humanities: cultural evolution and its philosophical implications.

    PubMed

    Blancke, Stefaan; Denis, Gilles

    2018-04-10

    In the field of cultural evolution it is generally assumed that the study of culture and cultural change would benefit enormously from being informed by evolutionary thinking. Recently, however, there has been much debate about what this "being informed" means. According to the standard view, an interesting analogy obtains between cultural and biological evolution. In the literature, however, the analogy is interpreted and used in at least three distinct, but interrelated ways. We provide a taxonomy in order to clarify these different meanings. Subsequently, we discuss the alternative models of cultural attraction theory and memetics, which both challenge basic assumptions of the standard view. Finally, we briefly summarize the contributions to the special issue on Darwin in the Humanities and the Social Sciences, which is the result of a collaborative project between scholars and scientists from the universities of Lille and Ghent. Furthermore, we explain how they add to the discussions about the integration of evolutionary thinking and the study of culture.

  19. Retrospective estimation of the electric and magnetic field exposure conditions in in vitro experimental reports reveal considerable potential for uncertainty.

    PubMed

    Portelli, Lucas A; Falldorf, Karsten; Thuróczy, György; Cuppen, Jan

    2018-04-01

    Experiments on cell cultures exposed to extremely low frequency (ELF, 3-300 Hz) magnetic fields are often subject to multiple sources of uncertainty associated with specific electric and magnetic field exposure conditions. Here we systematically quantify these uncertainties based on exposure conditions described in a group of bioelectromagnetic experimental reports, for a representative sampling of the existing literature. The resulting uncertainties, stemming from insufficient, ambiguous, or erroneous description, design, implementation, or validation of the experimental methods and systems, were often substantial enough to potentially make any successful reproduction of the original experimental conditions difficult or impossible. Without making any assumption about the true biological relevance of ELF electric and magnetic fields, these findings suggest another contributing factor which may add to the overall variability and irreproducibility traditionally associated with experimental results of in vitro exposures to low-level ELF magnetic fields. Bioelectromagnetics. 39:231-243, 2018. © 2017 Wiley Periodicals, Inc.

  20. Quadratic obstructions to small-time local controllability for scalar-input systems

    NASA Astrophysics Data System (ADS)

    Beauchard, Karine; Marbach, Frédéric

    2018-03-01

    We consider nonlinear finite-dimensional scalar-input control systems in the vicinity of an equilibrium. When the linearized system is controllable, the nonlinear system is smoothly small-time locally controllable: for any m > 0 and T > 0, the state can reach a whole neighborhood of the equilibrium at time T with controls arbitrarily small in C^m-norm. When the linearized system is not controllable, we prove that either the state is constrained to live within a smooth strict manifold, up to a cubic residual, or the quadratic order adds a signed drift with respect to it. This drift holds along a Lie bracket of length (2k + 1), is quantified in terms of an H^{-k}-norm of the control, holds for controls small in W^{2k,∞}-norm, and these spaces are optimal. Our proof requires only C^3 regularity of the vector field. This work underlines the importance of the norm used in the smallness assumption on the control, even in finite dimension.

  1. Unraveling the principles of auditory cortical processing: can we learn from the visual system?

    PubMed Central

    King, Andrew J; Nelken, Israel

    2013-01-01

    Studies of auditory cortex are often driven by the assumption, derived from our better understanding of visual cortex, that basic physical properties of sounds are represented there before being used by higher-level areas for determining sound-source identity and location. However, we only have a limited appreciation of what the cortex adds to the extensive subcortical processing of auditory information, which can account for many perceptual abilities. This is partly because of the approaches that have dominated the study of auditory cortical processing to date, and future progress will unquestionably profit from the adoption of methods that have provided valuable insights into the neural basis of visual perception. At the same time, we propose that there are unique operating principles employed by the auditory cortex that relate largely to the simultaneous and sequential processing of previously derived features and that therefore need to be studied and understood in their own right. PMID:19471268

  2. Wireless Sensor Needs in the Space Shuttle and CEV Structures Communities

    NASA Technical Reports Server (NTRS)

    James, George H., III

    2007-01-01

    This presentation will clarify some of the structural measurement needs of NASA's Space Shuttle and Crew Exploration Vehicles. Emerging technologies in wireless sensor systems can be of some advantage in both Programs. The presentation will address how wireless instrumentation has helped in the past and what has gone unmeasured on Shuttle due to various limitations. Finally, it will address the needs of the CEV program that can be met with reliable wireless systems, if modular avionics interfaces are provided to accommodate the usual evolving needs of an ambitious space vehicle development program. Examples of the advantages of flight data to support flight certification engineering analyses and of areas where add-on wireless instrumentation can be used will be shown. Without flight instrumentation, it is necessary to retain the conservative assumptions used in the design process. It will be shown how the lessons learned on Space Shuttle for wired and wireless structural measurements apply to the Orion Crew Exploration Vehicle (CEV), which is currently being designed.

  3. Artificial Life Art, Creativity, and Techno-hybridization (editor's introduction).

    PubMed

    Dorin, Alan

    2015-01-01

    Artists and engineers have devised lifelike technology for millennia. Their ingenious devices have often prompted inquiry into our preferences, prejudices, and beliefs about living systems, especially regarding their origins, status, constitution, and behavior. A recurring fabrication technique is shared across artificial life art, science, and engineering. This involves aggregating representations or re-creations of familiar biological parts (techno-hybridization), but the motives of practitioners may differ markedly. This article, and the special issue it introduces, explores how ground familiar to contemporary artificial life science and engineering has been assessed and interpreted in parallel by (a) artists and (b) theorists studying creativity explicitly. This activity offers thoughtful, alternative perspectives on artificial life science and engineering, highlighting and sometimes undermining the fields' underlying assumptions, or exposing avenues that are yet to be explored outside of art. Additionally, art has the potential to engage the general public, supporting and exploring the findings of scientific research and engineering. This adds considerably to the maturity of a culture tackling the issues the discipline of artificial life raises.

  4. On the inner disc structure of MWC480: evidence for asymmetries?

    NASA Astrophysics Data System (ADS)

    Jamialahmadi, N.; Lopez, B.; Berio, Ph.; Matter, A.; Flament, S.; Fathivavsari, H.; Ratzka, T.; Sitko, M. L.; Spang, A.; Russell, R. W.

    2018-01-01

    Studying the physical conditions that structure young circumstellar discs is required for understanding the onset of planet formation. Of particular interest is the protoplanetary disc surrounding the Herbig star MWC480. The structure and properties of the circumstellar disc of MWC480 are studied by infrared interferometry and interpreted through a modelling approach. This study is driven by new observations, in particular recent Very Large Telescope Interferometer (VLTI)/MIDI data acquired in 2013 December. Our one-component disc model could not simultaneously reproduce all our data: the spectral energy distribution, the near-infrared Keck Interferometer data and the mid-infrared data obtained with the MIDI instrument. In order to explain all measurements, one possibility is to add an asymmetry to our one-component disc model, under the assumption that the structure of the disc of MWC480 has not varied with time. Several scenarios are tested, and the one considering the presence of an azimuthal bright feature in the inner component of the disc model provides a better fit to the data.

  5. A Prospective Examination of Perceived Burdensomeness and Thwarted Belongingness As Risk Factors for Suicide Ideation In Adult Outpatients Receiving Cognitive-Behavioral Therapy.

    PubMed

    Teismann, Tobias; Glaesmer, Heide; von Brachel, Ruth; Siegmann, Paula; Forkmann, Thomas

    2017-10-01

    The interpersonal-psychological theory of suicidal behavior posits that 2 proximal, causal, and interactive risk factors must be present for someone to desire suicide: perceived burdensomeness and thwarted belongingness. The purpose of the present study was to evaluate the predictive power of these 2 risk factors in a prospective study. A total of 231 adult outpatients (age: mean = 38.1, standard deviation = 12.3) undergoing cognitive-behavioral therapy took part in a pretreatment and a midtreatment assessment after the 10th therapy session. Perceived burdensomeness, thwarted belongingness, and the interaction between these 2 risk factors did not add incremental variance to the prediction of midtreatment suicide ideation after controlling for age, gender, depression, hopelessness, impulsivity, lifetime suicide attempts, and pretreatment suicide ideation. The best predictor of midtreatment suicide ideation was pretreatment suicide ideation. Results offer only limited support to the assumptions of the interpersonal theory of suicide. © 2017 Wiley Periodicals, Inc.
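
    The incremental-variance test reported above is, mechanically, a nested-model comparison: fit the model with the baseline covariates, add the two interpersonal risk factors and their interaction, and test the added block. A minimal sketch of that comparison, with hypothetical file and variable names, and ordinary least squares standing in for whatever regression family the authors used:

      # Nested-model (hierarchical regression) comparison; names are hypothetical.
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      df = pd.read_csv("outpatients.csv")  # hypothetical data file

      base = smf.ols(
          "si_mid ~ age + gender + depression + hopelessness + impulsivity"
          " + lifetime_attempts + si_pre", data=df).fit()
      full = smf.ols(
          "si_mid ~ age + gender + depression + hopelessness + impulsivity"
          " + lifetime_attempts + si_pre + burdensomeness * belongingness",
          data=df).fit()

      print(anova_lm(base, full))           # F-test for the added block
      print(full.rsquared - base.rsquared)  # incremental variance explained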

  6. Cost-benefit analysis of riparian protection in an eastern Canadian watershed.

    PubMed

    Trenholm, Ryan; Lantz, Van; Martínez-Espiñeira, Roberto; Little, Shawn

    2013-02-15

    Forested riparian buffers have proved to be an effective management practice that helps maintain ecological goods and services in watersheds. In this study, we assessed the non-market benefits and opportunity costs associated with implementing these buffers in an eastern Canadian watershed using contingent valuation and wood supply modeling methods, respectively. A number of buffer scenarios were considered, including 30 and 60 m buffers on woodlots and on all land (including woodlots, agricultural, and residential lands) in the watershed. Household annual willingness-to-pay (WTP) estimates ranged from -$6.80 to $42.85, and total present value benefits ranged from -$11.7 to $121.7 million (CDN 2007), depending on the buffer scenario, affected population, time horizon, and econometric modeling assumptions considered. Opportunity cost estimates ranged from $1.3 to $10.4 million in present value terms, depending on silvicultural and agricultural land rental rate assumptions. Overall, we found that the net present value of riparian buffers was positive for the majority of scenarios and assumptions. Some exceptions were found under more conservative benefit, and higher unit cost, assumptions. These results provide decision makers with data on stated benefits and opportunity costs of riparian buffers, as well as insight into the importance of modeling assumptions when using this framework of analysis. Copyright © 2012 Elsevier Ltd. All rights reserved.
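
    The framework described above reduces, computationally, to discounting benefit and cost flows to present values and comparing them. A minimal sketch with purely illustrative inputs (none of the numbers below are the study's):

      # Net-present-value comparison for a buffer scenario; inputs illustrative.
      def present_value(annual_flow, rate, years):
          """Discounted sum of a constant annual flow over the horizon."""
          return sum(annual_flow / (1 + rate) ** t for t in range(1, years + 1))

      households = 50_000        # hypothetical affected population
      wtp = 20.0                 # hypothetical annual WTP per household (CDN$)
      rate, horizon = 0.04, 30   # hypothetical discount rate and time horizon

      benefits = present_value(households * wtp, rate, horizon)
      opportunity_cost = 5_000_000  # hypothetical forgone rents, present value
      print(f"NPV of scenario: ${benefits - opportunity_cost:,.0f}")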

  7. The Constitution of Outdoor Education Groups: An Analysis of the Literature?

    ERIC Educational Resources Information Center

    Zink, Robyn

    2010-01-01

    Groups are ubiquitous in outdoor education and while there is a lot of literature on groups, there is limited examination of the assumptions made about groups and the effects these assumptions have on the practices of outdoor education. I utilise some of Michel Foucault's (1992) tools to investigate literature on outdoor education groups.…

  8. Consistent Tolerance Bounds for Statistical Distributions

    NASA Technical Reports Server (NTRS)

    Mezzacappa, M. A.

    1983-01-01

    Assumption that sample comes from population with particular distribution is made with confidence C if data lie between certain bounds. These "confidence bounds" depend on C and assumption about distribution of sampling errors around regression line. Graphical test criteria using tolerance bounds are applied in industry where statistical analysis influences product development and use. Applied to evaluate equipment life.

  9. Modeling the Structural Dynamic of Industrial Networks

    NASA Astrophysics Data System (ADS)

    Wilkinson, Ian F.; Wiley, James B.; Lin, Aizhong

    Market systems consist of locally interacting agents who continuously pursue advantageous opportunities. Since the time of Adam Smith, a fundamental task of economics has been to understand how market systems develop and to explain their operation. During the intervening years, theory largely has stressed comparative statics analysis. Based on the assumptions of rational, utility- or profit-maximizing agents and negative (diminishing returns) feedback processes, traditional economic analysis seeks to describe the (generally) unique state of an economy corresponding to an initial set of assumptions. The analysis is static in the sense that it does not describe the process by which an economy might get from one state to another.

  10. Lagrangian methods for blood damage estimation in cardiovascular devices - How numerical implementation affects the results

    PubMed Central

    Marom, Gil; Bluestein, Danny

    2016-01-01

    This paper evaluated the influence of various numerical implementation assumptions on predicting blood damage in cardiovascular devices using Lagrangian methods with Eulerian computational fluid dynamics. The implementation assumptions that were tested included various seeding patterns, a stochastic walk model, and simplified trajectory calculations with pathlines. Post-processing implementation options that were evaluated included single-passage and repeated-passage stress accumulation and time averaging. This study demonstrated that the implementation assumptions can significantly affect the resulting stress accumulation, i.e., the blood damage model predictions. Careful consideration should be taken in the use of Lagrangian models. Ultimately, the appropriate assumptions should be chosen based on the physics of the specific case, and sensitivity analyses similar to the ones presented here should be employed. PMID:26679833
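
    For readers unfamiliar with the post-processing step, stress accumulation along a single pathline is the simplest of the options compared above. A minimal sketch with hypothetical stress samples; the power-law exponents are placeholders, since published damage models differ in their constants and in how they accumulate damage:

      # Single-passage stress accumulation along one Lagrangian pathline.
      import numpy as np

      tau = np.array([1.2, 3.5, 8.0, 2.1])     # scalar stress samples [Pa], hypothetical
      dt = np.array([0.01, 0.01, 0.02, 0.01])  # time steps along trajectory [s]

      sa_linear = np.sum(tau * dt)             # linear accumulation [Pa s]
      a, b = 2.0, 0.8                          # hypothetical power-law exponents
      sa_power = np.sum(tau ** a * dt ** b)    # one simple power-law variant
      print(sa_linear, sa_power)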

  11. [Medical deontology--historical study].

    PubMed

    Wieckowska, Elzbieta

    2003-01-01

    The subject of the paper was to present selected publications concerning medical deontology, with special attention paid to three of them. Well-known texts, Hippocrates' oath (5th/4th century BC), Maimonides' prayer (12th century), and the Polish medical deontology code published in 1994, underwent a comparative analysis. The objective of the analysis was to describe the similarities and differences in the assumptions constituting the fundamentals of medical deontology, formulated at intervals of almost one thousand years, and to compare the assumptions of Polish and universal medical deontology.

  12. Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis; Gold, Dara

    2013-01-01

    We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.
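
    A generic Wald SPRT has the skeleton below. The abstract states that the likelihood ratio reduces to a simple form involving the current and prior-based collision-probability estimates; the exact reduction is in the paper, so the ratio used here is a schematic stand-in, not the authors' expression:

      # Generic Wald sequential probability ratio test (schematic).
      import math

      alpha, beta = 0.01, 0.05          # false-alarm and missed-detection rates
      A = math.log((1 - beta) / alpha)  # upper (accept-H1) threshold
      B = math.log(beta / (1 - alpha))  # lower (accept-H0) threshold

      def sprt_step(log_lr_sum, pc_hat, pc_prior):
          """Accumulate the log likelihood ratio, then test Wald's thresholds."""
          log_lr_sum += math.log(pc_hat / pc_prior)  # stand-in likelihood ratio
          if log_lr_sum >= A:
              return log_lr_sum, "alarm"     # evidence favors elevated risk
          if log_lr_sum <= B:
              return log_lr_sum, "dismiss"   # evidence favors the prior model
          return log_lr_sum, "continue"      # keep testing as new data arrive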

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simonen, E.P.; Johnson, K.I.; Simonen, F.A.

    The Vessel Integrity Simulation Analysis (VISA-II) code was developed to allow calculations of the failure probability of a reactor pressure vessel subject to defined pressure/temperature transients. A version of the code, revised by Pacific Northwest Laboratory for the US Nuclear Regulatory Commission, was used to evaluate the sensitivities of calculated through-wall flaw probability to material, flaw and calculational assumptions. Probabilities were more sensitive to flaw assumptions than to material or calculational assumptions. Alternative flaw assumptions changed the probabilities by one to two orders of magnitude, whereas alternative material assumptions typically changed the probabilities by a factor of two or less. The flaw sensitivities examined included flaw shape, through-wall position and inspection. Material property sensitivities included the assumed distributions in copper content and fracture toughness. Methods of modeling flaw propagation that were evaluated included arrest/reinitiation toughness correlations, multiple toughness values along the length of a flaw, flaw jump distance for each computer simulation and added error in estimating irradiated properties caused by the trend curve correlation error.

  14. Experiments on Nucleation in Different Flow Regimes

    NASA Technical Reports Server (NTRS)

    Bayuzick, R. J.; Hofmeister, W. H.; Morton, C. M.; Robinson, M. B.

    1999-01-01

    The vast majority of metallic engineering materials are solidified from the liquid phase. Understanding the solidification process is essential to control microstructure, which in turn, determines the properties of materials. The genesis of solidification is nucleation, where the first stable solid forms from the liquid phase. Nucleation kinetics determine the degree of undercooling and phase selection. As such, it is important to understand nucleation phenomena in order to control solidification or glass formation in metals and alloys. Early experiments in nucleation kinetics were accomplished by droplet dispersion methods. Dilatometry was used by Turnbull and others, and more recently differential thermal analysis and differential scanning calorimetry have been used for kinetic studies. These techniques have enjoyed success; however, there are difficulties with these experiments. Since materials are dispersed in a medium, the character of the emulsion/metal interface affects the nucleation behavior. Statistics are derived from the large number of particles observed in a single experiment, but dispersions have a finite size distribution which adds to the uncertainty of the kinetic determinations. Even though temperature can be controlled quite well before the onset of nucleation, the release of the latent heat of fusion during nucleation of particles complicates the assumption of isothermality during these experiments. Containerless processing has enabled another approach to the study of nucleation kinetics. With levitation techniques it is possible to undercool one sample to nucleation repeatedly in a controlled manner, such that the statistics of the nucleation process can be derived from multiple experiments on a single sample. The authors have fully developed the analysis of nucleation experiments on single samples following the suggestions of Skripov. The advantage of these experiments is that the samples are directly observable. The nucleation temperature can be measured by noncontact optical pyrometry, the mass of the sample is known, and post processing analysis can be conducted on the sample. The disadvantages are that temperature measurement must have exceptionally high precision, and it is not possible to isolate specific heterogeneous sites as in droplet dispersions.

  15. Invited Commentary: Agent-Based Models-Bias in the Face of Discovery.

    PubMed

    Keyes, Katherine M; Tracy, Melissa; Mooney, Stephen J; Shev, Aaron; Cerdá, Magdalena

    2017-07-15

    Agent-based models (ABMs) have grown in popularity in epidemiologic applications, but the assumptions necessary for valid inference have only partially been articulated. In this issue, Murray et al. (Am J Epidemiol. 2017;186(2):131-142) provided a much-needed analysis of the consequence of some of these assumptions, comparing analysis using an ABM to a similar analysis using the parametric g-formula. In particular, their work focused on the biases that can arise in ABMs that use parameters drawn from distinct populations whose causal structures and baseline outcome risks differ. This demonstration of the quantitative issues that arise in transporting effects between populations has implications not only for ABMs but for all epidemiologic applications, because making use of epidemiologic results requires application beyond a study sample. Broadly, because health arises within complex, dynamic, and hierarchical systems, many research questions cannot be answered statistically without strong assumptions. It will require every tool in our store of methods to properly understand population dynamics if we wish to build an evidence base that is adequate for action. Murray et al.'s results provide insight into these assumptions that epidemiologists can use when selecting a modeling approach. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Antibiotics: MedlinePlus Health Topic

    MedlinePlus

    ... or not using them properly, can add to antibiotic resistance. This happens when bacteria change and become able ... [MedlinePlus topic page; related health topics: Antibiotic Resistance, Bacterial Infections, Medicines.]

  17. Authoritarian Parenting and Asian Adolescent School Performance: Insights from the US and Taiwan

    PubMed Central

    Pong, Suet-ling; Johnston, Jamie; Chen, Vivien

    2014-01-01

    Our study re-examines the relationship between parenting and school performance among Asian students. We use two sources of data: wave I of the Adolescent Health Longitudinal Survey (Add Health), and waves I and II of the Taiwan Educational Panel Survey (TEPS). Analysis using Add Health reveals that the Asian-American/European-American difference in the parenting–school performance relationship is due largely to differential sample sizes. When we select a random sample of European-American students comparable to the sample size of Asian-American students, authoritarian parenting also shows no effect for European-American students. Furthermore, analysis of TEPS shows that authoritarian parenting is negatively associated with children's school achievement, while authoritative parenting is positively associated. This result for Taiwanese Chinese students is similar to previous results for European-American students in the US. PMID:24850978

  18. Authoritarian Parenting and Asian Adolescent School Performance: Insights from the US and Taiwan.

    PubMed

    Pong, Suet-Ling; Johnston, Jamie; Chen, Vivien

    2010-01-01

    Our study re-examines the relationship between parenting and school performance among Asian students. We use two sources of data: wave I of the Adolescent Health Longitudinal Survey (Add Health), and waves I and II of the Taiwan Educational Panel Survey (TEPS). Analysis using Add Health reveals that the Asian-American/European-American difference in the parenting-school performance relationship is due largely to differential sample sizes. When we select a random sample of European-American students comparable to the sample size of Asian-American students, authoritarian parenting also shows no effect for European-American students. Furthermore, analysis of TEPS shows that authoritarian parenting is negatively associated with children's school achievement, while authoritative parenting is positively associated. This result for Taiwanese Chinese students is similar to previous results for European-American students in the US.

  19. Confirmatory analysis of 17beta-boldenone, 17alpha-boldenone and androsta-1,4-diene-3,17-dione in bovine urine by liquid chromatography-tandem mass spectrometry.

    PubMed

    Draisci, Rosa; Palleschi, Luca; Ferretti, Emanuele; Lucentini, Luca; delli Quadri, Fernanda

    2003-06-15

    A sensitive and selective liquid chromatography-tandem mass spectrometry (LC-MS-MS) method for confirmatory analysis of 17beta-boldenone (17beta-BOL), 17alpha-boldenone (17alpha-BOL) and androsta-1,4-diene-3,17-dione (ADD) in bovine urine was developed. [2H(2)]17beta-Testosterone (17beta-T-d(2)) was used as the internal standard. Sample preparation involved enzymatic hydrolysis and purification on a C(18) solid-phase extraction column. Chromatographic separation of the analytes was obtained using an RP-C(18) HPLC column. LC-MS-MS detection was carried out with an atmospheric pressure chemical ionisation (APCI) source equipped with a heated nebulizer (HN) interface operating in the positive ion mode. For unambiguous hormone confirmation, three analyte precursor-product ion combinations were monitored during multiple-reaction monitoring (MRM) LC-MS-MS analysis. Overall recovery (%), repeatability (relative standard deviations, RSD, %) and within-laboratory reproducibility (RSD, %) ranged from 92.2 to 97.7%, from 6.50 to 2.94% and from 13.50 to 5.04%, respectively, for all analytes. The limit of quantification in bovine urine was 0.20 ng ml(-1) for 17beta-BOL and ADD and 0.50 ng ml(-1) for 17alpha-BOL. The validated method was successfully applied for determination of 17beta-BOL, 17alpha-BOL and ADD in a large number of bovine urine samples collected within the national Official Residue Control Program.

  20. Understanding Business Interests in International Large-Scale Student Assessments: A Media Analysis of "The Economist," "Financial Times," and "Wall Street Journal"

    ERIC Educational Resources Information Center

    Steiner-Khamsi, Gita; Appleton, Margaret; Vellani, Shezleen

    2018-01-01

    The media analysis is situated in the larger body of studies that explore the varied reasons why different policy actors advocate for international large-scale student assessments (ILSAs) and adds to the research on the fast advance of the global education industry. The analysis of "The Economist," "Financial Times," and…

  1. Linkage analysis of systolic blood pressure: a score statistic and computer implementation

    PubMed Central

    Wang, Kai; Peng, Yingwei

    2003-01-01

    A genome-wide linkage analysis was conducted on systolic blood pressure using a score statistic. The randomly selected Replicate 34 of the simulated data was used. The score statistic was applied to the sibships derived from the general pedigrees. An add-on R program to GENEHUNTER was developed for this analysis and is freely available. PMID:14975145

  2. Innovation or 'Inventions'? The conflict between latent assumptions in marine aquaculture and local fishery.

    PubMed

    Martínez-Novo, Rodrigo; Lizcano, Emmánuel; Herrera-Racionero, Paloma; Miret-Pastor, Lluís

    2018-02-01

    Recent European policy highlights the need to promote local fishery and aquaculture by means of innovation and joint participation in fishery management as one of the keys to achieve the sustainability of our seas. However, the implicit assumptions held by the actors in the two main groups involved - innovators (scientists, businessmen and administration managers) and local fishermen - can complicate, perhaps even render impossible, mutual understanding and co-operation. A qualitative analysis of interviews with members of both groups in the Valencian Community (Spain) reveals those latent assumptions and their impact on the respective practices. The analysis shows that the innovation narrative in which one group is based and the inventions narrative used by the other one are rooted in two dramatically different, or even antagonistic, collective worldviews. Any environmental policy that implies these groups should take into account these strong discords.

  3. Comparing Indirect Effects in Different Groups in Single-Group and Multi-Group Structural Equation Models

    PubMed Central

    Ryu, Ehri; Cheong, Jeewon

    2017-01-01

    In this article, we evaluated the performance of statistical methods in single-group and multi-group analysis approaches for testing group difference in indirect effects and for testing simple indirect effects in each group. We also investigated whether the performance of the methods in the single-group approach was affected when the assumption of equal variance was not satisfied. The assumption was critical for the performance of the two methods in the single-group analysis: the method using a product term for testing the group difference in a single path coefficient, and the Wald test for testing the group difference in the indirect effect. Bootstrap confidence intervals in the single-group approach and all methods in the multi-group approach were not affected by the violation of the assumption. We compared the performance of the methods and provided recommendations. PMID:28553248
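
    As context for the bootstrap confidence intervals mentioned above, here is a minimal percentile-bootstrap sketch for the group difference in indirect effects, assuming simple X -> M -> Y mediation in each group; column names are hypothetical:

      # Percentile bootstrap for a1*b1 - a2*b2 across two groups.
      import numpy as np
      import pandas as pd

      def indirect(df):
          a = np.polyfit(df["x"], df["m"], 1)[0]  # a path: M on X
          design = np.column_stack([df["m"], df["x"], np.ones(len(df))])
          b = np.linalg.lstsq(design, df["y"], rcond=None)[0][0]  # b path: Y on M, X
          return a * b

      def boot_diff(g1, g2, n_boot=5000, seed=0):
          rng = np.random.default_rng(seed)
          diffs = []
          for _ in range(n_boot):
              s1 = g1.iloc[rng.integers(0, len(g1), len(g1))]  # resample group 1
              s2 = g2.iloc[rng.integers(0, len(g2), len(g2))]  # resample group 2
              diffs.append(indirect(s1) - indirect(s2))
          return np.percentile(diffs, [2.5, 97.5])  # 95% CI; excluding 0 => difference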

  4. A Backscattering Enhanced Microwave Canopy Scattering Model Based On MIMICS

    NASA Astrophysics Data System (ADS)

    Shen, X.; Hong, Y.; Qin, Q.; Chen, S.; Grout, T.

    2010-12-01

    For modeling microwave scattering of vegetated areas, several microwave canopy scattering models, based on the vectorized radiative transfer equation (VRT) and using different solving techniques, have been proposed in the past three decades. As an iterative solution of the VRT at low orders, the Michigan Microwave Canopy Scattering Model (MIMICS) gives an analytical expression for calculating scattering as long as the volume scattering is not too strong. The most important use of such models is to predict scattering in the backscattering direction. Unfortunately, the simplifying assumption of MIMICS is that the scattering between the ground and trunk layers includes only specular reflection. As a result, MIMICS includes a dominant coherent term which vanishes in the backscattering direction, because this term contains a delta-function factor that is zero in this direction. This assumption needs reconsideration for accurately calculating the backscattering. In the framework of MIMICS, any incoherent terms that involve surface scattering factors must undergo surface scattering at least twice and volume scattering once. Therefore, these incoherent terms are usually very weak. On the other hand, due to the phenomenon of backscattering enhancement, the surface scattering in the backscattering direction is very strong compared to most other directions. Considering the facts discussed above, it is reasonable to add a surface backscattering term to the last equation of the boundary conditions of MIMICS. More terms appear in the final result, including a backscattering coherent term which enhances the backscattering. The modified model is compared with the original MIMICS (version 1.0) using JPL/AIRSAR data from the NASA Soil Moisture Experiment 2003 (SMEX03) campaign and Washita92. Significant improvement is observed.

  5. Reporting the national antimicrobial consumption in Danish pigs: influence of assigned daily dosage values and population measurement.

    PubMed

    Dupont, Nana; Fertner, Mette; Kristensen, Charlotte Sonne; Toft, Nils; Stege, Helle

    2016-05-03

    Transparent calculation methods are crucial when investigating trends in antimicrobial consumption over time and between populations. Until 2011, one single standardized method was applied when quantifying the Danish pig antimicrobial consumption with the unit "Animal Daily Dose" (ADD). However, two new methods for assigning values for ADDs have recently emerged, one implemented by DANMAP, responsible for publishing annual reports on antimicrobial consumption, and one by the Danish Veterinary and Food Administration (DVFA), responsible for the Yellow Card initiative. In addition to new ADD assignment methods, Denmark has also experienced a shift in the production pattern, towards a larger export of live pigs. The aims of this paper were to (1) describe previous and current ADD assignment methods used by the major Danish institutions and (2) illustrate how ADD assignment method and choice of population and population measurement affect the calculated national antimicrobial consumption in pigs (2007-2013). The old VetStat ADD-values were based on SPCs, in contrast to the new ADD-values, which were based on active compound, concentration and administration route. The new ADD-values stated by DANMAP and DVFA were identical for only 48 % of antimicrobial products approved for use in pigs. From 2007 to 2013, the total number of ADDs per year increased by 9 % when using the new DVFA ADD-values, but decreased by 2 and 7 % when using the new DANMAP ADD-values or the old VetStat ADD-values, respectively. Over the same period, the production of pigs increased from 26.1 million pigs per year with 18 % exported live to 28.7 million with 34 % exported live. In the same time span, the annual pig antimicrobial consumption increased by 22.2 %, when calculated using the new DVFA ADD-values and pigs slaughtered per year as population measurement (13.0 ADDs/pig/year to 15.9 ADDs/pig/year). However, when based on the old VetStat ADD-values and pigs produced per year (including live export), a 10.9 % decrease was seen (10.6 ADDs/pig/year to 9.4 ADDs/pig/year). The findings of this paper clearly highlight that calculated national antimicrobial consumption is highly affected by the chosen population measurement and the applied ADD-values.
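
    The population-measurement effect described above is plain arithmetic: the same national consumption divided by different denominators yields different ADDs per pig per year. A sketch using the abstract's 2013 production figures and an assumed consumption total (the total is illustrative, not the paper's):

      # ADDs per pig per year under two population measurements.
      total_adds = 300_000_000    # assumed national consumption (ADDs/year)
      pigs_produced = 28_700_000  # produced per year, incl. live export (2013)
      share_exported_live = 0.34
      pigs_slaughtered = pigs_produced * (1 - share_exported_live)

      print(total_adds / pigs_slaughtered)  # higher: smaller denominator
      print(total_adds / pigs_produced)     # lower: exported pigs included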

  6. The competing risks Cox model with auxiliary case covariates under weaker missing-at-random cause of failure.

    PubMed

    Nevo, Daniel; Nishihara, Reiko; Ogino, Shuji; Wang, Molin

    2017-08-04

    In the analysis of time-to-event data with multiple causes using a competing risks Cox model, often the cause of failure is unknown for some of the cases. The probability of a missing cause is typically assumed to be independent of the cause given the time of the event and covariates measured before the event occurred. In practice, however, the underlying missing-at-random assumption does not necessarily hold. Motivated by colorectal cancer molecular pathological epidemiology analysis, we develop a method to conduct valid analysis when additional auxiliary variables are available for cases only. We consider a weaker missing-at-random assumption, with missing pattern depending on the observed quantities, which include the auxiliary covariates. We use an informative likelihood approach that will yield consistent estimates even when the underlying model for missing cause of failure is misspecified. The superiority of our method over naive methods in finite samples is demonstrated by simulation study results. We illustrate the use of our method in an analysis of colorectal cancer data from the Nurses' Health Study cohort, where, apparently, the traditional missing-at-random assumption fails to hold.

  7. Comparison of Factor Simplicity Indices for Dichotomous Data: DETECT R, Bentler's Simplicity Index, and the Loading Simplicity Index

    ERIC Educational Resources Information Center

    Finch, Holmes; Stage, Alan Kirk; Monahan, Patrick

    2008-01-01

    A primary assumption underlying several of the common methods for modeling item response data is unidimensionality, that is, test items tap into only one latent trait. This assumption can be assessed several ways, using nonlinear factor analysis and DETECT, a method based on the item conditional covariances. When multidimensionality is identified,…

  8. A Signal-Detection Analysis of Fast-and-Frugal Trees

    ERIC Educational Resources Information Center

    Luan, Shenghua; Schooler, Lael J.; Gigerenzer, Gerd

    2011-01-01

    Models of decision making are distinguished by those that aim for an optimal solution in a world that is precisely specified by a set of assumptions (a so-called "small world") and those that aim for a simple but satisfactory solution in an uncertain world where the assumptions of optimization models may not be met (a so-called "large world"). Few…

  9. Empirical Benchmarks of Hidden Bias in Educational Research: Implication for Assessing How well Propensity Score Methods Approximate Experiments and Conducting Sensitivity Analysis

    ERIC Educational Resources Information Center

    Dong, Nianbo; Lipsey, Mark

    2014-01-01

    When randomized control trials (RCT) are not feasible, researchers seek other methods to make causal inference, e.g., propensity score methods. One of the underlying assumptions for the propensity score methods to obtain unbiased treatment effect estimates is the ignorability assumption, that is, conditional on the propensity score, treatment…

  10. Why Bother about Writing a Masters Dissertation? Assumptions of Faculty and Masters Students in an Iranian Setting

    ERIC Educational Resources Information Center

    Hasrati, Mostafa

    2013-01-01

    This article reports the results of a mixed methodology analysis of the assumptions of academic staff and Masters students in an Iranian university regarding various aspects of the assessment of the Masters degree thesis, including the main objective for writing the thesis, the role of the students, supervisors and advisors in writing the…

  11. Identifying gaps in conservation networks: of indicators and uncertainty in geographic-based analyses

    Treesearch

    Curtis H. Flather; Kenneth R. Wilson; Denis J. Dean; William C. McComb

    1997-01-01

    Mapping of biodiversity elements to expose gaps in conservation networks has become a common strategy in nature-reserve design. We review a set of critical assumptions and issues that influence the interpretation and implementation of gap analysis, including: (1) the assumption that a subset of taxa can be used to indicate overall diversity patterns, and (2) the...

  12. A general method for handling missing binary outcome data in randomized controlled trials.

    PubMed

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-12-01

    The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. We propose a sensitivity analysis where standard analyses, which could include 'missing = smoking' and 'last observation carried forward', are embedded in a wider class of models. We apply our general method to data from two smoking cessation trials, with a total of 489 and 1758 participants; the abstinence outcomes were obtained using telephone interviews. The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. A new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In two smoking cessation trials the results were insensitive to all but extreme assumptions. © 2014 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
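
    One simple way to embed 'missing = smoking' in a wider class of assumptions, in the spirit of the sensitivity analysis above (a simplified stand-in for the authors' richer model class):

      # Risk difference as a function of the assumed abstinence rate of dropouts.
      def imputed_rate(n_abstinent, n_missing, n_total, delta):
          """Abstinence rate if missing participants are abstinent w.p. delta."""
          return (n_abstinent + delta * n_missing) / n_total

      treat = dict(n_abstinent=60, n_missing=45, n_total=245)  # hypothetical arm
      ctrl = dict(n_abstinent=40, n_missing=35, n_total=245)   # hypothetical arm

      for delta in (0.0, 0.1, 0.25, 0.5):  # delta = 0 is 'missing = smoking'
          rd = imputed_rate(delta=delta, **treat) - imputed_rate(delta=delta, **ctrl)
          print(f"delta={delta:.2f}  risk difference={rd:+.3f}")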

  13. Voice-onset time and buzz-onset time identification: A ROC analysis

    NASA Astrophysics Data System (ADS)

    Lopez-Bascuas, Luis E.; Rosner, Burton S.; Garcia-Albea, Jose E.

    2004-05-01

    Previous studies have employed signal detection theory to analyze data from speech and nonspeech experiments. Typically, signal distributions were assumed to be Gaussian. Schouten and van Hessen [J. Acoust. Soc. Am. 104, 2980-2990 (1998)] explicitly tested this assumption for an intensity continuum and a speech continuum. They measured response distributions directly and, assuming an interval scale, concluded that the Gaussian assumption held for both continua. However, Pastore and Macmillan [J. Acoust. Soc. Am. 111, 2432 (2002)] applied ROC analysis to Schouten and van Hessen's data, assuming only an ordinal scale. Their ROC curves supported the Gaussian assumption for the nonspeech signals only. Previously, Lopez-Bascuas [Proc. Audit. Bas. Speech Percept., 158-161 (1997)] found evidence with a rating scale procedure that the Gaussian model was inadequate for a voice-onset time continuum but not for a noise-buzz continuum. Both continua contained ten stimuli with asynchronies ranging from -35 ms to +55 ms. ROC curves (double-probability plots) are now reported for each pair of adjacent stimuli on the two continua. Both speech and nonspeech ROCs often appeared nonlinear, indicating non-Gaussian signal distributions under the usual zero-variance assumption for response criteria.
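
    The Gaussian assumption in such analyses is commonly checked on double-probability (zROC) coordinates, where Gaussian signal distributions imply a straight line. A minimal sketch with hypothetical cumulative rating proportions:

      # zROC linearity check: a straight line is consistent with Gaussian signals.
      import numpy as np
      from scipy.stats import norm

      hits = np.array([0.10, 0.35, 0.60, 0.85])  # cumulative rates, stimulus A
      fas = np.array([0.05, 0.20, 0.45, 0.70])   # cumulative rates, stimulus B

      z_h, z_f = norm.ppf(hits), norm.ppf(fas)   # inverse-normal transform
      slope, intercept = np.polyfit(z_f, z_h, 1)
      resid = z_h - (slope * z_f + intercept)
      print(slope, intercept, np.max(np.abs(resid)))  # large residuals: non-Gaussian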

  14. Influence of 2D Finite Element Modeling Assumptions on Debonding Prediction for Composite Skin-stiffener Specimens Subjected to Tension and Bending

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; Minguet, Pierre J.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The influence of two-dimensional finite element modeling assumptions on the debonding prediction for skin-stiffener specimens was investigated. Geometrically nonlinear finite element analyses using two-dimensional plane-stress and plane strain elements as well as three different generalized plane strain type approaches were performed. The computed deflections, skin and flange strains, transverse tensile stresses and energy release rates were compared to results obtained from three-dimensional simulations. The study showed that for strains and energy release rate computations the generalized plane strain assumptions yielded results closest to the full three-dimensional analysis. For computed transverse tensile stresses the plane stress assumption gave the best agreement. Based on this study it is recommended that results from plane stress and plane strain models be used as upper and lower bounds. The results from generalized plane strain models fall between the results obtained from plane stress and plane strain models. Two-dimensional models may also be used to qualitatively evaluate the stress distribution in a ply and the variation of energy release rates and mixed mode ratios with lamination length. For more accurate predictions, however, a three-dimensional analysis is required.

  15. Statistical Issues for Calculating Reentry Hazards

    NASA Technical Reports Server (NTRS)

    Matney, Mark; Bacon, John

    2016-01-01

    A number of statistical tools have been developed over the years for assessing the risk of reentering objects to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. This information, combined with information on the expected ground path of the reentry, is used to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of this analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper builds on previous IAASS work to re-examine many of these theoretical assumptions, including the mathematical basis for the hazard calculations, and to outline the conditions under which the simplifying assumptions hold. This study also employs empirical and theoretical information to test these assumptions, and makes recommendations on how to improve the accuracy of these calculations in the future.
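
    The bookkeeping such tools rest on is a casualty expectation summed over surviving fragments, E_c = sum_i rho_i * A_i, with rho_i the average population density under fragment i's footprint and A_i its casualty area; treating impacts as Poisson gives P(>=1 casualty) = 1 - exp(-E_c). A sketch with illustrative numbers; the paper's point is precisely to test when assumptions of this kind hold:

      # Casualty expectation for a set of surviving fragments (illustrative).
      import math

      fragments = [
          # (population density [people/m^2], casualty area [m^2])
          (1e-5, 8.0),
          (2e-5, 0.8),
          (5e-6, 12.0),
      ]

      E_c = sum(rho * area for rho, area in fragments)
      p_casualty = 1 - math.exp(-E_c)  # Poisson assumption
      print(f"E_c = {E_c:.2e}, P(>=1 casualty) = {p_casualty:.2e}")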

  16. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    PubMed Central

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-01-01

    Background: Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings: The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Conclusion: Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and converts them into other formats. It is free software. PMID:19852806
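
    The add-in itself is written in VBA; as a stand-in illustration of the kind of basic genetic analysis such tools provide, here is an allele-frequency and Hardy-Weinberg equilibrium check from genotype counts (counts hypothetical):

      # Allele frequency and Hardy-Weinberg chi-square from genotype counts.
      from scipy.stats import chi2

      def hwe_test(n_AA, n_Aa, n_aa):
          n = n_AA + n_Aa + n_aa
          p = (2 * n_AA + n_Aa) / (2 * n)      # frequency of allele A
          q = 1 - p
          expected = [p * p * n, 2 * p * q * n, q * q * n]
          stat = sum((o - e) ** 2 / e
                     for o, e in zip([n_AA, n_Aa, n_aa], expected))
          return p, stat, chi2.sf(stat, df=1)  # df = 3 classes - 1 - 1 estimated freq

      print(hwe_test(298, 489, 213))           # hypothetical genotype counts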

  17. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.

    PubMed

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-10-23

    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and converts them into other formats. It is free software.

  18. Cosmic equation of state from combined angular diameter distances: Does the tension with luminosity distances exist?

    NASA Astrophysics Data System (ADS)

    Cao, Shuo; Zhu, Zong-Hong

    2014-10-01

    Using relatively complete observational data concerning four angular diameter distance (ADD) measurements and combined SN+GRB observations representing current luminosity distance (LD) data, this paper investigates the compatibility of these two cosmological distances considering three classes of dark energy equation of state (EoS) reconstruction. In particular, we use strongly gravitationally lensed systems from various large systematic gravitational lens surveys and galaxy clusters, which yield the Hubble-constant-independent ratio between two angular diameter distances, D_ls/D_s. Our results demonstrate that, with more general categories of standard ruler data, ADD and LD data are compatible at the 1σ level. Second, we note that consistency between ADD and LD data is maintained irrespective of the EoS parametrizations: there is a good match between the universally explored Chevallier-Polarski-Linder model and other formulations of cosmic equation of state. Especially for the truncated generalized equation of state (GEoS) model with β = -2, the conclusions obtained with ADD and LD are almost the same. Finally, statistical analysis of the generalized dark energy equation of state performed on four classes of ADD data provides stringent constraints on the EoS parameters w0, wβ, and β, which suggest that dark energy was a subdominant component at early times. Moreover, the GEoS parametrization with β ≃ 1 seems to be a more favorable two-parameter model to characterize the cosmic equation of state, because the combined angular diameter distance data (SGL+CBF+BAO+WMAP9) provide the best-fit value β = 0.751 (+0.465/-0.480).
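
    For reference, the Chevallier-Polarski-Linder parametrization named above has the standard form below; the truncated GEoS model generalizes the redshift evolution through the exponent β, and its exact form is the paper's, not reproduced here:

      \[
        w_{\mathrm{CPL}}(z) \;=\; w_0 + w_a\,\frac{z}{1+z}
      \]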

  19. Development of Multidisciplinary, Multifidelity Analysis, Integration, and Optimization of Aerospace Vehicles

    DTIC Science & Technology

    2010-02-27

    investigated in more detail. The intermediate level of fidelity, though more expensive, is then used to refine the analysis, add geometric detail, and ... design stage is used to further refine the analysis, narrowing the design to a handful of options. [Figure 1: Integrated Hierarchical Framework.] In ... computational structural and computational fluid modeling. For the structural analysis tool we used McIntosh Structural Dynamics' finite element code CNEVAL.

  20. Reassessing the human health benefits from cleaner air.

    PubMed

    Cox, Louis Anthony

    2012-05-01

    Recent proposals to further reduce permitted levels of air pollution emissions are supported by high projected values of resulting public health benefits. For example, the Environmental Protection Agency recently estimated that the 1990 Clean Air Act Amendment (CAAA) will produce human health benefits in 2020, from reduced mortality rates, valued at nearly $2 trillion per year, compared to compliance costs of $65 billion ($0.065 trillion). However, while compliance costs can be measured, health benefits are unproved: they depend on a series of uncertain assumptions. Among these are that additional life expectancy gained by a beneficiary (with median age of about 80 years) should be valued at about $80,000 per month; that there is a 100% probability that a positive, linear, no-threshold, causal relation exists between PM(2.5) concentration and mortality risk; and that progress in medicine and disease prevention will not greatly diminish this relationship. We present an alternative uncertainty analysis that assigns a positive probability of error to each assumption. This discrete uncertainty analysis suggests (with probability >90% under plausible alternative assumptions) that the costs of CAAA exceed its benefits. Thus, instead of suggesting to policymakers that CAAA benefits are almost certainly far larger than its costs, we believe that accuracy requires acknowledging that the costs purchase a relatively uncertain, possibly much smaller, benefit. The difference between these contrasting conclusions is driven by different approaches to uncertainty analysis, that is, excluding or including discrete uncertainties about the main assumptions required for nonzero health benefits to exist at all. © 2011 Society for Risk Analysis.
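
    The contrast between the two uncertainty analyses can be made concrete with a toy discrete calculation: weight the projected benefit by the joint probability that the load-bearing assumptions all hold. The probabilities below are illustrative stand-ins, not the paper's elicited values:

      # Discrete uncertainty analysis: probability-weighted expected benefit.
      projected_benefit = 2.0e12  # $/year, the EPA projection cited above
      compliance_cost = 6.5e10    # $/year

      p_assumptions = {           # P(assumption holds); illustrative values
          "causal, linear, no-threshold PM2.5-mortality relation": 0.5,
          "relation persists despite medical progress to 2020": 0.7,
          "valuation of late-life life-years at cited rates": 0.6,
      }
      p_all = 1.0
      for p in p_assumptions.values():  # independence assumed for simplicity
          p_all *= p

      print(f"P(all hold) = {p_all:.2f}")
      print(f"expected benefit ${p_all * projected_benefit:.2e}"
            f" vs cost ${compliance_cost:.2e}")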

  1. Attention Deficit Disorder. NICHCY Briefing Paper.

    ERIC Educational Resources Information Center

    Fowler, Mary

    This briefing paper uses a question-and-answer format to provide basic information about children with attention deficit disorder (ADD). Questions address the following concerns: nature and incidence of ADD; causes of ADD; signs of ADD (impulsivity, hyperactivity, disorganization, social skill deficits); the diagnostic ADD assessment; how to get…

  2. Robust Mediation Analysis Based on Median Regression

    PubMed Central

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
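
    A minimal sketch of the core idea: estimate the a path (M on X) and b path (Y on M, X) at the median (q = 0.5) and form the product-of-coefficients indirect effect. File and column names are hypothetical, and inference would add bootstrapping:

      # Mediation with median regression via statsmodels QuantReg.
      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("jobs_program.csv")  # hypothetical data file

      a_fit = smf.quantreg("m ~ x", df).fit(q=0.5)      # a path at the median
      b_fit = smf.quantreg("y ~ m + x", df).fit(q=0.5)  # b path at the median

      indirect = a_fit.params["x"] * b_fit.params["m"]
      print(f"median-regression indirect effect: {indirect:.3f}")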

  3. Meta-Analytic Derivation

    ERIC Educational Resources Information Center

    Snell, Joel C.; Marsh, Mitchell

    2011-01-01

    The authors have over the years tried to revise meta-analysis because its basic premise is to add apples and oranges together and analyze them. In other words, various data on the same subject are chosen using different samples, research strategies, and number properties. The findings are then homogenized and a statistical analysis is used (Snell, J.…

  4. Development and Validation of a Unidimensional Maltreatment Scale in the Add Health Data Set

    ERIC Educational Resources Information Center

    Marszalek, Jacob M.; Hamilton, Jessica L.

    2012-01-01

    Four maltreatment items were examined from Wave III (N = 13,516) of the National Longitudinal Study of Adolescent Health. Item analysis, confirmatory factor analysis, cross-validation, reliability estimates, and convergent validity coefficients strongly supported the validity of using the four items as a unidimensional composite. Implications for…

  5. Deriving Multidimensional Poverty Indicators: Methodological Issues and an Empirical Analysis for Italy

    ERIC Educational Resources Information Center

    Coromaldi, Manuela; Zoli, Mariangela

    2012-01-01

    Theoretical and empirical studies have recently adopted a multidimensional concept of poverty. There is considerable debate about the most appropriate degree of multidimensionality to retain in the analysis. In this work we add to the received literature in two ways. First, we derive indicators of multiple deprivation by applying a particular…

  6. Precursors of Young Women's Family Formation Pathways

    ERIC Educational Resources Information Center

    Amato, Paul R.; Landale, Nancy S.; Havasevich-Brooks, Tara C.; Booth, Alan; Eggebeen, David J.; Schoen, Robert; McHale, Susan M.

    2008-01-01

    We used latent class analysis to create family formation pathways for women between the ages of 18 and 23. Input variables included cohabitation, marriage, parenthood, full-time employment, and attending school. Data (n = 2,290) came from Waves I and III of the National Longitudinal Study of Adolescent Health (Add Health). The analysis revealed…

  7. 78 FR 17142 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-20

    ... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food; Correction. AGENCY: ... manufacturing, packing, or holding human food (CGMPs) to modernize it and to add requirements for domestic and ... "food-production purposes (i.e., manufacturing, processing, packing, and holding) to consistently use ...

  8. Enterprise Requirements and Acquisition Model (ERAM) Analysis and Extension

    DTIC Science & Technology

    2014-02-20

    add them to the ERAM simulation. Reference: Arena, M. V., Obaid, Y., Galway, L. A., Fox, B., Graser, J. C., Sollinger, J. M., Wu, F., & Wong, C. (2006). Impossible certainty: Cost risk analysis for Air Force systems (MG-415).

  9. The aquamet Package for R: A Tool for Use with the National Rivers and Streams Assessment

    EPA Science Inventory

    The use of R software in environmental data analysis has become increasingly common because it is very powerful, versatile and available free of charge, with hundreds of contributed add-on packages available that perform almost every conceivable type of analysis or task. The Envi...

  10. Water Softeners: How Much Sodium Do They Add?

    MedlinePlus

    ... away the saltshaker and cutting back on processed foods. With Sheldon G. Sheps, M.D. Reference: Drinking water advisory: Consumer acceptability advice and health effects analysis on sodium. U.S. Environmental Protection Agency.

  11. Models in palaeontological functional analysis

    PubMed Central

    Anderson, Philip S. L.; Bright, Jen A.; Gill, Pamela G.; Palmer, Colin; Rayfield, Emily J.

    2012-01-01

    Models are a principal tool of modern science. By definition, and in practice, models are not literal representations of reality but provide simplifications or substitutes of the events, scenarios or behaviours that are being studied or predicted. All models make assumptions, and palaeontological models in particular require additional assumptions to study unobservable events in deep time. In the case of functional analysis, the degree of missing data associated with reconstructing musculoskeletal anatomy and neuronal control in extinct organisms has, in the eyes of some scientists, rendered detailed functional analysis of fossils intractable. Such a prognosis may indeed be realized if palaeontologists attempt to recreate elaborate biomechanical models based on missing data and loosely justified assumptions. Yet multiple enabling methodologies and techniques now exist: tools for bracketing boundaries of reality; more rigorous consideration of soft tissues and missing data and methods drawing on physical principles that all organisms must adhere to. As with many aspects of science, the utility of such biomechanical models depends on the questions they seek to address, and the accuracy and validity of the models themselves. PMID:21865242

  12. Common neural structures activated by epidural and transcutaneous lumbar spinal cord stimulation: Elicitation of posterior root-muscle reflexes

    PubMed Central

    Freundl, Brigitta; Binder, Heinrich; Minassian, Karen

    2018-01-01

    Epidural electrical stimulation of the lumbar spinal cord is currently regaining momentum as a neuromodulation intervention in spinal cord injury (SCI) to modify dysregulated sensorimotor functions and augment residual motor capacity. There is ample evidence that it engages spinal circuits through the electrical stimulation of large-to-medium diameter afferent fibers within lumbar and upper sacral posterior roots. Recent pilot studies suggested that the surface electrode-based method of transcutaneous spinal cord stimulation (SCS) may produce similar neuromodulatory effects as caused by epidural SCS. Neurophysiological and computer modeling studies proposed that this noninvasive technique stimulates posterior-root fibers as well, likely activating similar input structures to the spinal cord as epidural stimulation. Here, we add a yet missing piece of evidence substantiating this assumption. We conducted in-depth analyses and direct comparisons of the electromyographic (EMG) characteristics of short-latency responses in multiple leg muscles to both stimulation techniques derived from ten individuals with SCI each. Post-activation depression of responses evoked by paired pulses applied either epidurally or transcutaneously confirmed the reflex nature of the responses. The muscle responses to both techniques had the same latencies, EMG peak-to-peak amplitudes, and waveforms, except for smaller responses with shorter onset latencies in the triceps surae muscle group and shorter offsets of the responses in the biceps femoris muscle during epidural stimulation. Responses obtained in three subjects tested with both methods at different time points had near-identical waveforms per muscle group as well as same onset latencies. The present results strongly corroborate the activation of common neural input structures to the lumbar spinal cord—predominantly primary afferent fibers within multiple posterior roots—by both techniques and add to unraveling the basic mechanisms underlying electrical SCS. PMID:29381748

  13. The lead time tradeoff: the case of health states better than dead.

    PubMed

    Pinto-Prades, José Luis; Rodríguez-Míguez, Eva

    2015-04-01

    Lead time tradeoff (L-TTO) is a variant of the time tradeoff (TTO). L-TTO introduces a lead period in full health before illness onset, avoiding the need to use 2 different procedures for states better and worse than dead. To estimate utilities, additive separability is assumed. We tested to what extent violations of this assumption can bias utilities estimated with L-TTO. A sample of 500 members of the Spanish general population evaluated 24 health states, using face-to-face interviews. A total of 188 subjects were interviewed with L-TTO and the rest with TTO. Both samples evaluated the same set of 24 health states, divided into 4 groups with 6 health states per set. Each subject evaluated 1 of the sets. A random effects regression model was fitted to our data. Only health states better than dead were included in the regression, since it is in this subset that additive separability can be tested clearly. Utilities were higher in L-TTO relative to TTO (on average, L-TTO adds about 0.2 points to the utility of health states), suggesting that additive separability is violated. The difference between methods increased with the severity of the health state. Thus, L-TTO adds about 0.14 points to the average utility of the less severe states, 0.23 to the intermediate states, and 0.28 points to the more severe states. L-TTO produced higher utilities than TTO. Health problems are perceived as less severe if a lead period in full health is added upfront, implying that there are interactions between disjointed time periods. The advantages of this method have to be compared with the cost of modeling the interaction between periods. © The Author(s) 2014.
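
    Under additive separability, the L-TTO utility follows from the indifference condition in one line; this is the standard scoring rule whose validity the paper tests:

      % A respondent indifferent between (i) lead time L in full health followed
      % by t years in state h, and (ii) x years in full health, satisfies
      %     L + t * u(h) = x        (additive separability assumed)
      % so the implied utility is
      \[
        u(h) \;=\; \frac{x - L}{t}.
      \]
      % The roughly 0.2-point excess of L-TTO over TTO utilities reported above
      % is what one would observe if the lead period interacts with the illness
      % period, i.e., if separability fails.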

  14. N-Acetylcysteine in the Treatment of Pediatric Trichotillomania: A Randomized, Double-Blind, Placebo-Controlled Add-On Trial

    PubMed Central

    Bloch, Michael H.; Panza, Kaitlyn E.; Grant, Jon E.; Pittenger, Christopher; Leckman, James F.

    2013-01-01

    Objective To examine the efficacy of N-acetylcysteine (NAC) for the treatment of pediatric trichotillomania (TTM) in a double-blind, placebo-controlled, add-on study. Method A total of 39 children and adolescents aged 8 to 17 years with pediatric trichotillomania were randomly assigned to receive NAC or matching placebo for 12 weeks. Our primary outcome was change in severity of hairpulling as measured by the Massachusetts General Hospital–Hairpulling Scale (MGH-HPS). Secondary measures assessed hairpulling severity, automatic versus focused pulling, clinician-rated improvement, and comorbid anxiety and depression. Outcomes were examined using linear mixed models to test the treatment × time interaction in an intention-to-treat population. Results No significant difference between N-acetylcysteine and placebo was found on any of the primary or secondary outcome measures. On several measures of hairpulling, subjects significantly improved with time regardless of treatment assignment. In the NAC group, 25% of subjects were judged as treatment responders, compared to 21% in the placebo group. Conclusions We observed no benefit of NAC for the treatment of children with trichotillomania. Our findings stand in contrast to a previous, similarly designed trial in adults with TTM, which demonstrated a very large, statistically significant benefit of NAC. Based on the differing results of NAC in pediatric and adult TTM populations, the assumption that pharmacological interventions demonstrated to be effective in adults with TTM will be as effective in children, may be inaccurate. This trial highlights the importance of referring children with TTM to appropriate behavioral therapy before initiating pharmacological interventions, as behavioral therapy has demonstrated efficacy in both children and adults with trichotillomania. PMID:23452680

  15. N-Acetylcysteine in the treatment of pediatric trichotillomania: a randomized, double-blind, placebo-controlled add-on trial.

    PubMed

    Bloch, Michael H; Panza, Kaitlyn E; Grant, Jon E; Pittenger, Christopher; Leckman, James F

    2013-03-01

    To examine the efficacy of N-acetylcysteine (NAC) for the treatment of pediatric trichotillomania (TTM) in a double-blind, placebo-controlled, add-on study. A total of 39 children and adolescents aged 8 to 17 years with pediatric trichotillomania were randomly assigned to receive NAC or matching placebo for 12 weeks. Our primary outcome was change in severity of hairpulling as measured by the Massachusetts General Hospital-Hairpulling Scale (MGH-HPS). Secondary measures assessed hairpulling severity, automatic versus focused pulling, clinician-rated improvement, and comorbid anxiety and depression. Outcomes were examined using linear mixed models to test the treatment×time interaction in an intention-to-treat population. No significant difference between N-acetylcysteine and placebo was found on any of the primary or secondary outcome measures. On several measures of hairpulling, subjects significantly improved with time regardless of treatment assignment. In the NAC group, 25% of subjects were judged as treatment responders, compared to 21% in the placebo group. We observed no benefit of NAC for the treatment of children with trichotillomania. Our findings stand in contrast to a previous, similarly designed trial in adults with TTM, which demonstrated a very large, statistically significant benefit of NAC. Based on the differing results of NAC in pediatric and adult TTM populations, the assumption that pharmacological interventions demonstrated to be effective in adults with TTM will be as effective in children, may be inaccurate. This trial highlights the importance of referring children with TTM to appropriate behavioral therapy before initiating pharmacological interventions, as behavioral therapy has demonstrated efficacy in both children and adults with trichotillomania. Copyright © 2013 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  16. State transition model: vorapaxar added to standard antiplatelet therapy to prevent thrombosis post myocardial infarction or peripheral artery disease.

    PubMed

    Du, Mark; Chase, Monica; Oguz, Mustafa; Davies, Glenn

    2017-09-01

    To evaluate long-term health benefits and risks of adding vorapaxar (VOR) to the standard care antiplatelet therapy (SC) of aspirin and/or clopidogrel, among a population with a recent myocardial infarction (MI) and/or peripheral artery disease (PAD). In a state-transition model, patients transition between health states (event-free, recurrent MI, stroke, death), while at risk of experiencing non-transition-related revascularization and non-fatal bleeding events. Risk equations developed from the TRA 2°P-TIMI 50 trial's patient-level data were used to predict cardiovascular (CV) outcomes over longer time horizons. Additional sources, including trials and US-based observational studies, informed the inputs for short-term CV risk, non-CV death, and health-related quality of life. Survival and quality-adjusted life-years (QALYs) were estimated over a lifetime horizon, discounted at 3% per year. Within a cohort of 7361 patients with recent MI and/or PAD, VOR + SC relative to SC alone yielded 176 fewer CV events (MIs, strokes, or CV deaths), but 27 more major bleeding events. VOR + SC was associated with increased life expectancy and health benefits (19.93 undiscounted life-years [LYs], 9.57 discounted QALYs vs. 19.61 undiscounted LYs, 9.41 discounted QALYs). The results were most sensitive to scenarios varying the time of vorapaxar initiation and to the assumptions for the 90-day period post-MI. Additional analyses showed that add-on vorapaxar provides consistent incremental benefits in high-risk subgroups. This study contributes to the growing literature on secondary prevention add-on therapy, as results from these modeling analyses suggest that adding vorapaxar to SC for patients at high atherothrombotic risk can provide long-term health benefits.
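
    The mechanics of such a cohort state-transition model are easy to sketch; the transition probabilities, utilities, and horizon below are invented for illustration and are not the study's inputs.

    ```python
    # Markov cohort sketch: event-free, recurrent MI, stroke, dead.
    import numpy as np

    # Annual transition matrix (rows sum to 1); illustrative values only.
    P = np.array([
        [0.90, 0.05, 0.03, 0.02],   # from event-free
        [0.00, 0.88, 0.04, 0.08],   # from recurrent MI
        [0.00, 0.00, 0.90, 0.10],   # from stroke
        [0.00, 0.00, 0.00, 1.00],   # death is absorbing
    ])
    u = np.array([0.85, 0.70, 0.60, 0.0])    # per-state utility weights

    cohort = np.array([1.0, 0.0, 0.0, 0.0])  # everyone starts event-free
    qalys, disc = 0.0, 0.03
    for year in range(40):                   # crude lifetime horizon
        qalys += (cohort @ u) / (1 + disc) ** year  # discounted at 3%/yr
        cohort = cohort @ P                  # advance one annual cycle
    print(f"discounted QALYs per patient: {qalys:.2f}")
    ```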

  17. What pediatricians should know about normal language development: ensuring cultural differences are not diagnosed as disorders.

    PubMed

    Weiss, Amy L; Van Haren, Melissa S

    2003-07-01

    The roles and responsibilities of speech-language pathologists and pediatricians have become greater with the changing population demographics in the United States. In some states, the majority of the population belongs to a national cultural minority, eg, New Mexico. Even a state such as Iowa, with only a 5% nonmajority population, has a school-aged population that is almost 10% nonmajority. This growth of diversity is likely to continue. Rather than viewing sensitivity to the influence of culture on language learning and other developmental areas as an "add-on" to a practice, it may be wiser to recognize that approaching all clients with as few assumptions about their behaviors as possible will guarantee nonbiased service delivery for all. Without nonbiased service delivery, incorrect diagnoses and provision of inappropriate therapy become more likely. Fortunately, many resources are available to assist pediatricians and speech-language pathologists in learning about various cultures. Institutional review boards have become more vigilant about the inclusion of a cross-section of subject populations as participants in research studies in addition to protecting the rights of all participants. Funding agencies also have expressed as a priority the inclusion of research subjects from minority populations to add to the information available about the incidence and prevalence of disorders across the range of our potential patients. In a society in which cultural differences are not just defined by race or ethnicity, but by gender, sexual orientation, age, geographic region, and religion, belief systems about disease, disability, and treatment are dynamic entities for health professionals to take into consideration. It is a challenge that speech-language pathologists and pediatricians must meet if they are to provide the best and most appropriate services for their patients.

  18. Clinical evaluation of a combination therapy of imepitoin with phenobarbital in dogs with refractory idiopathic epilepsy.

    PubMed

    Neßler, Jasmin; Rundfeldt, Chris; Löscher, Wolfgang; Kostic, Draginja; Keefe, Thomas; Tipold, Andrea

    2017-01-25

    Imepitoin was tested as a combination treatment with phenobarbital in an open-label mono-centre cohort study in dogs with drug-resistant epilepsy. Diagnosis of idiopathic epilepsy was based on clinical findings, magnetic resonance imaging and cerebrospinal fluid analysis. Three cohorts were treated. In cohort A, dogs not responding to phenobarbital with or without established add-on treatment of potassium bromide or levetiracetam were treated add-on with imepitoin, starting at 10 mg/kg BID, with titration allowed to 30 mg/kg BID. In cohort B, the only difference to cohort A was that the starting dose of imepitoin was reduced to 5 mg/kg BID. In cohort C, animals not responding to imepitoin at >20 mg/kg BID were treated with phenobarbital add-on starting at 0.5 mg/kg BID. The add-on treatment resulted in a reduction in monthly seizure frequency (MSF) in all three cohorts. A reduction of ≥50% was obtained in 36-42% of all animals, without significant difference between cohorts. The lower starting dose of 5 mg/kg BID imepitoin was better tolerated, and up-titration to an average of 15 mg/kg BID was sufficient in cohorts A and B. In cohort C, a mean add-on dose of 1.5 mg/kg BID phenobarbital was sufficient to achieve a clinically meaningful effect. Six dogs developed a clinically meaningful increase in MSF of ≥50%, mostly in cohort A. Neither imepitoin nor phenobarbital add-on treatment was capable of suppressing cluster seizure activity, making cluster seizure activity an important predictor of drug resistance. A combination treatment of imepitoin and phenobarbital is a useful treatment option for a subpopulation of dogs with drug-resistant epilepsy; a low starting dose of 5 mg/kg BID is recommended.

  19. The relationship of document and quantitative literacy with learning styles and selected personal variables for aerospace technology students at Indiana State University

    NASA Astrophysics Data System (ADS)

    Martin, Royce Ann

    The purpose of this study was to determine the extent that student scores on a researcher-constructed quantitative and document literacy test, the Aviation Documents Delineator (ADD), were associated with (a) learning styles (imaginative, analytic, common sense, dynamic, and undetermined), as identified by the Learning Type Measure, (b) program curriculum (aerospace administration, professional pilot, both aerospace administration and professional pilot, other, or undeclared), (c) overall cumulative grade point average at Indiana State University, and (d) year in school (freshman, sophomore, junior, or senior). The Aviation Documents Delineator (ADD) was a three-part, 35-question survey that required students to interpret graphs, tables, and maps. Tasks assessed in the ADD included (a) locating, interpreting, and describing specific data displayed in the document, (b) determining data for a specified point on the table through interpolation, (c) comparing data for a string of variables representing one aspect of aircraft performance to another string of variables representing a different aspect of aircraft performance, (d) interpreting the documents to make decisions regarding emergency situations, and (e) performing single and/or sequential mathematical operations on a specified set of data. The Learning Type Measure (LTM) was a 15-item self-report survey developed by Bernice McCarthy (1995) to profile an individual's processing and perception tendencies in order to reveal different individual approaches to learning. The sample used in this study included 143 students enrolled in Aerospace Technology Department courses at Indiana State University in the fall of 1996. The ADD and the LTM were administered to each subject. Data collected in this investigation were analyzed using a stepwise multiple regression technique. Results revealed that year in school and GPA were significant predictors of the criterion variables (document, quantitative, and total literacy) as measured by the ADD. Learning style and program of study were not significant predictors of literacy scores on the ADD instrument.

  20. Information retrieval from holographic interferograms: Fundamentals and problems

    NASA Technical Reports Server (NTRS)

    Vest, Charles M.

    1987-01-01

    Holographic interferograms can contain large amounts of information about flow and temperature fields. Their information content can be very high because they can be viewed from many different directions. This multidirectionality and fringe localization add to the information contained in the fringe pattern if diffuse illumination is used. Additional information and increased accuracy can be obtained through the use of dual-reference-wave holography to add reference fringes or to effect discrete phase shifting or heterodyne interferometry. Automated analysis of fringes is possible if interferograms are of simple structure and good quality. In practice, however, a large number of practical problems can arise, resulting in a difficult image processing task.

  1. Microscopic observation drug-susceptibility assay vs. Xpert® MTB/RIF for the diagnosis of tuberculosis in a rural African setting: a cost-utility analysis.

    PubMed

    Wikman-Jorgensen, Philip E; Llenas-García, Jara; Pérez-Porcuna, Tomàs M; Hobbins, Michael; Ehmer, Jochen; Mussa, Manuel A; Ascaso, Carlos

    2017-06-01

    To compare the cost-utility of the microscopic observation drug-susceptibility assay (MODS) and Xpert® MTB/RIF implementation for tuberculosis (TB) diagnosis in rural northern Mozambique. Stochastic transmission compartmental TB model from the healthcare provider perspective, with parameter input from direct measurements, systematic literature reviews and expert opinion. MODS and Xpert® MTB/RIF were evaluated as a replacement for smear microscopy (SM) or as an add-on test after a negative SM. Costs were calculated in 2013 USD, effects in disability-adjusted life years (DALYs). The willingness-to-pay threshold (WPT) was set at one times the per capita Gross National Income of Mozambique. MODS as an add-on test to negative SM produced an incremental cost-effectiveness ratio (ICER) of 5647.89 USD/DALY averted. MODS as a substitute for SM yielded an ICER of 5374.58 USD/DALY averted. Xpert® MTB/RIF as an add-on test to negative SM yielded an ICER of 345.71 USD/DALY averted. Xpert® MTB/RIF as a substitute for SM obtained an ICER of 122.13 USD/DALY averted. TB prevalence and risk of infection were the main factors impacting the MODS and Xpert® MTB/RIF ICERs in the one-way sensitivity analysis. In the probabilistic sensitivity analysis, Xpert® MTB/RIF was most likely to have an ICER below the WPT, whereas MODS was not. Our cost-utility analysis favours the implementation of Xpert® MTB/RIF as a replacement of SM for all TB suspects in this rural high TB/HIV prevalence African setting. © 2017 John Wiley & Sons Ltd.
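
    The decision rule in the abstract reduces to comparing an incremental cost-effectiveness ratio against the willingness-to-pay threshold; a minimal sketch with placeholder numbers (not the study's inputs) follows.

    ```python
    # ICER = (extra cost) / (DALYs averted), compared to the WPT.
    def icer(cost_new, cost_old, dalys_new, dalys_old):
        # DALYs measure burden, so "averted" is dalys_old - dalys_new.
        return (cost_new - cost_old) / (dalys_old - dalys_new)

    wpt = 590.0  # e.g., one times per-capita GNI in USD (placeholder value)
    value = icer(cost_new=120_000, cost_old=100_000,
                 dalys_new=800.0, dalys_old=950.0)
    verdict = "cost-effective" if value <= wpt else "not cost-effective"
    print(f"{value:.2f} USD/DALY averted ({verdict} at a WPT of {wpt})")
    ```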

  2. A statistical analysis of the dependency of closure assumptions in cumulus parameterization on the horizontal resolution

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    1994-01-01

    Simulated data from the UCLA cumulus ensemble model are used to investigate the quasi-universal validity of closure assumptions used in existing cumulus parameterizations. A closure assumption is quasi-universally valid if it is sensitive neither to convective cloud regimes nor to horizontal resolutions of large-scale/mesoscale models. The dependency of three types of closure assumptions, as classified by Arakawa and Chen, on the horizontal resolution is addressed in this study. Type I is the constraint on the coupling of the time tendencies of large-scale temperature and water vapor mixing ratio. Type II is the constraint on the coupling of cumulus heating and cumulus drying. Type III is a direct constraint on the intensity of a cumulus ensemble. The macroscopic behavior of simulated cumulus convection is first compared with the observed behavior in view of Type I and Type II closure assumptions using 'quick-look' and canonical correlation analyses. It is found that they are statistically similar to each other. The three types of closure assumptions are further examined with simulated data averaged over selected subdomain sizes ranging from 64 to 512 km. It is found that the dependency of Type I and Type II closure assumptions on the horizontal resolution is very weak and that Type III closure assumption is somewhat dependent upon the horizontal resolution. The influences of convective and mesoscale processes on the closure assumptions are also addressed by comparing the structures of canonical components with the corresponding vertical profiles in the convective and stratiform regions of cumulus ensembles analyzed directly from simulated data. The implication of these results for cumulus parameterization is discussed.
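
    As a concrete, deliberately simplified illustration of the canonical correlation step, the sketch below extracts leading canonical correlations between two coupled multivariate fields; the random matrices are stand-ins for, say, cumulus heating and drying profiles, and the coupling strength is invented.

    ```python
    # Canonical correlation analysis between two coupled fields (stand-in data).
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    Q1 = rng.standard_normal((500, 12))                   # e.g., heating profiles
    Q2 = 0.6 * Q1 + 0.8 * rng.standard_normal((500, 12))  # coupled drying

    cca = CCA(n_components=3).fit(Q1, Q2)
    U, V = cca.transform(Q1, Q2)
    r = [np.corrcoef(U[:, k], V[:, k])[0, 1] for k in range(3)]
    print(np.round(r, 2))  # close to the induced coupling of ~0.6
    ```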

  3. PKSolver: An add-in program for pharmacokinetic and pharmacodynamic data analysis in Microsoft Excel.

    PubMed

    Zhang, Yong; Huo, Meirong; Zhou, Jianping; Xie, Shaofei

    2010-09-01

    This study presents PKSolver, a freely available menu-driven add-in program for Microsoft Excel written in Visual Basic for Applications (VBA), for solving basic problems in pharmacokinetic (PK) and pharmacodynamic (PD) data analysis. The program provides a range of modules for PK and PD analysis including noncompartmental analysis (NCA), compartmental analysis (CA), and pharmacodynamic modeling. Two special built-in modules, multiple absorption sites (MAS) and enterohepatic circulation (EHC), were developed for fitting the double-peak concentration-time profile based on the classical one-compartment model. In addition, twenty frequently used pharmacokinetic functions were encoded as a macro and can be directly accessed in an Excel spreadsheet. To evaluate the program, a detailed comparison of modeling PK data using PKSolver and the professional PK/PD software packages WinNonlin and Scientist was performed. The results showed that the parameters estimated with PKSolver were satisfactory. In conclusion, PKSolver simplified the PK and PD data analysis process, and its output could be generated in Microsoft Word in the form of an integrated report. The program provides pharmacokinetic researchers with a fast and easy-to-use tool for routine and basic PK and PD data analysis with a more user-friendly interface. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
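
    The core noncompartmental quantities such a tool reports are straightforward to compute; here is a small sketch with made-up concentration data (the sampling times, units, and three-point terminal fit are illustrative choices).

    ```python
    # Basic NCA: AUC by trapezoids, terminal slope, half-life, AUC to infinity.
    import numpy as np

    t = np.array([0.0, 0.5, 1, 2, 4, 8, 12, 24])            # time, h
    c = np.array([0.0, 3.2, 5.1, 4.4, 3.0, 1.5, 0.8, 0.2])  # conc, mg/L

    auc = np.trapz(c, t)                 # AUC(0-24 h), linear trapezoidal rule
    lam_z = -np.polyfit(t[-3:], np.log(c[-3:]), 1)[0]  # terminal rate constant
    t_half = np.log(2) / lam_z           # terminal half-life
    auc_inf = auc + c[-1] / lam_z        # extrapolation to infinity
    print(f"AUC0-24 = {auc:.2f}, t1/2 = {t_half:.2f} h, AUCinf = {auc_inf:.2f}")
    ```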

  4. A Hop-Count Analysis Scheme for Avoiding Wormhole Attacks in MANET

    PubMed Central

    Jen, Shang-Ming; Laih, Chi-Sung; Kuo, Wen-Chung

    2009-01-01

    MANET, due to the nature of wireless transmission, has more security issues compared to wired environments. A specific type of attack, the Wormhole attack does not require exploiting any nodes in the network and can interfere with the route establishment process. Instead of detecting wormholes from the role of administrators as in previous methods, we implement a new protocol, MHA, using a hop-count analysis from the viewpoint of users without any special environment assumptions. We also discuss previous works which require the role of administrator and their reliance on impractical assumptions, thus showing the advantages of MHA. PMID:22408566

  5. An assessment of the impact of FIA's default assumptions on the estimates of coarse woody debris volume and biomass

    Treesearch

    Vicente J. Monleon

    2009-01-01

    Currently, Forest Inventory and Analysis estimation procedures use Smalian's formula to compute coarse woody debris (CWD) volume and assume that logs lie horizontally on the ground. In this paper, the impact of those assumptions on volume and biomass estimates is assessed using 7 years of Oregon's Phase 2 data. Estimates of log volume computed using Smalian...
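
    For context, Smalian's formula treats a log as a solid whose volume is the mean of the two end cross-sectional areas times the length; a one-function sketch (metric units, illustrative numbers) is below.

    ```python
    # Smalian's formula for log (CWD) volume from end diameters and length.
    import math

    def smalian_volume(d_large, d_small, length):
        """End diameters and length in meters; returns volume in cubic meters."""
        a_large = math.pi * (d_large / 2) ** 2   # large-end cross-section
        a_small = math.pi * (d_small / 2) ** 2   # small-end cross-section
        return (a_large + a_small) / 2 * length

    print(round(smalian_volume(0.40, 0.30, 5.0), 3))  # ~0.491 m^3
    ```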

  6. Development of an Analysis Method to Identify the Root Causes of Finding from the Air Force Environmental Compliance Assessment and Management Program (ECAMP)

    DTIC Science & Technology

    1994-09-01

    [Front-matter excerpt (table of contents): Theories and Applications; Theories of Motivation; Maslow's Hierarchy of Needs; Herzberg's Motivation-Hygiene Theory; Intrinsic vs. Extrinsic Assumptions; McGregor's Theory X and Theory Y Assumptions; Vroom's Expectancy Theory; Applications; Tell People What…]

  7. The Effect of Multicollinearity and the Violation of the Assumption of Normality on the Testing of Hypotheses in Regression Analysis.

    ERIC Educational Resources Information Center

    Vasu, Ellen S.; Elmore, Patricia B.

    The effects of the violation of the assumption of normality coupled with the condition of multicollinearity upon the outcome of testing the hypothesis Beta equals zero in the two-predictor regression equation are investigated. A Monte Carlo approach was utilized in which three different distributions were sampled for two sample sizes over…

  8. Spacelab scrubber analysis and test support

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Contaminants to be used in qualification and development tests of the add-on charcoal bed scrubber were established, along with rates and methods for their introduction. The contaminant levels to be achieved were predicted and test results were analyzed.

  9. The Internship Report.

    ERIC Educational Resources Information Center

    Corey, Jim; Killingsworth, M. Jimmie

    1987-01-01

    Recommends a four-part structure for retrospective internship reports: (1) introduction, (2) narrative, (3) analysis and evaluation, and (4) appendix. Advises teachers to present the report form to the student before the internship begins to add structure to the internship experience. (SKC)

  10. Differential Decomposition Among Pig, Rabbit, and Human Remains.

    PubMed

    Dautartas, Angela; Kenyhercz, Michael W; Vidoli, Giovanna M; Meadows Jantz, Lee; Mundorff, Amy; Steadman, Dawnie Wolfe

    2018-03-30

    While nonhuman animal remains are often utilized in forensic research to develop methods to estimate the postmortem interval, systematic studies that directly validate animals as proxies for human decomposition are lacking. The current project compared decomposition rates among pigs, rabbits, and humans at the University of Tennessee's Anthropology Research Facility across three seasonal trials that spanned nearly 2 years. The Total Body Score (TBS) method was applied to quantify decomposition changes and calculate the postmortem interval (PMI) in accumulated degree days (ADD). Decomposition trajectories were analyzed by comparing the estimated and actual ADD for each seasonal trial and by fuzzy cluster analysis. The cluster analysis demonstrated that the rabbits formed one group while pigs and humans, although more similar to each other than either to rabbits, still showed important differences in decomposition patterns. The decomposition trends show that neither nonhuman model captured the pattern, rate, and variability of human decomposition. © 2018 American Academy of Forensic Sciences.
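
    Accumulated degree days, the time scale used with TBS above, is simply the running sum of daily mean temperatures above a base temperature; the sketch below assumes the common 0 °C base, which is a convention rather than something stated in the abstract.

    ```python
    # ADD: sum of daily mean temperatures above a base (0 degrees C assumed).
    def accumulated_degree_days(daily_mean_temps_c, base=0.0):
        return sum(max(t - base, 0.0) for t in daily_mean_temps_c)

    # Five days of mean temperatures; the sub-zero day contributes nothing.
    print(accumulated_degree_days([12.5, 15.0, 9.0, -2.0, 20.5]))  # 57.0
    ```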

  11. A Comparison of Satellite Conjunction Analysis Screening Tools

    DTIC Science & Technology

    2011-09-01

    [Search-result excerpts] …visualization tool. Version 13.1.4 for Linux was tested. The SOAP conjunction analysis function does not have the capacity to perform the large… …was examined by SOAP to confirm the conjunction. STK Advanced CAT (Conjunction Analysis Tools) is an add-on module for the STK… …run with each tool. When attempting to perform the seven-day all-vs-all analysis with STK Advanced CAT, the program consistently crashed during report…

  12. Renewable Fuels Legislation Impact Analysis

    EIA Publications

    2005-01-01

    An analysis based on an extension of the ethanol supply curve in our model to allow for enough ethanol production to meet the requirements of S. 650. This analysis provides an update of the May 23, 2005 analysis, with revised ethanol production and cost assumptions.

  13. Linking Developmental Themes to Theories in the Autobiographical Narratives of Life-Span Development Students

    ERIC Educational Resources Information Center

    Mayo, Joseph A.

    2017-01-01

    Prior research findings point to the efficacy of using autobiographical life-story narration as a learning tool in undergraduate classes. The current study seeks to add to the existing literature on this topic by performing a qualitative analysis across events recorded in students' autobiographical narratives. The purpose of this analysis is to…

  14. Participation Structures as a Mediational Means: Learning Balinese Gamelan in the United States through Intent Participation, Mediated Discourse, and Distributed Cognition

    ERIC Educational Resources Information Center

    Jocuns, Andrew

    2009-01-01

    Participation has presented a complex unit of analysis for interactional sociolinguistics. In this study I add another dimension to participation by considering recent theories related to sociocultural activity theory--mediated discourse analysis and distributed cognition. Drawing on examples from "maguru panggul", the traditional…

  15. Lifelong Education and Lifelong Learning with Chinese Characteristics: A Critical Policy Discourse Analysis

    ERIC Educational Resources Information Center

    Shan, Hongxia

    2017-01-01

    Researchers in China have keenly explored how lifelong education and lifelong learning, as imports from "the West," may become localized in China, although a small chorus has also tried to revitalize Confucianism to bear on the field. This paper adds to this domain of discussion with a critical discourse analysis of Chinese lifelong…

  16. The Impact of Guided Notes on Post-Secondary Student Achievement: A Meta-Analysis

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.

    2013-01-01

    The common practice of using of guided notes in the post-secondary classroom is not fully appreciated or understood. In an effort to add to the existing research about this phenomenon, the current investigation expands on previously published research and one previously published meta-analysis that examined the impact of guided notes on…

  17. Identify temporal trend of air temperature and its impact on forest stream flow in Lower Mississippi River Alluvial Valley using wavelet analysis

    USDA-ARS?s Scientific Manuscript database

    Characterization of stream flow is essential to water resource management, water supply planning, environmental protection, and ecological restoration; while climate change can exacerbate stream flow and add instability to the flow. In this study, the wavelet analysis technique was employed to asse...

  18. Thermal Protection Supplement for Reducing Interface Thermal Mismatch

    NASA Technical Reports Server (NTRS)

    Stewart, David A. (Inventor); Leiser, Daniel B. (Inventor)

    2017-01-01

    A thermal protection system that reduces a mismatch of thermal expansion coefficients (CTE) between a first material layer (CTE1) and a second material layer (CTE2) at a first layer-second layer interface. A portion of aluminum borosilicate (ABS) or another suitable additive (add), whose CTE value CTE(add) satisfies (CTE(add) - CTE1)(CTE(add) - CTE2) < 0, is distributed with variable additive density ρ(z; add) in the first material layer and/or in the second material layer, with ρ(z; add) near the materials interface being relatively high (alternatively, relatively low) and ρ(z; add) in a region spaced apart from the interface being relatively low (alternatively, relatively high).

  19. 5 CFR 841.411 - Appeals procedure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... agency's actuarial analysis are sufficient and reliable (As a general rule, at least 5 years of data... reliable.); (2) The assumptions used in the agency's actuarial analysis are justified; (3) When all...

  20. Flexible modeling improves assessment of prognostic value of C-reactive protein in advanced non-small cell lung cancer.

    PubMed

    Gagnon, B; Abrahamowicz, M; Xiao, Y; Beauchamp, M-E; MacDonald, N; Kasymjanova, G; Kreisman, H; Small, D

    2010-03-30

    C-reactive protein (CRP) is gaining credibility as a prognostic factor in different cancers. Cox's proportional hazard (PH) model is usually used to assess prognostic factors. However, this model imposes a priori assumptions, which are rarely tested, that (1) the hazard ratio associated with each prognostic factor remains constant across the follow-up (PH assumption) and (2) the relationship between a continuous predictor and the logarithm of the mortality hazard is linear (linearity assumption). We tested these two assumptions of the Cox PH model for CRP, using a flexible statistical model, while adjusting for other known prognostic factors, in a cohort of 269 patients newly diagnosed with non-small cell lung cancer (NSCLC). In the Cox PH model, high CRP increased the risk of death (HR = 1.11 for each doubling of CRP value, 95% CI: 1.03-1.20, P = 0.008). However, both the PH assumption (P = 0.033) and the linearity assumption (P = 0.015) were rejected for CRP, measured at the initiation of chemotherapy, which kept its prognostic value for approximately 18 months. Our analysis shows that flexible modeling provides new insights regarding the value of CRP as a prognostic factor in NSCLC and that Cox's PH model underestimates early risks associated with high CRP.
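
    In practice, both assumptions can be probed with standard tooling; the sketch below uses the lifelines package (the data file and column names are hypothetical, and the exact API may vary across versions).

    ```python
    # Fit a Cox PH model and run Schoenfeld-residual-based assumption checks.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("nsclc.csv")  # hypothetical: time, event, log2_crp, age

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    # Tests proportional hazards per covariate and prints advice, including
    # hints that a continuous covariate may need a nonlinear functional form.
    cph.check_assumptions(df, p_value_threshold=0.05, show_plots=False)
    ```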

  1. A structured framework for assessing sensitivity to missing data assumptions in longitudinal clinical trials.

    PubMed

    Mallinckrodt, C H; Lin, Q; Molenberghs, M

    2013-01-01

    The objective of this research was to demonstrate a framework for drawing inference from sensitivity analyses of incomplete longitudinal clinical trial data via a re-analysis of data from a confirmatory clinical trial in depression. A likelihood-based approach that assumed missing at random (MAR) was the primary analysis. Robustness to departure from MAR was assessed by comparing the primary result with those from a series of analyses that employed varying missing not at random (MNAR) assumptions (selection models, pattern-mixture models and shared-parameter models) and with MAR methods that used inclusive models. The key sensitivity analysis used multiple imputation assuming that, after dropout, the trajectory of drug-treated patients was that of placebo-treated patients with a similar outcome history (placebo multiple imputation). This result was used as the worst reasonable case to define the lower limit of plausible values for the treatment contrast. The endpoint contrast from the primary analysis was -2.79 (p = .013). In placebo multiple imputation, the result was -2.17. Results from the other sensitivity analyses ranged from -2.21 to -3.87 and were symmetrically distributed around the primary result. Hence, no clear evidence of bias from MNAR data was found. In the worst reasonable case scenario, the treatment effect was 80% of the magnitude of the primary result. Therefore, it was concluded that a treatment effect existed. The structured sensitivity framework, in which a worst-reasonable-case result based on a controlled imputation approach with transparent and debatable assumptions is supplemented by a series of plausible alternative models under varying assumptions, was useful in this specific situation and holds promise as a generally useful framework. Copyright © 2012 John Wiley & Sons, Ltd.
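
    A drastically simplified sketch of the placebo-based ("worst reasonable case") imputation idea is given below: missing post-dropout outcomes in the drug arm are drawn from a model fitted to placebo patients, and the analysis is repeated over M imputed datasets. This is schematic only, not the paper's exact procedure, and it pools point estimates without the variance components of Rubin's rules.

    ```python
    # Placebo-based multiple imputation, schematic version.
    import numpy as np

    rng = np.random.default_rng(0)

    def impute_from_placebo(y_drug, missing, mu, sd):
        y = y_drug.copy()
        y[missing] = rng.normal(mu, sd, missing.sum())  # placebo-like draws
        return y

    def pooled_contrast(y_drug, missing, y_placebo, M=100):
        mu, sd = y_placebo.mean(), y_placebo.std(ddof=1)
        ests = [impute_from_placebo(y_drug, missing, mu, sd).mean() - mu
                for _ in range(M)]
        return float(np.mean(ests))  # point estimate only

    y_placebo = rng.normal(-8.0, 6.0, 100)   # change scores, placebo arm
    y_drug = rng.normal(-11.0, 6.0, 100)     # change scores, drug arm
    missing = rng.random(100) < 0.25         # 25% dropout in the drug arm
    print(pooled_contrast(y_drug, missing, y_placebo))  # shrinks toward 0
    ```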

  2. Regulatory role of glycogen synthase kinase 3 for transcriptional activity of ADD1/SREBP1c.

    PubMed

    Kim, Kang Ho; Song, Min Jeong; Yoo, Eung Jae; Choe, Sung Sik; Park, Sang Dai; Kim, Jae Bum

    2004-12-10

    Adipocyte determination- and differentiation-dependent factor 1 (ADD1) plays important roles in lipid metabolism and insulin-dependent gene expression. Because insulin stimulates carbohydrate and lipid synthesis, it would be important to decipher how the transcriptional activity of ADD1/SREBP1c is regulated in the insulin signaling pathway. In this study, we demonstrated that glycogen synthase kinase (GSK)-3 negatively regulates the transcriptional activity of ADD1/SREBP1c. GSK3 inhibitors enhanced a transcriptional activity of ADD1/SREBP1c and expression of ADD1/SREBP1c target genes including fatty acid synthase (FAS), acetyl-CoA carboxylase 1 (ACC1), and steroyl-CoA desaturase 1 (SCD1) in adipocytes and hepatocytes. In contrast, overexpression of GSK3beta down-regulated the transcriptional activity of ADD1/SREBP1c. GSK3 inhibitor-mediated ADD1/SREBP1c target gene activation did not require de novo protein synthesis, implying that GSK3 might affect transcriptional activity of ADD1/SREBP1c at the level of post-translational modification. Additionally, we demonstrated that GSK3 efficiently phosphorylated ADD1/SREBP1c in vitro and in vivo. Therefore, these data suggest that GSK3 inactivation is crucial to confer stimulated transcriptional activity of ADD1/SREBP1c for insulin-dependent gene expression, which would coordinate lipid and glucose metabolism.

  3. Doubling Geothermal Generation Capacity by 2020. A Strategic Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wall, Anna; Young, Katherine

    2016-01-01

    This report identifies the potential of the U.S. geothermal resource and of the current market to add 3 GW of geothermal capacity by 2020, in order to meet the goal set forth in the Climate Action Plan.

  4. Psychometric properties and norms of the German ABC-Community and PAS-ADD Checklist.

    PubMed

    Zeilinger, Elisabeth L; Weber, Germain; Haveman, Meindert J

    2011-01-01

    The aim of the present study was to standardize and generate psychometric evidence of the German language versions of two well-established English language mental health instruments: the Aberrant Behavior Checklist-Community (ABC-C) and the Psychiatric Assessment Schedule for Adults with Developmental Disabilities (PAS-ADD) Checklist. New methods in this field were introduced: a simulation method for testing the factor structure and an exploration of long-term stability over two years. The checklists were both administered to a representative sample of 270 individuals with intellectual disability (ID) and, two years later in a second data collection, to 128 participants of the original sample. Principal component analysis and parallel analysis were performed. Reliability measures, long-term stability, subscale intercorrelations, as well as standardized norms were generated. Prevalence of mental health problems was examined. Psychometric properties were mostly excellent, with long-term stability showing moderate to strong effects. The original factor structure of the ABC-C was replicated. PAS-ADD Checklist produced a similar, but still different structure compared with findings from the English language area. The overall prevalence rate of mental health problems in the sample was about 20%. Considering the good results on the measured psychometric properties, the two checklists are recommended for the early detection of mental health problems in persons with ID. Copyright © 2011 Elsevier Ltd. All rights reserved.
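
    Parallel analysis, mentioned alongside the principal component analysis, retains a component only if its eigenvalue exceeds what random data of the same shape would produce; a compact numpy sketch follows (simulated input, 95th-percentile criterion assumed).

    ```python
    # Horn's parallel analysis on a correlation matrix.
    import numpy as np

    def parallel_analysis(X, n_sims=200, q=95, seed=0):
        rng = np.random.default_rng(seed)
        n, p = X.shape
        obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
        rand = np.empty((n_sims, p))
        for i in range(n_sims):
            R = rng.standard_normal((n, p))
            rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(R, rowvar=False)))[::-1]
        return int(np.sum(obs > np.percentile(rand, q, axis=0)))

    X = np.random.default_rng(1).standard_normal((270, 20))  # noise-only demo
    print(parallel_analysis(X))  # expect ~0 components retained for pure noise
    ```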

  5. Emerging spectra of singular correlation matrices under small power-map deformations

    NASA Astrophysics Data System (ADS)

    Vinayak; Schäfer, Rudi; Seligman, Thomas H.

    2013-09-01

    Correlation matrices are a standard tool in the analysis of the time evolution of complex systems in general and financial markets in particular. Yet most analyses assume stationarity of the underlying time series. This tends to be an assumption of varying and often dubious validity. The validity of the assumption improves as shorter time series are used. If many time series are used, this implies an analysis of highly singular correlation matrices. We attack this problem by using the so-called power map, which was introduced to reduce noise. Its nonlinearity breaks the degeneracy of the zero eigenvalues, and we analyze the sensitivity of the so-emerging spectra to correlations. This sensitivity will be demonstrated for uncorrelated and correlated Wishart ensembles.
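
    For intuition, the power map replaces each off-diagonal correlation c by sign(c)|c|^q; for q slightly above 1 this lifts the exact zero eigenvalues of a singular correlation matrix into a small "emerging" spectrum. A numpy sketch with illustrative dimensions:

    ```python
    # Degeneracy breaking of zero eigenvalues under the power map.
    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 50, 20                    # more series than observations: singular
    C = np.corrcoef(rng.standard_normal((N, T)))  # rank < N

    def power_map(C, q):
        M = np.sign(C) * np.abs(C) ** q
        np.fill_diagonal(M, 1.0)
        return M

    for q in (1.0, 1.001, 1.1):
        ev = np.linalg.eigvalsh(power_map(C, q))
        print(f"q={q}: {np.sum(np.abs(ev) < 1e-10)} (near-)zero eigenvalues")
    ```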

  6. Emerging spectra of singular correlation matrices under small power-map deformations.

    PubMed

    Vinayak; Schäfer, Rudi; Seligman, Thomas H

    2013-09-01

    Correlation matrices are a standard tool in the analysis of the time evolution of complex systems in general and financial markets in particular. Yet most analyses assume stationarity of the underlying time series. This tends to be an assumption of varying and often dubious validity. The validity of the assumption improves as shorter time series are used. If many time series are used, this implies an analysis of highly singular correlation matrices. We attack this problem by using the so-called power map, which was introduced to reduce noise. Its nonlinearity breaks the degeneracy of the zero eigenvalues, and we analyze the sensitivity of the so-emerging spectra to correlations. This sensitivity will be demonstrated for uncorrelated and correlated Wishart ensembles.

  7. Response to Wu et al. - Cost-effectiveness analysis of infant pneumococcal vaccination in Malaysia and Hong Kong.

    PubMed

    Varghese, Lijoy; Mungall, Bruce; Zhang, Xu-Hao; Hoet, Bernard

    2016-10-02

    A recently published paper that assessed the comparative cost-effectiveness of the 2 pneumococcal conjugate vaccines (PCVs) in Malaysia and Hong Kong reported that the 13-valent PCV vaccine (PCV13) is a better choice compared to the 10-valent pneumococcal non-typeable Haemophilus influenzae protein D conjugate vaccine (PHiD-CV or PCV10) from both a payer and societal perspective as well as under various scenarios. However, the analysis relied on a large number of assumptions that were either erroneous or did not take into account the most recent body of evidence available. A rigorous evaluation of the underlying assumptions is necessary to present a fair and balanced analysis for decision-making.

  8. A Test of Major Assumptions about Behavior Change: A Comprehensive Look at the Effects of Passive and Active HIV-Prevention Interventions Since the Beginning of the Epidemic

    ERIC Educational Resources Information Center

    Albarracin, Dolores; Gillette, Jeffrey C.; Earl, Allison N.; Glasman, Laura R.; Durantini, Marta R.; Ho, Moon-Ho

    2005-01-01

    This meta-analysis tested the major theoretical assumptions about behavior change by examining the outcomes and mediating mechanisms of different preventive strategies in a sample of 354 HIV-prevention interventions and 99 control groups, spanning the past 17 years. There were 2 main conclusions from this extensive review. First, the most…

  9. A quantitative evaluation of a qualitative risk assessment framework: Examining the assumptions and predictions of the Productivity Susceptibility Analysis (PSA)

    PubMed Central

    2018-01-01

    Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Although these approaches have been applied to over 1,000 fish populations, and despite an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, their assumptions and predictive capacity have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluate the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869

  10. Assessing Omitted Confounder Bias in Multilevel Mediation Models.

    PubMed

    Tofighi, Davood; Kelley, Ken

    2016-01-01

    To draw valid inference about an indirect effect in a mediation model, there must be no omitted confounders. No omitted confounders means that there are no common causes of hypothesized causal relationships. When the no-omitted-confounder assumption is violated, inference about indirect effects can be severely biased and the results potentially misleading. Despite the increasing attention to address confounder bias in single-level mediation, this topic has received little attention in the growing area of multilevel mediation analysis. A formidable challenge is that the no-omitted-confounder assumption is untestable. To address this challenge, we first analytically examined the biasing effects of potential violations of this critical assumption in a two-level mediation model with random intercepts and slopes, in which all the variables are measured at Level 1. Our analytic results show that omitting a Level 1 confounder can yield misleading results about key quantities of interest, such as Level 1 and Level 2 indirect effects. Second, we proposed a sensitivity analysis technique to assess the extent to which potential violation of the no-omitted-confounder assumption might invalidate or alter the conclusions about the indirect effects observed. We illustrated the methods using an empirical study and provided computer code so that researchers can implement the methods discussed.
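
    The single-level analogue of the problem is easy to demonstrate by simulation: omitting a confounder of the mediator-outcome path inflates the a*b indirect-effect estimate even when the true indirect effect is known. All coefficients below are invented for the demonstration.

    ```python
    # Omitted-confounder bias in the a*b indirect effect (single-level sketch).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 20_000
    X = rng.standard_normal(n)
    U = rng.standard_normal(n)                       # unobserved confounder
    M = 0.5 * X + 0.7 * U + rng.standard_normal(n)   # mediator
    Y = 0.4 * M + 0.7 * U + rng.standard_normal(n)   # outcome (no direct effect)

    def ols(y, *cols):
        Z = np.column_stack(cols + (np.ones(len(y)),))
        return np.linalg.lstsq(Z, y, rcond=None)[0]

    a = ols(M, X)[0]
    b_naive = ols(Y, M, X)[0]       # U omitted: b absorbs the confounding
    b_adj = ols(Y, M, X, U)[0]      # U included: approximately unbiased
    print(f"a*b naive={a*b_naive:.3f}, adjusted={a*b_adj:.3f}, true=0.200")
    ```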

  11. Fair lineups are better than biased lineups and showups, but not because they increase underlying discriminability.

    PubMed

    Smith, Andrew M; Wells, Gary L; Lindsay, R C L; Penrod, Steven D

    2017-04-01

    Receiver Operating Characteristic (ROC) analysis has recently come in vogue for assessing the underlying discriminability and the applied utility of lineup procedures. Two primary assumptions underlie recommendations that ROC analysis be used to assess the applied utility of lineup procedures: (a) ROC analysis of lineups measures underlying discriminability, and (b) the procedure that produces superior underlying discriminability produces superior applied utility. These same assumptions underlie a recently derived diagnostic-feature detection theory, a theory of discriminability, intended to explain recent patterns observed in ROC comparisons of lineups. We demonstrate, however, that these assumptions are incorrect when ROC analysis is applied to lineups. We also demonstrate that a structural phenomenon of lineups, differential filler siphoning, and not the psychological phenomenon of diagnostic-feature detection, explains why lineups are superior to showups and why fair lineups are superior to biased lineups. In the process of our proofs, we show that computational simulations have assumed, unrealistically, that all witnesses share exactly the same decision criteria. When criterial variance is included in computational models, differential filler siphoning emerges. The result proves dissociation between ROC curves and underlying discriminability: Higher ROC curves for lineups than for showups and for fair than for biased lineups despite no increase in underlying discriminability. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  12. Missing data in trial‐based cost‐effectiveness analysis: An incomplete journey

    PubMed Central

    Gomes, Manuel; Carpenter, James R.

    2018-01-01

    SUMMARY Cost‐effectiveness analyses (CEA) conducted alongside randomised trials provide key evidence for informing healthcare decision making, but missing data pose substantive challenges. Recently, there have been a number of developments in methods and guidelines addressing missing data in trials. However, it is unclear whether these developments have permeated CEA practice. This paper critically reviews the extent of and methods used to address missing data in recently published trial‐based CEA. Issues of the Health Technology Assessment journal from 2013 to 2015 were searched. Fifty‐two eligible studies were identified. Missing data were very common; the median proportion of trial participants with complete cost‐effectiveness data was 63% (interquartile range: 47%–81%). The most common approach for the primary analysis was to restrict analysis to those with complete data (43%), followed by multiple imputation (30%). Half of the studies conducted some sort of sensitivity analyses, but only 2 (4%) considered possible departures from the missing‐at‐random assumption. Further improvements are needed to address missing data in cost‐effectiveness analyses conducted alongside randomised trials. These should focus on limiting the extent of missing data, choosing an appropriate method for the primary analysis that is valid under contextually plausible assumptions, and conducting sensitivity analyses to departures from the missing‐at‐random assumption. PMID:29573044

  13. Pirfenidone in patients with rapidly progressive interstitial lung disease associated with clinically amyopathic dermatomyositis

    NASA Astrophysics Data System (ADS)

    Li, Ting; Guo, Li; Chen, Zhiwei; Gu, Liyang; Sun, Fangfang; Tan, Xiaoming; Chen, Sheng; Wang, Xiaodong; Ye, Shuang

    2016-09-01

    To evaluate the efficacy of pirfenidone in patients with rapidly progressive interstitial lung disease (RPILD) related to clinically amyopathic dermatomyositis (CADM), we conducted an open-label, prospective study with matched retrospective controls. Thirty patients diagnosed with CADM-RPILD with a disease duration <6 months at Renji Hospital South Campus from June 2014 to November 2015 were prospectively enrolled and treated with pirfenidone at a target dose of 1800 mg/d in addition to conventional treatment, such as a glucocorticoid and/or other immunosuppressants. Matched patients without pirfenidone treatment (n = 27) were retrospectively selected as controls between October 2012 and September 2015. We found that the pirfenidone add-on group displayed a trend of lower mortality compared with the control group (36.7% vs 51.9%, p = 0.2226). Furthermore, the subgroup analysis indicated that the pirfenidone add-on had no impact on the survival of acute ILD patients (disease duration <3 months) (50% vs 50%, p = 0.3862) while for subacute ILD patients (disease duration 3-6 months), the pirfenidone add-on (n = 10) had a significantly higher survival rate compared with the control subgroup (n = 9) (90% vs 44.4%, p = 0.0450). Our data indicated that the pirfenidone add-on may improve the prognosis of patients with subacute ILD related to CADM.

  14. Pirfenidone in patients with rapidly progressive interstitial lung disease associated with clinically amyopathic dermatomyositis.

    PubMed

    Li, Ting; Guo, Li; Chen, Zhiwei; Gu, Liyang; Sun, Fangfang; Tan, Xiaoming; Chen, Sheng; Wang, Xiaodong; Ye, Shuang

    2016-09-12

    To evaluate the efficacy of pirfenidone in patients with rapidly progressive interstitial lung disease (RPILD) related to clinically amyopathic dermatomyositis (CADM), we conducted an open-label, prospective study with matched retrospective controls. Thirty patients diagnosed with CADM-RPILD with a disease duration <6 months at Renji Hospital South Campus from June 2014 to November 2015 were prospectively enrolled and treated with pirfenidone at a target dose of 1800 mg/d in addition to conventional treatment, such as a glucocorticoid and/or other immunosuppressants. Matched patients without pirfenidone treatment (n = 27) were retrospectively selected as controls between October 2012 and September 2015. We found that the pirfenidone add-on group displayed a trend of lower mortality compared with the control group (36.7% vs 51.9%, p = 0.2226). Furthermore, the subgroup analysis indicated that the pirfenidone add-on had no impact on the survival of acute ILD patients (disease duration <3 months) (50% vs 50%, p = 0.3862); while for subacute ILD patients (disease duration 3-6 months), the pirfenidone add-on (n = 10) had a significantly higher survival rate compared with the control subgroup (n = 9) (90% vs 44.4%, p = 0.0450). Our data indicated that the pirfenidone add-on may improve the prognosis of patients with subacute ILD related to CADM.

  15. Development of an add-on kit for scanning confocal microscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Guo, Kaikai; Zheng, Guoan

    2017-03-01

    Scanning confocal microscopy is a standard choice for many fluorescence imaging applications in basic biomedical research. It is able to produce optically sectioned images and provides acquisition versatility to address many samples and application demands. However, scanning a focused point across the specimen limits the speed of image acquisition. As a result, scanning confocal microscopes only work well with stationary samples. Researchers have performed parallel confocal scanning using a digital micromirror device (DMD) to project a scanning multi-point pattern across the sample. DMD-based parallel confocal systems increase the imaging speed while maintaining the optical sectioning ability. In this paper, we report the development of an add-on kit for high-speed and low-cost confocal microscopy. By adapting this add-on kit to an existing regular microscope, one can convert it into a confocal microscope without significant hardware modifications. Compared with current DMD-based implementations, the reported approach is able to recover multiple layers along the z axis simultaneously. It may find applications in wafer inspection and 3D metrology of semiconductor circuits. The dissemination of the proposed add-on kit, at a budget under $1000, could also lead to new types of experimental designs for biological research labs, e.g., cytology analysis in cell culture experiments, genetic studies on multicellular organisms, pharmaceutical drug profiling, RNA interference studies, and investigation of microbial communities in environmental systems.

  16. Achieving Robustness to Uncertainty for Financial Decision-making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnum, George M.; Van Buren, Kendra L.; Hemez, Francois M.

    2014-01-10

    This report investigates the concept of robustness analysis to support financial decision-making. Financial models that forecast future stock returns or market conditions depend on assumptions that might be unwarranted and variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty, and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the "distance," or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance with "risk", which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed using the MATLAB® programming environment, such that the user can control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be re-written using the Python scientific programming software. This change will achieve greater cross-platform compatibility, better portability, allow for a more professional appearance, and render it independent from a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models simultaneously. When two models reflect past data with similar accuracy, the more robust of the two is preferable for decision-making because its predictions are, by definition, less sensitive to the uncertainty.
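
    In its simplest one-dimensional form, the info-gap robustness described here has a closed form; the sketch below assumes a fractional-error uncertainty model around a forecast return, which is our illustration rather than the report's actual model.

    ```python
    # Info-gap robustness for a demanded return r_c under the model
    # U(alpha) = { r : |r - r_hat| <= alpha * |r_hat| }, with r_hat > 0.
    def robustness(r_hat, r_c):
        """Worst case in U(alpha) is r_hat*(1 - alpha); demanding it stay
        above r_c gives alpha_hat = (r_hat - r_c) / r_hat (0 if infeasible)."""
        return max((r_hat - r_c) / r_hat, 0.0)

    r_hat = 0.08                      # forecast annual return (illustrative)
    for r_c in (0.06, 0.04, 0.02):    # progressively weaker demands
        print(f"demand {r_c:.0%}: robustness alpha = {robustness(r_hat, r_c):.2f}")
    ```

    The trade-off the report describes falls out directly: the weaker the demanded performance, the more forecast error the decision can absorb.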

  17. Graphical tools for network meta-analysis in STATA.

    PubMed

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  18. Graphical Tools for Network Meta-Analysis in STATA

    PubMed Central

    Chaimani, Anna; Higgins, Julian P. T.; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results. PMID:24098547

  19. Modeling intelligent adversaries for terrorism risk assessment: some necessary conditions for adversary models.

    PubMed

    Guikema, Seth

    2012-07-01

    Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.

  20. Leaky vaccines protect highly exposed recipients at a lower rate: implications for vaccine efficacy estimation and sieve analysis.

    PubMed

    Edlefsen, Paul T

    2014-01-01

    "Leaky" vaccines are those for which vaccine-induced protection reduces infection rates on a per-exposure basis, as opposed to "all-or-none" vaccines, which reduce infection rates to zero for some fraction of subjects, independent of the number of exposures. Leaky vaccines therefore protect subjects with fewer exposures at a higher effective rate than subjects with more exposures. This simple observation has serious implications for analysis methodologies that rely on the assumption that the vaccine effect is homogeneous across subjects. We argue and show through examples that this heterogeneous vaccine effect leads to a violation of the proportional hazards assumption, to incomparability of infected cases across treatment groups, and to nonindependence of the distributions of the competing failure processes in a competing risks setting. We discuss implications for vaccine efficacy estimation, correlates of protection analysis, and mark-specific efficacy analysis (also known as sieve analysis).

  1. Design and Analysis of an Electromagnetic Thrust Bearing

    NASA Technical Reports Server (NTRS)

    Banerjee, Bibhuti; Rao, Dantam K.

    1996-01-01

    A double-acting electromagnetic thrust bearing is normally used to counter the axial loads in many rotating machines that employ magnetic bearings. It essentially consists of an actuator and drive electronics. Existing thrust bearing design programs are based on several assumptions. These assumptions, however, are often violated in practice. For example, no distinction is made between maximum external loads and maximum bearing forces, which are assumed to be identical. Furthermore, it is assumed that the maximum flux density in the air gap occurs at the nominal gap position of the thrust runner. The purpose of this paper is to present a clear theoretical basis for the design of the electromagnetic thrust bearing which obviates such assumptions.
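
    For orientation, the axial pull of each actuator face is commonly approximated by the magnetic-gap force formula F = B^2 A / (2 mu0), and a double-acting bearing nets the two opposing faces; the numbers below are illustrative and not from the paper.

    ```python
    # Net axial force of a double-acting electromagnetic thrust bearing.
    import math

    MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

    def face_force(b_tesla, area_m2):
        return b_tesla ** 2 * area_m2 / (2 * MU0)  # pull of one pole face

    def net_axial_force(b1, b2, area_m2):
        return face_force(b1, area_m2) - face_force(b2, area_m2)

    # 1.0 T on one face vs 0.4 T on the opposing face, 2e-3 m^2 pole area:
    print(f"{net_axial_force(1.0, 0.4, 2e-3):.0f} N")  # ~668 N net
    ```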

  2. Adaptive windowing and windowless approaches to estimate dynamic functional brain connectivity

    NASA Astrophysics Data System (ADS)

    Yaesoubi, Maziar; Calhoun, Vince D.

    2017-08-01

    In this work, we discuss estimation of the dynamic dependence of a multivariate signal. Commonly used approaches are often based on a locality assumption (e.g., sliding window), which can miss spontaneous changes by blurring them with local but unrelated changes. We discuss recent approaches to overcome this limitation, including 1) a wavelet-space approach, essentially adapting the window to the underlying frequency content, and 2) a sparse signal representation, which removes any locality assumption. The latter is especially useful when there is no prior knowledge of the validity of such an assumption, as in brain analysis. Results on several large resting-fMRI data sets highlight the potential of these approaches.
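
    A minimal numpy sketch of the sliding-window baseline the authors improve on, showing the locality assumption explicitly (the synthetic signals and the window length are illustrative, not from the paper):

        import numpy as np

        rng = np.random.default_rng(0)
        T, w = 300, 30                      # time points, window length
        x, y = rng.standard_normal(T), rng.standard_normal(T)
        y[150:] += x[150:]                  # dependence switches on mid-scan

        # Correlation within each sliding window: every estimate blends all
        # changes inside the window, which is the blurring the abstract notes.
        dyn_corr = np.array([np.corrcoef(x[t:t + w], y[t:t + w])[0, 1]
                             for t in range(T - w)])
        print(dyn_corr[:5].round(2), dyn_corr[-5:].round(2))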

  3. Speed-of-light limitations in passive linear media

    NASA Astrophysics Data System (ADS)

    Welters, Aaron; Avniel, Yehuda; Johnson, Steven G.

    2014-08-01

    We prove that well-known speed-of-light restrictions on electromagnetic energy velocity can be extended to a new level of generality, encompassing even nonlocal chiral media in periodic geometries, while at the same time weakening the underlying assumptions to only passivity and linearity of the medium (either with a transparency window or with dissipation). As was also shown by other authors under more limiting assumptions, passivity alone is sufficient to guarantee causality and positivity of the energy density (with no thermodynamic assumptions). Our proof is general enough to include a very broad range of material properties, including anisotropy, bianisotropy (chirality), nonlocality, dispersion, periodicity, and even delta functions or similar generalized functions. We also show that the "dynamical energy density" used by some previous authors in dissipative media reduces to the standard Brillouin formula for dispersive energy density in a transparency window. The results in this paper are proved by exploiting deep results from linear-response theory, harmonic analysis, and functional analysis that had previously not been brought together in the context of electrodynamics.

  4. Strain Profiling of Fatigue Crack Overload Effects Using Energy Dispersive X-Ray Diffraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croft,M.; Zhong, Z.; Jisrawi, N.

    In this paper, an assessment of commonly used assumptions associated with ΔK_eff and their implications for fatigue crack growth (FCG) predictions is presented in light of existing experimental and numerical data. In particular, the following assumptions are examined: (1) ΔK_eff fully describes cyclic stresses and strains in the crack-tip vicinity; (2) K_op can be determined experimentally or numerically with certain accuracy; (3) overload alters K_op but not K_max and the associated σ_max at the crack-tip 'process zone'; (4) contact of crack faces curtails the crack driving force in terms of ΔK_eff. The analysis indicates that there is insufficient support to justify the above assumptions. On the contrary, the analysis demonstrates that a two-parameter fatigue crack driving force in terms of ΔK and K_max, which accounts for both the applied and the internal stresses, should be used in FCG analyses and predictions.
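
    The abstract does not give the functional form of the two-parameter driving force; one common parameterization in the fatigue literature (e.g., Kujawski's K*) combines the range and the peak of the stress intensity factor and is quoted here only for orientation:

        \Delta\kappa = \left(K_{\max}\right)^{p} \left(\Delta K\right)^{1-p}, \qquad 0 \le p \le 1

    with p a material-dependent weighting exponent; p = 0 recovers the pure ΔK description that the paper argues is insufficient.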

  5. Analysis and design of second-order sliding-mode algorithms for quadrotor roll and pitch estimation.

    PubMed

    Chang, Jing; Cieslak, Jérôme; Dávila, Jorge; Zolghadri, Ali; Zhou, Jun

    2017-11-01

    The problem addressed in this paper is that of quadrotor roll and pitch estimation without any assumption about the knowledge of perturbation bounds when Inertial Measurement Unit (IMU) data or position measurements are available. A Smooth Sliding Mode (SSM) algorithm is first designed to provide reliable estimation under a smooth disturbance assumption. This assumption is then relaxed with the second proposed Adaptive Sliding Mode (ASM) algorithm, which deals with disturbances of unknown bounds. In addition, the analysis of the observers is extended to the case where measurements are corrupted by bias and noise. The gains of the proposed algorithms were deduced from a Lyapunov function. Furthermore, some useful guidelines are provided for the selection of the observer tuning parameters. The performance of these two approaches is evaluated using a nonlinear simulation model and considering either accelerometer or position measurements. The simulation results demonstrate the benefits of the proposed solutions. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  6. Creating and Manipulating a Domain-Specific Formal Object Base to Support a Domain-Oriented Application Composition System

    DTIC Science & Technology

    1992-12-01

    Only fragments of this record's abstract survive extraction: they reference adding new attributes as needed, the Feature-Oriented Domain Analysis (FODA) feasibility study by Kang, Kyo C. and others, a related University of Texas at Austin dissertation (1990), and sections on requirements languages and domain analysis.

  7. A model for heliospheric flux-ropes

    NASA Astrophysics Data System (ADS)

    Nieves-Chinchilla, T.; Linton, M.; Vourlidas, A.; Hidalgo, M. A. U.

    2017-12-01

    This work presents an analytical flux-rope model that explores different levels of complexity, starting from a circular-cylindrical geometry. The framework of this series of models was established by Nieves-Chinchilla et al. (2016) with the circular-cylindrical analytical flux-rope model. The model attempts to describe the magnetic flux-rope topology with a distorted cross-section as a possible consequence of the interaction with the solar wind. In this model, the flux rope is completely described in a non-orthogonal geometry. The Maxwell equations are solved using tensor calculus consistent with the chosen geometry, invariance along the axial direction, and the assumption of no radial current density. The model is generalized in terms of the radial and azimuthal dependence of the poloidal and axial current density components. The misalignment between current density and magnetic field is studied in detail for several example profiles of the axial and poloidal current density components. This theoretical analysis provides a map of the force distribution inside the flux rope. For reconstruction of heliospheric flux ropes, the circular-cylindrical reconstruction technique has been adapted to the new geometry, applied to in situ ICMEs with an entrained flux rope, and tested on cases with clear in situ signatures of distortion. The model adds a piece to the puzzle of the physical-analytical representation of these magnetic structures, which should be evaluated with the ultimate goal of reconciling in situ reconstructions with 3D remote-sensing CME reconstructions. Other effects, such as axial curvature and/or expansion, could be incorporated in the future to fully understand the magnetic structure.

  8. A hidden oncogenic positive feedback loop caused by crosstalk between Wnt and ERK pathways.

    PubMed

    Kim, D; Rath, O; Kolch, W; Cho, K-H

    2007-07-05

    The Wnt and the extracellular signal-regulated kinase (ERK) pathways are both involved in the pathogenesis of various kinds of cancers. Recently, the existence of crosstalk between the Wnt and ERK pathways was reported. Gathering all reported results, we have discovered a positive feedback loop embedded in the crosstalk between the Wnt and ERK pathways. We have developed a plausible model that represents the role of this hidden positive feedback loop in the Wnt/ERK pathway crosstalk, based on the integration of experimental reports and employing established basic mathematical models of each pathway. Our analysis shows that the positive feedback loop can generate bistability in both the Wnt and ERK signaling pathways, and this prediction was further validated by experiments. In particular, using the commonly accepted assumption that mutations in signaling proteins contribute to carcinogenesis, we have found two conditions through which mutations could evoke an irreversible response leading to a sustained activation of both pathways. One condition is enhanced production of beta-catenin, the other a reduced velocity of MAP kinase phosphatase(s). These enable high activities of the Wnt and ERK pathways to be maintained even without a persistent extracellular signal. Thus, our study adds a novel aspect to the molecular mechanisms of carcinogenesis by showing that mutational changes in individual proteins can cause fundamental functional changes well beyond the pathway they function in, via a positive feedback loop embedded in crosstalk. Crosstalk between signaling pathways thus provides a vehicle through which mutations of individual components can affect properties of the system at a larger scale.
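
    How a positive feedback loop produces bistability can be sketched with a generic one-variable toy model (all rate constants hypothetical; this is not the authors' full Wnt/ERK system):

        # dx/dt = basal input + sigmoidal self-activation - first-order decay.
        # For suitable constants this has two stable steady states.
        k0, k1, K, k2 = 0.05, 1.0, 0.5, 1.0

        def dxdt(x):
            return k0 + k1 * x**2 / (K**2 + x**2) - k2 * x

        def steady_state(x0, dt=0.01, steps=20000):
            x = x0
            for _ in range(steps):
                x += dt * dxdt(x)
            return x

        # Low and high initial conditions settle at different fixed points,
        # i.e. a transient stimulus can leave the loop permanently "on":
        print(round(steady_state(0.0), 3), round(steady_state(1.0), 3))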

  9. Construct validity of the Swedish version of the revised piper fatigue scale in an oncology sample--a Rasch analysis.

    PubMed

    Lundgren-Nilsson, Asa; Dencker, Anna; Jakobsson, Sofie; Taft, Charles; Tennant, Alan

    2014-06-01

    Fatigue is a common and distressing symptom in cancer patients due to both the disease and its treatments. The concept of fatigue is multidimensional and includes both physical and mental components. The 22-item Revised Piper Fatigue Scale (RPFS) is a multidimensional instrument developed to assess cancer-related fatigue. This study reports on the construct validity of the Swedish version of the RPFS from the perspective of Rasch measurement. The Swedish version of the RPFS was answered by 196 cancer patients fatigued after 4 to 5 weeks of curative radiation therapy. Data from the scale were fitted to the Rasch measurement model. This involved testing a series of assumptions, including the stochastic ordering of items, local response dependency, and unidimensionality. A series of fit statistics were computed, differential item functioning (DIF) was tested, and local response dependency was accommodated through testlets. The Behavioral, Affective and Sensory domains all satisfied the Rasch model expectations. No DIF was observed, and all domains were found to be unidimensional. The Mood/Cognitive scale failed to fit the model, and substantial multidimensionality was found. Splitting the scale between Mood and Cognitive items resolved fit to the Rasch model, and new domains were unidimensional without DIF. The current Rasch analyses add to the evidence of measurement properties of the scale and show that the RPFS has good psychometric properties and works well to measure fatigue. The original four-factor structure, however, was not supported. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  10. Why would we use the Sediment Isotope Tomography (SIT) model to establish a 210Pb-based chronology in recent-sediment cores?

    PubMed

    Abril Hernández, José-María

    2015-05-01

    After half a century, the use of unsupported ²¹⁰Pb (²¹⁰Pb_exc) is still far from being a well-established dating tool for recent sediments with widespread applicability. Recent results from the statistical analysis of time series of fluxes, mass sediment accumulation rates (SAR), and initial activities, derived from varved sediments, place serious constraints on the assumption of constant fluxes, which is widely used in dating models. The Sediment Isotope Tomography (SIT) model, under the assumption of no post-depositional redistribution, is used for dating recent sediments in scenarios in which fluxes and SAR are uncorrelated and both vary with time. By using a simple graphical analysis, this paper shows that under the above assumptions any given ²¹⁰Pb_exc profile, even with the restriction of a discrete set of reference points, is compatible with an infinite number of chronological lines, thus generating an infinite number of mathematically exact solutions for histories of initial activity concentrations, SAR, and fluxes onto the sediment-water interface (SWI), with the latter two ranging from zero up to infinity. In particular, SIT results, without additional assumptions, cannot contain any statistically significant difference with respect to the exact solutions consisting of intervals of constant SAR or constant fluxes (both being consistent with the reference points). Therefore, there is no benefit in its use as a dating tool without the explicit introduction of additional restrictive assumptions about fluxes, SAR, and/or their interrelationship. Copyright © 2015 Elsevier Ltd. All rights reserved.
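
    For contrast with SIT, the classical constant flux-constant sedimentation (CF:CS) model pins the chronology with a single exponential; a minimal sketch (the activities and SAR are illustrative numbers, not data from the paper):

        import numpy as np

        lam = 0.03114            # 210Pb decay constant, 1/yr
        A0, r = 200.0, 0.10      # surface activity (Bq/kg), SAR (g/cm2/yr)
        m = np.array([0.5, 1.0, 2.0, 4.0])   # cumulative dry mass (g/cm2)

        # CF:CS: A(m) = A0 * exp(-lam * m / r), so age t = m / r -- one
        # chronology, rather than the infinite family the SIT setup admits.
        A = A0 * np.exp(-lam * m / r)
        t = np.log(A0 / A) / lam
        print(A.round(1), t.round(1))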

  11. Where Are the Logical Errors in the Theory of Big Bang?

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2015-04-01

    The critical analysis of the foundations of the theory of the Big Bang is proposed. The unity of formal logic and rational dialectics is the methodological basis of the analysis. It is argued that the starting point of the theory of the Big Bang contains three fundamental logical errors. The first error is the assumption that a macroscopic object (having qualitative determinacy) can have an arbitrarily small size and can be in the singular state (i.e., in a state that has no qualitative determinacy). This assumption implies that the transition, (macroscopic object having qualitative determinacy) --> (singular state of matter having no qualitative determinacy), leads to loss of the information contained in the macroscopic object. The second error is the assumption that there exist the void and a boundary between matter and void. But if such a boundary existed, then it would mean that the void has dimensions and can be measured. The third error is the assumption that the singular state of matter can make a transition into the normal state without the existence of a program of qualitative and quantitative development of the matter, and without the controlling influence of another (independent) object. However, these assumptions conflict with practice and, consequently, with formal logic, rational dialectics, and cybernetics. Indeed, from the point of view of cybernetics, the transition, (singular state of the Universe) --> (normal state of the Universe), would be possible only if there were a Managed Object that is outside the Universe and has full, complete, and detailed information about the Universe. Thus, the theory of the Big Bang is a scientific fiction.

  12. Analysis and evaluation in the production process and equipment area of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Wolf, M.; Goldman, H.

    1981-01-01

    The attributes of the various metallization processes were investigated. It is shown that several metallization process sequences will lead to adequate metallization for large-area, high-performance solar cells at a metallization add-on price in the range of $6 to $12/m², or $0.04 to $0.08/W(peak), assuming 15% efficiency. Conduction layer formation by thick-film silver or by tin or tin/lead solder leads to metallization add-on prices significantly above the $6 to $12/m² range. The wet chemical processes of electroless and electrolytic plating for strike/barrier layer and conduction layer formation, respectively, seem to be most cost effective.
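
    The per-watt figures follow from the areal prices under the usual rating insolation of 1000 W/m² (the 15% efficiency is from the record; the insolation value is the standard rating convention, assumed here):

        \frac{\$6/\mathrm{m^2}}{0.15 \times 1000\ \mathrm{W/m^2}} = \$0.04/\mathrm{W_{peak}}, \qquad \frac{\$12/\mathrm{m^2}}{0.15 \times 1000\ \mathrm{W/m^2}} = \$0.08/\mathrm{W_{peak}}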

  13. Sensitivity of wildlife habitat models to uncertainties in GIS data

    NASA Technical Reports Server (NTRS)

    Stoms, David M.; Davis, Frank W.; Cogan, Christopher B.

    1992-01-01

    Decision makers need to know the reliability of output products from GIS analysis. For many GIS applications, it is not possible to compare these products to an independent measure of 'truth'. Sensitivity analysis offers an alternative means of estimating reliability. In this paper, we present a GIS-based statistical procedure for estimating the sensitivity of wildlife habitat models to uncertainties in input data and model assumptions. The approach is demonstrated in an analysis of habitat associations derived from a GIS database for the endangered California condor. Alternative data sets were generated to compare results over a reasonable range of assumptions about several sources of uncertainty. Sensitivity analysis indicated that condor habitat associations are relatively robust, and the results have increased our confidence in our initial findings. The uncertainties and methods described in the paper have general relevance for many GIS applications.

  14. Modelling lecturer performance index of private university in Tulungagung by using survival analysis with multivariate adaptive regression spline

    NASA Astrophysics Data System (ADS)

    Hasyim, M.; Prastyo, D. D.

    2018-03-01

    Survival analysis models the relationship between independent variables and survival time as the dependent variable. In practice, not all survival data can be recorded completely, for various reasons; in such situations the data are called censored. Moreover, several models for survival analysis require strong assumptions. One approach in survival analysis is nonparametric modeling, which imposes more relaxed assumptions. In this research, the nonparametric approach employed is Multivariate Adaptive Regression Splines (MARS). This study aims to measure the performance of a private university's lecturers. The survival time in this study is the duration a lecturer needs to obtain the professional certificate. The results show that research activity is a significant factor, along with developing course materials, good publication in international or national journals, and participation in research collaboration.

  15. Clarifying Objectives and Results of Equivalent System Mass Analyses for Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Levri, Julie A.; Drysdale, Alan E.

    2003-01-01

    This paper discusses some of the analytical decisions that an investigator must make during the course of a life support system trade study. Equivalent System Mass (ESM) is often applied to evaluate trade study options in the Advanced Life Support (ALS) Program. ESM can be used to identify which of several options that meet all requirements are most likely to have the lowest cost. It can also be used to identify which of the many interacting parts of a life support system have the greatest impact and sensitivity to assumptions. This paper summarizes recommendations made in the newly developed ALS ESM Guidelines Document and expands on some of the issues relating to trade studies that involve ESM. In particular, the following three points are expounded: 1) The importance of objectives: analysis objectives drive the approach to any trade study, including identification of assumptions, selection of characteristics to compare in the analysis, and the most appropriate techniques for reflecting those characteristics. 2) The importance of results interpretation: the accuracy desired in the results depends upon the analysis objectives, whereas the realized accuracy is determined by the data quality and degree of detail in analysis methods. 3) The importance of analysis documentation: documentation of assumptions and data modifications is critical for effective peer evaluation of any trade study. ESM results are analysis-specific and should always be reported in context, rather than as solitary values. For this reason, results reporting should be done with adequate rigor to allow for verification by other researchers.
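
    For context, the ESM metric folds several resource demands into a single mass-equivalent figure; it is commonly written in a form along these lines (reproduced from memory of the ALS guidelines, so treat the exact symbols as a sketch):

        \mathrm{ESM} = M + V\,V_{eq} + P\,P_{eq} + C\,C_{eq} + CT\,D\,CT_{eq}

    where M is mass, V volume, P power, C cooling, CT annual crew time, D mission duration, and each eq factor converts the corresponding resource to its launch-mass equivalent for the assumed mission infrastructure.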

  16. The ssWavelets package

    Treesearch

    Jeffrey H. Gove

    2017-01-01

    This package adds several classes, generics, and associated methods, as well as a few utility functions, to help with wavelet decomposition of sampling surfaces generated using sampSurf. As such, it can be thought of as an extension to sampSurf for wavelet analysis.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pinilla, Maria Isabel

    This report seeks to study and benchmark code predictions against experimental data; determine parameters to match MCNP-simulated detector response functions to experimental stilbene measurements; add stilbene processing capabilities to DRiFT; and improve NEUANCE detector array modeling and analysis using new MCNP6 and DRiFT features.

  18. Numerical analysis on the cutting and finishing efficiency of MRAFF process

    NASA Astrophysics Data System (ADS)

    Lih, F. L.

    2016-03-01

    The aim of the present research is to conduct a numerical study of the characteristics of a two-phase magnetorheological fluid under different operating conditions, using a finite-volume method with the SIMPLE algorithm and an add-on MHD code.

  19. Analysis of Strategies for Multiple Emissions from Electric Power SO2, NOX, CO2, Mercury and RPS

    EIA Publications

    2001-01-01

    At the request of the Subcommittee, the Energy Information Administration prepared an initial report that focused on the impacts of reducing power sector NOx, SO2, and CO2 emissions. The current report extends the earlier analysis to add the impacts of reducing power sector mercury emissions and introducing renewable portfolio standard (RPS) requirements.

  20. Course-Shopping in the Urban Community Colleges: An Analysis of Student Drop and Add Activities.

    ERIC Educational Resources Information Center

    Hagedorn, Linda Serra; Maxwell, William B.; Cypers, Scott; Moon, Hye Sun; Lester, Jaime

    This study examines the course shopping behaviors of approximately 5,000 community college students enrolled across the nine campuses of the Los Angeles Community College District in spring 2001. The sample students are representative of the district. For the purpose of this analysis, the authors define course shopping as: (1) cyclic shopping, the…

  1. Rasch Analysis of the Locus-of-Hope Scale. Brief Report

    ERIC Educational Resources Information Center

    Gadiana, Leny G.; David, Adonis P.

    2015-01-01

    The Locus-of-Hope Scale (LHS) was developed as a measure of the locus-of-hope dimensions (Bernardo, 2010). The present study adds to the emerging literature on locus-of-hope by assessing the psychometric properties of the LHS using Rasch analysis. The results from the Rasch analyses of the four subscales of LHS provided evidence on the…

  2. Distribution and progression of add power among people in need of near correction.

    PubMed

    Han, Xiaotong; Lee, Pei Ying; Liu, Chi; He, Mingguang

    2018-04-16

    This study helps to better understand the need for and trend in presbyopic add power in the aging society. The distribution and progression of presbyopic add power in East Asian populations are largely unknown. Prospective cohort study. A total of 303 participants from a population-based study of residents aged 35 years and older in Guangzhou, China. Visual acuity (VA) testing and non-cycloplegic automated refraction were performed at baseline in 2008 and at the 6-year follow-up per standardized protocol. Participants with presenting near VA ≤ 20/40 underwent distance subjective refraction and add power measurement by increasing plus lenses at a standard distance of 40 cm at each visit. Add power at baseline and follow-ups. Mean (standard deviation) age of the study participants was 57.6 (11.1) years and 50.2% were female. The mean add power at baseline was 1.43, 1.73, 2.03 and 2.20 diopters (D) for individuals in the age groups of 35-44, 45-54, 55-64 and 65+ years, respectively. Participants with older age and lower educational level had significantly higher add power requirements (P < 0.001). The overall 6-year increase in add power was 0.15 D (95% CI: 0.06 to 0.25), and was smaller in myopic subjects (P = 0.03). Baseline age and add power, but not changes in biometric factors, were associated with longitudinal change in add power (P < 0.001). The distribution and progression of add power in Chinese subjects differed from those previously suggested by Caucasian studies. More studies are needed to establish up-to-date age-related add power prescription norms for populations of different ethnicities. © 2018 Royal Australian and New Zealand College of Ophthalmologists.

  3. Metabolomic profiling of doxycycline treatment in chronic obstructive pulmonary disease.

    PubMed

    Singh, Brajesh; Jana, Saikat K; Ghosh, Nilanjana; Das, Soumen K; Joshi, Mamata; Bhattacharyya, Parthasarathi; Chaudhury, Koel

    2017-01-05

    Serum metabolic profiling can identify the metabolites responsible for discrimination between doxycycline-treated and untreated chronic obstructive pulmonary disease (COPD) and explain the possible effect of doxycycline in improving the disease condition. 1H nuclear magnetic resonance (NMR)-based metabolomics was used to obtain serum metabolic profiles of 60 add-on doxycycline-treated COPD patients and 40 patients receiving standard therapy. The acquired data were analyzed using multivariate principal component analysis (PCA), partial least-squares discriminant analysis (PLS-DA), and orthogonal projection to latent structures with discriminant analysis (OPLS-DA). A clear metabolic differentiation was apparent between the pre- and post-doxycycline-treated groups. The distinguishing metabolites lactate and fatty acids were significantly downregulated, and formate, citrate, imidazole and L-arginine upregulated. Lactate and folate were further validated biochemically. Metabolic changes, such as a decreased lactate level, inhibited arginase activity and a lowered fatty acid level, observed in COPD patients in response to add-on doxycycline treatment, reflect the anti-inflammatory action of the drug. Doxycycline as a possible therapeutic option for COPD seems promising. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Who gains clinical benefit from using insulin pump therapy? A qualitative study of the perceptions and views of health professionals involved in the Relative Effectiveness of Pumps over MDI and Structured Education (REPOSE) trial.

    PubMed

    Lawton, J; Kirkham, J; Rankin, D; White, D A; Elliott, J; Jaap, A; Smithson, W H; Heller, S

    2016-02-01

    To explore health professionals' views about insulin pump therapy [continuous subcutaneous insulin infusion (CSII)] and the types of individuals they thought would gain greatest clinical benefit from using this treatment. In-depth interviews with staff (n = 18) who delivered the Relative Effectiveness of Pumps Over MDI and Structured Education (REPOSE) trial. Data were analysed thematically. Staff perceived insulin pumps as offering a better self-management tool to some individuals due to the drip feed of insulin, the ability to alter basal rates and other advanced features. However, staff also noted that, because of the diversity of features on offer, CSII is a more technically complex therapy to execute than multiple daily injections. For this reason, staff described how, alongside clinical criteria, they had tended to select individuals for CSII in routine clinical practice based on their perceptions about whether they possessed the personal and psychological attributes needed to make optimal use of pump technology. Staff also described how their assumptions about personal and psychological suitability had been challenged by working on the REPOSE trial and observing individuals make effective use of CSII who they would not have recommended for this type of therapy in routine clinical practice. Our findings add to those studies that highlight the difficulties of using patient characteristics and variables to predict clinical success using CSII. To promote equitable access to CSII, attitudinal barriers and prejudicial assumptions amongst staff about who is able to make effective use of CSII may need to be addressed. © 2015 The Authors. Diabetic Medicine © 2015 Diabetes UK.

  5. Mediation, identification, and plausibility: an illustration using children's mental health services.

    PubMed

    Foster, E Michael

    2014-10-01

    Analyses of mediation are important for understanding the effects of mental health services and treatments. The most common approach is to add potential mediators as regressors and to estimate the direct and indirect effects of the treatment of interest. This practice makes the strong assumption that the mediator itself does not suffer from unobserved confounding--that it is as if randomly assigned. In many instances, this assumption seems rather implausible. The objective of this article is to describe the identification problem that represents the fundamental challenge of causal inference. It outlines how mediation complicates identification and considers several identification strategies. The goal of this article is not to propose a new method for handling mediation or to identify a best method for doing so. The latter, in fact, is impossible. The contribution of the article is to illustrate how one can think about possible approaches to mediation in the context of a specific empirical study. Using data from a large evaluation of a demonstration project in children's mental health services (n = 763), the article illustrates identification strategies. That demonstration improved service delivery in several ways but primarily by offering services "intermediate" between inpatient and outpatient. These analyses focus on the impact of these intermediate services on 6-month improvement in a behavior checklist commonly used to measure psychopathology and competence among children and youths. The results highlight how different identification strategies produce different answers to key questions. These alternative findings have to be assessed in light of substantive knowledge of the program involved. The analyses generally support the notion that children and youths treated at the demonstration site who received intermediate services benefited from them. PsycINFO Database Record (c) 2014 APA, all rights reserved.
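
    The "add the mediator as a regressor" practice the author questions reduces to the product-of-coefficients decomposition; a minimal simulated sketch (all data synthetic, with the n matching the study only for flavor), valid only under the no-unmeasured-confounding assumption the article scrutinizes:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 763
        t = rng.integers(0, 2, n).astype(float)          # treatment
        m = 0.5 * t + rng.standard_normal(n)             # mediator model: a = 0.5
        y = 0.3 * t + 0.4 * m + rng.standard_normal(n)   # outcome: c' = 0.3, b = 0.4

        def ols(cols, y):
            # Least-squares fit with an intercept; returns the coefficients.
            X = np.column_stack([np.ones(len(y))] + list(cols))
            return np.linalg.lstsq(X, y, rcond=None)[0]

        a = ols([t], m)[1]                 # treatment -> mediator
        cprime, b = ols([t, m], y)[1:3]    # direct effect and mediator -> outcome
        print(f"indirect a*b = {a * b:.3f}, direct c' = {cprime:.3f}")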

  6. Soft Tissue Deformations Contribute to the Mechanics of Walking in Obese Adults

    PubMed Central

    Fu, Xiao-Yu; Zelik, Karl E.; Board, Wayne J.; Browning, Raymond C.; Kuo, Arthur D.

    2014-01-01

    Obesity not only adds to the mass that must be carried during walking, but also changes body composition. Although extra mass causes roughly proportional increases in musculoskeletal loading, less well understood is the effect of relatively soft and mechanically compliant adipose tissue. Purpose To estimate the work performed by soft tissue deformations during walking. The soft tissue would be expected to experience damped oscillations, particularly from high force transients following heel strike, and could potentially change the mechanical work demands for walking. Method We analyzed treadmill walking data at 1.25 m/s for 11 obese (BMI > 30 kg/m2) and 9 non-obese (BMI < 30 kg/m2) adults. The soft tissue work was quantified with a method that compares the work performed by lower extremity joints as derived using assumptions of rigid body segments, with that estimated without rigid body assumptions. Results Relative to body mass, obese and non-obese individuals perform similar amounts of mechanical work. But negative work performed by soft tissues was significantly greater in obese individuals (p= 0.0102), equivalent to about 0.36 J/kg vs. 0.27 J/kg in non-obese individuals. The negative (dissipative) work by soft tissues occurred mainly after heel strike, and for obese individuals was comparable in magnitude to the total negative work from all of the joints combined (0.34 J/kg vs. 0.33 J/kg for obese and non-obese adults, respectively). Although the joints performed a relatively similar amount of work overall, obese individuals performed less negative work actively at the knee. Conclusion The greater proportion of soft tissues in obese individuals results in substantial changes in the amount, location, and timing of work, and may also impact metabolic energy expenditure during walking. PMID:25380475

  7. The Effects of Positive Versus Negative Mood States on Attentional Processes During Exposure to Erotica.

    PubMed

    Carvalho, Joana; Pereira, Raquel; Barreto, Diana; Nobre, Pedro J

    2017-11-01

    The relationship between emotions and sexual functioning has been documented since early sex research. Among other effects, emotions are expected to impact sexual response by shaping individuals' attention to sexual cues; yet, this assumption has not been tested. This study aimed to investigate whether attentional processes to sexual cues are impacted by state emotions, and whether the processes impacted by emotions relate to subjective sexual arousal to a sex film clip. A total of 52 men and 73 women were randomly assigned to one of three experimental conditions: (1) a negative mood induction condition (sadness as dominant emotion), (2) a positive mood induction condition (amusement as dominant emotion), and a (3) neutral/control condition. After mood induction, participants were exposed to a sex film clip while their focus of visual attention was measured using an eye tracker. Three areas of interest (AOI) were considered within the sex clip: background (non-sexual cues), body interaction, and genital interaction. Self-reported attention, thoughts during the sex clip, percent dwell time, and pupil size to AOI were considered as attentional markers. Findings revealed that the attentional processes were not impacted by the mood conditions. Instead, gender effects were found. While men increased their visual attention to the background area of the film clip, women increased attention to the genital area. Also, sexual arousal thoughts during exposure to the sex clip were consistently related to subjective sexual arousal regardless of the momentary emotional state. Findings add to the literature by showing that men and women process the sexual components of a stimulus differently and by challenging the assumption that emotions shape attention to sexual cues.

  8. FMRI group analysis combining effect estimates and their variances

    PubMed Central

    Chen, Gang; Saad, Ziad S.; Nath, Audrey R.; Beauchamp, Michael S.; Cox, Robert W.

    2012-01-01

    Conventional functional magnetic resonance imaging (FMRI) group analysis makes two key assumptions that are not always justified. First, the data from each subject are condensed into a single number per voxel, under the assumption that the within-subject variance for the effect of interest is the same across all subjects or is negligible relative to the cross-subject variance. Second, it is assumed that all data values are drawn from the same Gaussian distribution with no outliers. We propose an approach that does not make such strong assumptions, and present a computationally efficient frequentist approach to FMRI group analysis, which we term mixed-effects multilevel analysis (MEMA), that incorporates both the variability across subjects and the precision estimate of each effect of interest from individual subject analyses. On average, the more accurate tests result in higher statistical power, especially when conventional variance assumptions do not hold, or in the presence of outliers. In addition, various heterogeneity measures are available with MEMA that may assist the investigator in further improving the modeling. Our method allows group effect t-tests and comparisons among conditions and among groups. In addition, it has the capability to incorporate subject-specific covariates such as age, IQ, or behavioral data. Simulations were performed to illustrate power comparisons and the capability of controlling type I errors among various significance testing methods, and the results indicated that the testing statistic we adopted struck a good balance between power gain and type I error control. Our approach is instantiated in an open-source, freely distributed program that may be used on any dataset stored in the Neuroimaging Informatics Technology Initiative (NIfTI) format. To date, the main impediment to more accurate testing that incorporates both within- and cross-subject variability has been the high computational cost. Our efficient implementation makes this approach practical. We recommend its use in lieu of the less accurate approach in conventional group analysis. PMID:22245637
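
    The core of the idea, weighting each subject's effect by the precision of its first-level estimate, can be sketched in a few lines (a simplified fixed-plus-random-effects weighting with synthetic numbers, not the MEMA implementation itself):

        import numpy as np

        rng = np.random.default_rng(2)
        n = 20
        beta = rng.normal(0.5, 0.3, n)         # per-subject effect estimates
        v_within = rng.uniform(0.01, 0.2, n)   # per-subject variance estimates

        # Crude cross-subject variance estimate, then precision weights.
        tau2 = max(np.var(beta, ddof=1) - v_within.mean(), 0.0)
        w = 1.0 / (v_within + tau2)

        est = np.sum(w * beta) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        print(f"weighted group effect {est:.3f}, z = {est / se:.2f}")

        # The conventional t-test ignores v_within entirely, treating every
        # subject as equally reliable:
        print(f"naive mean {beta.mean():.3f}")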

  9. An analysis and implications of alternative methods of deriving the density (WPL) terms for eddy covariance flux measurements

    Treesearch

    W. J. Massman; J. -P. Tuovinen

    2006-01-01

    We explore some of the underlying assumptions used to derive the density or WPL terms (Webb et al. (1980) Quart J Roy Meteorol Soc 106:85-100) required for estimating the surface exchange fluxes by eddy covariance. As part of this effort we recast the origin of the density terms as an assumption regarding the density fluctuations rather than as a (dry air) flux...
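
    For reference, the WPL result being re-derived gives the flux of a scalar constituent (e.g., CO2) in the standard form:

        F_c = \overline{w'\rho_c'} + \mu\,\frac{\overline{\rho_c}}{\overline{\rho_a}}\,\overline{w'\rho_v'} + \left(1 + \mu\sigma\right) \frac{\overline{\rho_c}}{\overline{T}}\,\overline{w'T'}

    where μ = m_a/m_v ≈ 1.61 and σ is the mean ratio of water vapor to dry air density; the second and third terms are the density (WPL) terms whose origin the authors recast as an assumption about the density fluctuations.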

  10. Common pitfalls in statistical analysis: Linear regression analysis

    PubMed Central

    Aggarwal, Rakesh; Ranganathan, Priya

    2017-01-01

    In a previous article in this series, we explained correlation analysis which describes the strength of relationship between two continuous variables. In this article, we deal with linear regression analysis which predicts the value of one continuous variable from another. We also discuss the assumptions and pitfalls associated with this analysis. PMID:28447022
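
    A minimal sketch of a linear regression together with two of the standard assumption checks discussed in such primers (synthetic data; statsmodels and scipy are assumed available, and the article itself is not tied to any particular software):

        import numpy as np
        import statsmodels.api as sm
        from scipy import stats

        rng = np.random.default_rng(3)
        x = rng.uniform(0, 10, 100)
        y = 2.0 + 0.5 * x + rng.normal(0, 1, 100)

        res = sm.OLS(y, sm.add_constant(x)).fit()
        print(res.params)   # intercept and slope

        # Normality of residuals (Shapiro-Wilk) and a crude check of
        # homoscedasticity (correlation of |residual| with the predictor).
        print(stats.shapiro(res.resid).pvalue)
        print(stats.pearsonr(np.abs(res.resid), x).pvalue)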

  11. Missing CD4+ cell response in randomized clinical trials of maraviroc and dolutegravir.

    PubMed

    Cuffe, Robert; Barnett, Carly; Granier, Catherine; Machida, Mitsuaki; Wang, Cunshan; Roger, James

    2015-10-01

    Missing data can compromise inferences from clinical trials, yet the topic has received little attention in the clinical trial community. Shortcomings of methods commonly used to analyze studies with missing data (complete case, last- or baseline-observation carried forward) have been highlighted in a recent Food and Drug Administration-sponsored report. This report recommends how to mitigate the issues associated with missing data. We present an example of the proposed concepts using data from recent clinical trials. CD4+ cell count data from the previously reported SINGLE and MOTIVATE studies of dolutegravir and maraviroc were analyzed using a variety of statistical methods to explore the impact of missing data. Four methodologies were used: complete case analysis, simple imputation, mixed models for repeated measures, and multiple imputation. We compared the sensitivity of conclusions to the volume of missing data and to the assumptions underpinning each method. Rates of missing data were greater in the MOTIVATE studies (35%-68% premature withdrawal) than in SINGLE (12%-20%). The sensitivity of results to assumptions about missing data was related to the volume of missing data. Estimates of treatment differences by the various analysis methods ranged across a 61 cells/mm³ window in MOTIVATE and a 22 cells/mm³ window in SINGLE. Where missing data are anticipated, analyses require robust statistical and clinical debate of the necessary but unverifiable underlying statistical assumptions. Multiple imputation makes these assumptions transparent, can accommodate a broad range of scenarios, and is a natural analysis for clinical trials in HIV with missing data.
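
    A minimal sketch of the multiple-imputation workflow the authors favor, using scikit-learn's IterativeImputer and Rubin's rules for pooling (synthetic data; this stands in for the trial analyses, which used dedicated MI procedures):

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer

        rng = np.random.default_rng(4)
        X = rng.normal(50, 10, size=(200, 3))    # e.g. CD4+ counts over visits
        X[rng.random(X.shape) < 0.3] = np.nan    # 30% missing at random

        estimates, variances = [], []
        for seed in range(5):                    # m = 5 imputed datasets
            imp = IterativeImputer(sample_posterior=True, random_state=seed)
            col = imp.fit_transform(X)[:, -1]    # last-visit values
            estimates.append(col.mean())
            variances.append(col.var(ddof=1) / len(col))

        # Rubin's rules: total variance = within + (1 + 1/m) * between.
        m = len(estimates)
        q = np.mean(estimates)
        T = np.mean(variances) + (1 + 1 / m) * np.var(estimates, ddof=1)
        print(f"pooled mean {q:.2f}, SE {np.sqrt(T):.2f}")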

  12. PESTAN: Pesticide Analytical Model Version 4.0 User's Guide

    EPA Pesticide Factsheets

    The principal objective of this User's Guide is to provide essential information on aspects such as model conceptualization, model theory, assumptions and limitations, determination of input parameters, analysis of results, and sensitivity analysis.

  13. The interpersonal core of personality pathology

    PubMed Central

    Hopwood, Christopher J.; Wright, Aidan G.C.; Ansell, Emily B.; Pincus, Aaron L.

    2013-01-01

    The purpose of this paper is to demonstrate that personality pathology is, at its core, fundamentally interpersonal. We review the proposed DSM-5 Section 3 redefinition of personality pathology involving self and interpersonal dysfunction, which we regard as a substantial improvement over the DSM-IV (and DSM-5 Section 2) definition. We note similarities between the proposed scheme and contemporary interpersonal theory and interpret the DSM-5 Section 3 definition using the underlying assumptions and evidence base of the interpersonal paradigm in clinical psychology. We describe how grounding the proposed DSM-5 Section 3 definition in interpersonal theory, and in particular a focus on the “interpersonal situation”, adds to its theoretical texture, empirical support, and clinical utility. We provide a clinical example that demonstrates the ability of contemporary interpersonal theory to augment the DSM-5 definition of personality pathology. We conclude with directions for further research that could clarify the core of personality pathology, and how interpersonal theory can inform research aimed at enhancing the DSM-5 Section 3 proposal and ultimately justify its migration to DSM-5 Section 2. PMID:23735037

  14. Is Knowledge of Physical Reality Still Kantian? Some Remarks About the Transcendental Character of Loop Quantum Gravity

    NASA Astrophysics Data System (ADS)

    Laino, Luigi

    2018-06-01

    In the following paper, the author will try to test the meaning of the transcendental approach with respect to the inner changes implied by the idea of quantum gravity. He will first describe Kant's basic methodological aim, viz. the grounding of a meta-science of physics as the a priori corpus of physical knowledge. After that, he will take into account the problematic physical and philosophical relationship between the theory of relativity and quantum mechanics; in showing how the elementary ontological and epistemological assumptions of experience turn out to be changed within them, he will also show the further modifications that occurred in the development of loop quantum gravity. He will particularly focus on the difficult problem of the relationship between space and matter, in order to settle the decisive question of whether a transcendental approach can be kept in light of quantum gravity. He will answer positively by recalling Cassirer's theory of the invariants of experience, although he will also add some problematic issues arising from the new physical context.

  15. Derivation of Einstein-Cartan theory from general relativity

    NASA Astrophysics Data System (ADS)

    Petti, Richard

    2015-04-01

    General relativity cannot describe exchange of classical intrinsic angular momentum and orbital angular momentum. Einstein-Cartan theory fixes this problem in the least invasive way. In the late 20th century, the consensus view was that Einstein-Cartan theory requires inclusion of torsion without adequate justification, it has no empirical support (though it doesn't conflict with any known evidence), it solves no important problem, and it complicates gravitational theory with no compensating benefit. In 1986 the author published a derivation of Einstein-Cartan theory from general relativity, with no additional assumptions or parameters. Starting without torsion, Poincaré symmetry, classical or quantum spin, or spinors, it derives torsion and its relation to spin from a continuum limit of general relativistic solutions. The present work makes the case that this computation, combined with supporting arguments, constitutes a derivation of Einstein-Cartan theory from general relativity, not just a plausibility argument. This paper adds more and simpler explanations, more computational details, correction of a factor of 2, discussion of limitations of the derivation, and discussion of some areas of gravitational research where Einstein-Cartan theory is relevant.

  16. The X-15/HL-20 operations support comparison

    NASA Technical Reports Server (NTRS)

    Morris, W. Douglas

    1993-01-01

    During the 1960's, the United States X-15 rocket-plane research program successfully demonstrated the ability to support a reusable vehicle operating in a near-space environment. The similarity of the proposed HL-20 lifting body concept in general size, weight, and subsystem composition to that of the X-15 provided an opportunity for a comparison of the predicted support manpower and turnaround times with those experienced in the X-15 program. Information was drawn from both reports and discussions with X-15 program personnel to develop comparative operations and support data. Based on the assumption of comparability between the two systems, the predicted staffing levels, skill mix, and refurbishment times of an operational HL-20 appear to be similar to those experienced by the X-15 for ground support. However, safety, environmental, and support requirements have changed such that the HL-20 will face a different operating environment than existed at Edwards during the 1950's and 1960's. Today's operational standards may impose additional requirements on the HL-20 that will add to the maintenance and support burden estimate based on the X-15 analogy.

  17. Do Responses to Different Anthropogenic Forcings Add Linearly in Climate Models?

    NASA Technical Reports Server (NTRS)

    Marvel, Kate; Schmidt, Gavin A.; Shindell, Drew; Bonfils, Celine; LeGrande, Allegra N.; Nazarenko, Larissa; Tsigaridis, Kostas

    2015-01-01

    Many detection and attribution and pattern scaling studies assume that the global climate response to multiple forcings is additive: that the response over the historical period is statistically indistinguishable from the sum of the responses to individual forcings. Here, we use the NASA Goddard Institute for Space Studies (GISS) and National Center for Atmospheric Research Community Climate System Model (CCSM) simulations from the CMIP5 archive to test this assumption for multi-year trends in global-average, annual-average temperature and precipitation at multiple timescales. We find that responses in models forced by pre-computed aerosol and ozone concentrations are generally additive across forcings; however, we demonstrate that there are significant nonlinearities in precipitation responses to different forcings in a configuration of the GISS model that interactively computes these concentrations from precursor emissions. We attribute these to differences in ozone forcing arising from interactions between forcing agents. Our results suggest that attribution to specific forcings may be complicated in a model with fully interactive chemistry and may provide motivation for other modeling groups to conduct further single-forcing experiments.
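
    The additivity assumption has a simple operational test; a schematic numpy sketch of the comparison (the arrays stand in for ensembles of multi-year trends from single-forcing and all-forcing runs and are entirely hypothetical, not CMIP5 output):

        import numpy as np

        rng = np.random.default_rng(5)
        # Hypothetical ensembles of multi-year temperature trends (K/decade).
        ghg = rng.normal(0.20, 0.03, 10)     # greenhouse-gas-only runs
        aer = rng.normal(-0.08, 0.03, 10)    # aerosol-only runs
        hist = rng.normal(0.12, 0.03, 10)    # all-forcings runs

        sum_single = ghg.mean() + aer.mean()
        diff = hist.mean() - sum_single
        # Standard error of the difference of the three ensemble means:
        se = np.sqrt(ghg.var(ddof=1) / 10 + aer.var(ddof=1) / 10
                     + hist.var(ddof=1) / 10)
        print(f"residual {diff:.3f} +/- {1.96 * se:.3f}")  # ~0 => additive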

  18. Do responses to different anthropogenic forcings add linearly in climate models?

    DOE PAGES

    Marvel, Kate; Schmidt, Gavin A.; Shindell, Drew; ...

    2015-10-14

    Many detection and attribution and pattern scaling studies assume that the global climate response to multiple forcings is additive: that the response over the historical period is statistically indistinguishable from the sum of the responses to individual forcings. Here, we use the NASA Goddard Institute for Space Studies (GISS) and National Center for Atmospheric Research Community Climate System Model (CCSM4) simulations from the CMIP5 archive to test this assumption for multi-year trends in global-average, annual-average temperature and precipitation at multiple timescales. We find that responses in models forced by pre-computed aerosol and ozone concentrations are generally additive across forcings. However, we demonstrate that there are significant nonlinearities in precipitation responses to different forcings in a configuration of the GISS model that interactively computes these concentrations from precursor emissions. We attribute these to differences in ozone forcing arising from interactions between forcing agents. Lastly, our results suggest that attribution to specific forcings may be complicated in a model with fully interactive chemistry and may provide motivation for other modeling groups to conduct further single-forcing experiments.

  19. Holograms of a dynamical top quark

    NASA Astrophysics Data System (ADS)

    Clemens, Will; Evans, Nick; Scott, Marc

    2017-09-01

    We present holographic descriptions of dynamical electroweak symmetry breaking models that incorporate the top mass generation mechanism. The models allow computation of the spectrum in the presence of large anomalous dimensions due to walking and strong Nambu-Jona-Lasinio interactions. Technicolor and QCD dynamics are described by the bottom-up Dynamic AdS/QCD model for arbitrary gauge groups and numbers of quark flavors. An assumption about the running of the anomalous dimension of the quark bilinear operator is input, and the model then predicts the spectrum and decay constants for the mesons. We add Nambu-Jona-Lasinio interactions responsible for flavor physics from extended technicolor, top-color, etc., using Witten's multitrace prescription. We show that the key behaviors of a top condensation model can be reproduced. We study generation of the top mass in (walking) one-doublet and one-family technicolor models and with strong extended technicolor interactions. The models clearly reveal the tensions between the large top mass and precision data for δρ. The tunings needed to generate a model compatible with precision constraints are simply demonstrated.

  20. A Foraging Cost of Migration for a Partially Migratory Cyprinid Fish

    PubMed Central

    Chapman, Ben B.; Eriksen, Anders; Baktoft, Henrik; Brodersen, Jakob; Nilsson, P. Anders; Hulthen, Kaj; Brönmark, Christer; Hansson, Lars-Anders; Grønkjær, Peter; Skov, Christian

    2013-01-01

    Migration has evolved as a strategy to maximise individual fitness in response to seasonally changing ecological and environmental conditions. However, migration can also incur costs, and quantifying these costs can provide important clues to the ultimate ecological forces that underpin migratory behaviour. A key emerging model to explain migration in many systems posits that migration is driven by seasonal changes to a predation/growth potential (p/g) trade-off that a wide range of animals face. In this study we assess a key assumption of this model for a common cyprinid partial migrant, the roach Rutilus rutilus, which migrates from shallow lakes to streams during winter. By sampling fish from stream and lake habitats in the autumn and spring and measuring their stomach fullness and diet composition, we tested if migrating roach pay a cost of reduced foraging when migrating. Resident fish had fuller stomachs containing more high quality prey items than migrant fish. Hence, we document a feeding cost to migration in roach, which adds additional support for the validity of the p/g model of migration in freshwater systems. PMID:23723967

  1. How much can a large population study on genes, environments, their interactions and common diseases contribute to the health of the American people?

    PubMed

    Chaufan, Claudia

    2007-10-01

    I offer a critical perspective on a large-scale population study on gene-environment interactions and common diseases proposed by the US Secretary of Health and Human Services' Advisory Committee on Genetics, Health, and Society (SACGHS). I argue that for scientific and policy reasons this and similar studies have little to add to current knowledge about how to prevent, treat, or decrease inequalities in common diseases, all of which are major claims of the proposal. I use diabetes as an exemplar of the diseases that the study purports to illuminate. I conclude that the question is not whether the study will meet expectations or whether the current emphasis on a genetic paradigm is real or imagined, desirable or not. Rather, the question is why, given the flaws of the science underwriting the study, its assumptions remain unchallenged. Future research should investigate the reasons for this immunity from criticism and for the popularity of this and similar projects among laypersons as well as among intellectuals.

  2. Effectiveness of vildagliptin versus other oral antidiabetes drugs as add-on to sulphonylurea monotherapy: Post hoc analysis from the EDGE study.

    PubMed

    Prasanna Kumar, K M; Phadke, U; Brath, H; Gawai, A; Paldánius, P M; Mathieu, C

    2016-12-01

    In this post hoc analysis of the EDGE study, we assessed the effectiveness and safety of vildagliptin versus other oral antidiabetes drugs (OADs) as add-on to first-line sulphonylurea (SU) therapy in patients who did not receive metformin in a real-life setting. The primary endpoint was the odds of achieving an HbA1c reduction of >0.3% without tolerability issues. The secondary endpoint was the odds of achieving HbA1c <7.0% without hypoglycaemia or weight gain. Changes in HbA1c and body weight, and safety, were also assessed. A total of 2936 patients received vildagliptin and 820 received comparator OADs (any α-GI, TZD, glinide) as add-on to first-line SU therapy. Overall, the mean age, disease duration, HbA1c, and BMI at baseline were 57.1 years, 6.3 years, 8.5%, and 27.7 kg/m², respectively. The odds ratios for achieving the primary and secondary endpoints were 1.6 (95% CI: 1.36, 1.86; p<0.0001) and 1.8 (1.45, 2.21; p<0.0001), respectively, in favour of vildagliptin. The between-treatment differences (vildagliptin vs. comparator OAD) for the mean change in HbA1c and body weight were -0.2±0.04% (p<0.0001) and -0.8±0.16 kg (p<0.0001), respectively. Overall, the incidence of adverse events was low (vildagliptin, 7% vs. comparator, 8.2%) in both groups. Similar results were observed in a subset of patients enrolled from India and in patients who received TZDs as the comparator OAD. Under real-life settings, vildagliptin as add-on to SU monotherapy showed a better glycaemic response without tolerability issues compared with other OADs. Copyright © 2016 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.

  3. SwathProfiler and NProfiler: Two new ArcGIS Add-ins for the automatic extraction of swath and normalized river profiles

    NASA Astrophysics Data System (ADS)

    Pérez-Peña, J. V.; Al-Awabdeh, M.; Azañón, J. M.; Galve, J. P.; Booth-Rea, G.; Notti, D.

    2017-07-01

    The present-day availability of high-resolution Digital Elevation Models has improved tectonic geomorphology analyses in their methodological aspects and geological meaning. Analyses based on topographic profiles are valuable for exploring the short- and long-term landscape response to tectonic activity and climate changes. Swath and river longitudinal profiles are two of the most used analyses for exploring the long- and short-term landscape responses. Most of these morphometric analyses are conducted in GIS software, which has become a standard tool for analyzing drainage network metrics. In this work we present two ArcGIS Add-ins to automatically delineate swath and normalized river profiles. Both tools are programmed in Visual Basic .NET and use the ArcObjects library architecture to access vector and raster data directly. The SwathProfiler Add-in allows analyzing the topography within a swath or band by representing maximum, minimum, and mean elevations, the first and third quartiles, local relief, and hypsometry. We have defined a new transverse hypsometric integral index (THi) that analyzes hypsometry along the swath and offers valuable information in this kind of graphic. The NProfiler Add-in allows representing normalized river longitudinal profiles and their related morphometric indexes, such as normalized concavity (CT), maximum concavity (Cmax), and length of maximum concavity (Lmax). Both tools facilitate the spatial analysis of topography and drainage networks directly in a GIS environment such as ArcMap and provide graphical outputs. To illustrate how these tools work, we analyzed two study areas, the Sierra Alhamilla mountain range (Betic Cordillera, SE Spain) and the Eastern margin of the Dead Sea (Jordan). The first study area has recently been studied from a morphotectonic perspective, so these new tools can add value to the previous studies. The second study area has not been analyzed by quantitative tectonic geomorphology, and the results suggest a landscape in a transient state due to a continuous base-level fall produced by the formation of the Dead Sea basin.
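
    The swath statistics themselves reduce to row-wise summaries over a band of parallel elevation profiles; a compact numpy sketch of the idea behind SwathProfiler's output (the DEM here is synthetic, and this is not the Add-in's VB.NET code):

        import numpy as np

        rng = np.random.default_rng(6)
        # Band of elevations: rows = positions along the swath baseline,
        # columns = parallel profiles sampled across the swath width.
        band = 1000 + np.cumsum(rng.normal(0, 5, size=(200, 41)), axis=0)

        swath_min = band.min(axis=1)
        swath_max = band.max(axis=1)
        swath_mean = band.mean(axis=1)
        q1, q3 = np.percentile(band, [25, 75], axis=1)  # first/third quartile
        local_relief = swath_max - swath_min
        print(swath_mean[:3].round(1), local_relief[:3].round(1))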

  4. How Many Studies Do You Need? A Primer on Statistical Power for Meta-Analysis

    ERIC Educational Resources Information Center

    Valentine, Jeffrey C.; Pigott, Therese D.; Rothstein, Hannah R.

    2010-01-01

    In this article, the authors outline methods for using fixed and random effects power analysis in the context of meta-analysis. Like statistical power analysis for primary studies, power analysis for meta-analysis can be done either prospectively or retrospectively and requires assumptions about parameters that are unknown. The authors provide…
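
    For the fixed-effect case, the prospective calculation reduces to a few lines. The sketch below (Python; the true effect delta and the per-study variances are exactly the unknown parameters the authors note must be assumed) computes the power of the two-sided test of the pooled mean effect.

        import numpy as np
        from scipy.stats import norm

        def fixed_effect_power(delta, within_study_vars, alpha=0.05):
            v = np.asarray(within_study_vars, dtype=float)
            se = np.sqrt(1.0 / np.sum(1.0 / v))   # SE of pooled effect
            lam = delta / se                      # noncentrality parameter
            z = norm.ppf(1 - alpha / 2)
            return (1 - norm.cdf(z - lam)) + norm.cdf(-z - lam)

        # e.g. 10 planned studies, each with variance 0.04, true d = 0.2
        print(fixed_effect_power(0.2, [0.04] * 10))   # ~0.88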

  5. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

    This document provides the detailed accident analysis supporting HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, "Canister Storage Building Final Safety Analysis Report". All assumptions, parameters, and models used in the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  6. Generating Atomistic Slab Surfaces with Adsorbates

    DTIC Science & Technology

    2017-12-01

    The indexed text is a fragment of the report's Python source rather than an abstract: the slab-generation script imports os, sys, math, copy, and numpy, along with StructureMatcher from pymatgen.analysis.structure_matcher, acos from math, and align_axis and add_vacuum from mpinterfaces.utils.

  7. GDOT Local Beneficiary Analysis of TIA Project Expenditures, Phase II : Impact Evaluation

    DOT National Transportation Integrated Search

    2018-05-01

    In 2012, voters in three regions of Georgia (Central Savannah River Area, Heart of Georgia Altamaha, and River Valley) approved the Transportation Investment Act (TIA) referendum, which added 1% to local sales taxes. Seventy-five percent of the add...

  8. Combining Chemistry and Music to Engage Student Interest: Using Songs to Accompany Selected Chemical Topics

    ERIC Educational Resources Information Center

    Last, Arthur M.

    2009-01-01

    The use of recorded music to add interest to a variety of lecture topics is described. Topics include the periodic table, the formation of ionic compounds, thermodynamics, carbohydrates, nuclear chemistry, and qualitative analysis. (Contains 1 note.)

  9. Analysis of Driver Behavior and Operations at Intersection Short Lanes

    DOT National Transportation Integrated Search

    2016-08-01

    With the ever-increasing demand to add roadway capacity in a safe and efficient manner, the application of auxiliary through lanes (ATLs) at intersections has increased in recent years. ATL intersections exist when there is an added through lane intr...

  10. A Teachable Moment Uncovered by Video Analysis

    NASA Astrophysics Data System (ADS)

    Gates, Joshua

    2011-05-01

    Early in their study of one-dimensional kinematics, my students build an algebraic model that describes the effects of a rolling ball's (perpendicular) collision with a wall. The goal is for the model to predict the ball's velocity when it returns to a fixed point approximately 50-100 cm from the wall as a function of its velocity as it passes this point initially. They are told to assume that the ball's velocity does not change while it rolls to or from the wall—that the velocity change all happens very quickly and only at the wall. In order to evaluate this assumption following the data collection, I have the students analyze one such collision using video analysis. The results uncover an excellent teachable moment about assumptions and their impact on models and error analysis.

  11. Economic analysis of the design and fabrication of a space qualified power system

    NASA Technical Reports Server (NTRS)

    Ruselowski, G.

    1980-01-01

    An economic analysis was performed to determine the cost of the design and fabrication of a low Earth orbit, 2 kW photovoltaic/battery, space qualified power system. A commercially available computer program called PRICE (programmed review of information for costing and evaluation) was used to conduct the analysis. The sensitivity of the various cost factors to the assumptions used is discussed. Total cost of the power system was found to be $2.46 million with the solar array accounting for 70.5%. Using the assumption that the prototype becomes the flight system, 77.3% of the total cost is associated with manufacturing. Results will be used to establish whether the cost of space qualified hardware can be reduced by the incorporation of commercial design, fabrication, and quality assurance methods.

  12. Testing the mean for dependent business data.

    PubMed

    Liang, Jiajuan; Martin, Linda

    2008-01-01

    In business data analysis, it is well known that the comparison of several means is usually carried out by the F-test in analysis of variance, under the assumption of independently collected data from all populations. This assumption, however, is likely to be violated in survey data collected from various questionnaires or in time-series data. As a result, it is problematic, and may not be justifiable, to apply the traditional F-test directly to the comparison of dependent means. In this article, we develop a generalized F-test for comparing population means with dependent data. Simulation studies show that the proposed test has a simple approximate null distribution and good finite-sample properties. Applications of the proposed test in the analysis of survey data and time-series data are illustrated with two real datasets.
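
    To see why this matters, a minimal simulation (an illustration under an assumed AR(1) dependence structure, not the authors' generalized test) shows the classical one-way ANOVA F-test rejecting a true null far more often than the nominal 5% level when observations within each sample are serially correlated.

        import numpy as np
        from scipy.stats import f_oneway

        rng = np.random.default_rng(0)

        def ar1_series(n, rho=0.5):
            # one AR(1) time series with mean zero
            x = np.empty(n)
            x[0] = rng.normal()
            for t in range(1, n):
                x[t] = rho * x[t - 1] + rng.normal()
            return x

        n_sims, hits = 2000, 0
        for _ in range(n_sims):
            groups = [ar1_series(60) for _ in range(3)]  # equal means
            if f_oneway(*groups).pvalue < 0.05:
                hits += 1
        print(f"empirical type I error: {hits / n_sims:.3f}")  # >> 0.05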

  13. Statistical Issues for Calculating Reentry Hazards

    NASA Technical Reports Server (NTRS)

    Bacon, John B.; Matney, Mark

    2016-01-01

    A number of statistical tools have been developed over the years for assessing the risk that reentering objects pose to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. This information, combined with information on the expected ground path of the reentry, is used to compute the probability that one or more of the surviving debris fragments might hit a person on the ground and cause one or more casualties. The statistical portion of this analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and on how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper builds on previous IAASS work to re-examine one of these theoretical assumptions. This study employs empirical and theoretical information to test the assumption of a fully random decay along the argument of latitude of the final orbit, and makes recommendations on how to improve the accuracy of this calculation in the future.
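
    In schematic form, the risk number such tools produce is a casualty expectation: the summed product of surviving-fragment casualty area and the population density weighted by where the final orbit dwells. The sketch below uses the very assumption the paper re-examines (a uniformly random argument of latitude on a circular orbit); the fragment areas and densities are placeholders, not data from any study.

        import numpy as np

        def latitude_density(lat_deg, incl_deg):
            # PDF of the sub-satellite latitude for a circular orbit with
            # a uniformly random argument of latitude; valid for
            # |lat| < inclination, peaking toward the extreme latitudes
            lat, inc = np.radians(lat_deg), np.radians(incl_deg)
            return np.cos(lat) / (
                np.pi * np.sqrt(np.sin(inc) ** 2 - np.sin(lat) ** 2))

        def casualty_expectation(areas_m2, mean_pop_density_km2):
            # schematic E_c: casualty areas (m^2) times the dwell-time-
            # weighted population density (people/km^2)
            return float(np.sum(np.asarray(areas_m2) * 1e-6
                                * mean_pop_density_km2))

        lats = np.linspace(-51.0, 51.0, 200)       # a 51.6 deg orbit
        w = latitude_density(lats, 51.6)
        w /= w.sum()                               # discretized weights
        band_density = np.full(lats.size, 15.0)    # placeholder people/km^2
        print(casualty_expectation([8.0, 2.5, 0.7],
                                   np.sum(w * band_density)))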

  14. 2017 National Household Travel Survey - California Add-On

    Science.gov Websites

    The California add-on survey supplements the 2017 National Household Travel Survey (NHTS) with additional household samples and detailed travel data, available through NREL's Transportation Secure Data Center.

  15. Etiology of Attention Disorders: A Neurological/Genetic Perspective.

    ERIC Educational Resources Information Center

    Grantham, Madeline Kay

    This paper explores the historical origins of attention deficit disorder/attention deficit hyperactivity disorder (ADD/ADHD) as a neurological disorder, current neurological and genetic research concerning the etiology of ADD/ADHD, and implications for diagnosis and treatment. First, ADD/ADHD is defined and then the origins of ADD/ADHD as a…

  16. Flexible modeling improves assessment of prognostic value of C-reactive protein in advanced non-small cell lung cancer

    PubMed Central

    Gagnon, B; Abrahamowicz, M; Xiao, Y; Beauchamp, M-E; MacDonald, N; Kasymjanova, G; Kreisman, H; Small, D

    2010-01-01

    Background: C-reactive protein (CRP) is gaining credibility as a prognostic factor in different cancers. Cox's proportional hazard (PH) model is usually used to assess prognostic factors. However, this model imposes a priori assumptions, which are rarely tested, that (1) the hazard ratio associated with each prognostic factor remains constant across the follow-up (PH assumption) and (2) the relationship between a continuous predictor and the logarithm of the mortality hazard is linear (linearity assumption). Methods: We tested these two assumptions of the Cox's PH model for CRP, using a flexible statistical model, while adjusting for other known prognostic factors, in a cohort of 269 patients newly diagnosed with non-small cell lung cancer (NSCLC). Results: In the Cox's PH model, high CRP increased the risk of death (HR=1.11 per each doubling of CRP value, 95% CI: 1.03–1.20, P=0.008). However, both the PH assumption (P=0.033) and the linearity assumption (P=0.015) were rejected for CRP, measured at the initiation of chemotherapy, which kept its prognostic value for approximately 18 months. Conclusion: Our analysis shows that flexible modeling provides new insights regarding the value of CRP as a prognostic factor in NSCLC and that Cox's PH model underestimates early risks associated with high CRP. PMID:20234363
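
    Both assumptions the authors test can be checked in standard software. A sketch using Python's lifelines library on synthetic data (column names, effect sizes, and censoring here are illustrative, not values from the study):

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter
        from lifelines.statistics import proportional_hazard_test

        rng = np.random.default_rng(1)
        n = 269
        df = pd.DataFrame({
            "log2_crp": rng.normal(3.0, 1.5, n),   # doubling scale
            "age": rng.normal(64.0, 9.0, n),
        })
        # illustrative survival times with a CRP effect, plus censoring
        df["months"] = rng.exponential(18.0, n) * np.exp(
            -0.1 * (df["log2_crp"] - 3.0))
        df["died"] = (rng.uniform(size=n) < 0.8).astype(int)

        cph = CoxPHFitter().fit(df, duration_col="months", event_col="died")

        # Schoenfeld-residual-based check of the PH assumption; the
        # analogous test in the paper rejected proportionality for CRP
        print(proportional_hazard_test(cph, df, time_transform="rank").summary)
        # the linearity assumption can be probed similarly by refitting
        # with spline-expanded CRP terms and testing them jointly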

  17. Desorden Deficitario de la Atencion. Segunda Edicion. NICHCY Briefing Paper [and] El Desorden Deficitario de la Atencion: Una Bibliografia de Materiales en Ingles y Espanol (Attention Deficit Disorder. Second Edition. NICHCY Briefing Paper [and] Attention Deficit Disorder: A Bibliography of Materials in English and Spanish).

    ERIC Educational Resources Information Center

    Fowler, Mary

    This briefing paper uses a question-and-answer format to provide basic information about children with attention deficit disorder (ADD). Questions address the following concerns: nature and incidence of ADD; causes of ADD; signs of ADD (impulsivity, hyperactivity, disorganization, social skill deficits); the diagnostic ADD assessment; how to…

  18. Technical note: Application of the Box-Cox data transformation to animal science experiments.

    PubMed

    Peltier, M R; Wilcox, C J; Sharp, D C

    1998-03-01

    In the use of ANOVA for hypothesis testing in animal science experiments, the assumption of homogeneity of errors often is violated because of scale effects and the nature of the measurements. We demonstrate a method for transforming data so that the assumptions of ANOVA are met (or violated to a lesser degree) and apply it in analysis of data from a physiology experiment. Our study examined whether melatonin implantation would affect progesterone secretion in cycling pony mares. Overall treatment variances were greater in the melatonin-treated group, and several common transformation procedures failed. Application of the Box-Cox transformation algorithm reduced the heterogeneity of error and permitted the assumption of equal variance to be met.
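
    The transformation itself is a one-liner in modern software: SciPy estimates the Box-Cox exponent by maximum likelihood. A minimal sketch with synthetic data standing in for the progesterone measurements (shape and scale values are illustrative only):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        # two groups whose error variance grows with the mean, mimicking
        # the heterogeneity described for the melatonin-treated mares
        control = rng.gamma(shape=4.0, scale=1.0, size=30)
        treated = rng.gamma(shape=4.0, scale=3.0, size=30)
        print("Levene p before:", stats.levene(control, treated).pvalue)

        pooled = np.concatenate([control, treated])
        transformed, lam = stats.boxcox(pooled)   # ML estimate of lambda
        t_ctl, t_trt = transformed[:30], transformed[30:]
        print("lambda:", round(lam, 2))
        print("Levene p after:", stats.levene(t_ctl, t_trt).pvalue)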

  19. Measuring ambivalence to science

    NASA Astrophysics Data System (ADS)

    Gardner, P. L.

    Ambivalence is a psychological state in which a person holds mixed feelings (positive and negative) towards some psychological object. Standard methods of attitude measurement, such as Likert and semantic differential scales, ignore the possibility of ambivalence; ambivalent responses cannot be distinguished from neutral ones. This neglect arises out of an assumption that positive and negative affects towards a particular psychological object are bipolar, i.e., unidimensional in opposite directions. This assumption is frequently untenable. Conventional item statistics and measures of test internal consistency are ineffective as checks on this assumption; it is possible for a scale to be multidimensional and still display apparent internal consistency. Factor analysis is a more effective procedure. Methods of measuring ambivalence are suggested, and implications for research are discussed.

  20. Efficacy and safety of golimumab in Indian patients with rheumatoid arthritis: Subgroup data from GO-MORE study.

    PubMed

    Pal, Sarvajeet; Veeravalli, Sarath Chandra Mouli; Das, Siddharth Kumar; Shobha, Vineeta; Uppuluri, Ramakrishna Rao; Dharmanand, B G; Nadkar, Milind; Hsia, Elizabeth; Fei, Kaiyin; Yao, Ruji; Khalifa, Ahmed

    2016-11-01

    To conduct a subgroup analysis of GO-MORE trial Part 1, comparing the efficacy and safety of add-on subcutaneous golimumab therapy in rheumatoid arthritis (RA) patients enrolled from India and outside India. GO-MORE was an open-label, multicenter, prospective trial of add-on golimumab in biologic-naïve RA patients with active disease despite being on conventional DMARD regimen(s). Part 1 of the study was chosen as the focus of this subgroup analysis because a substantial number of Indian patients (106) were enrolled, compared with no Indian patients in Part 2. The primary efficacy outcome was the proportion of patients achieving a good to moderate DAS28-ESR (Disease Activity Score of 28 joints calculated using erythrocyte sedimentation rate) European League Against Rheumatism (EULAR) response at month 6. The efficacy-evaluable population comprised 105 and 3175 patients from India and outside India, respectively. The safety analysis included 106 patients enrolled from India and 3251 from outside India. A higher proportion of Indian than non-Indian patients had high disease activity as measured by DAS28-ESR. At month 6, the proportions of Indian and non-Indian patients achieving DAS28-ESR and DAS28-C-reactive-protein responses and simplified disease activity index (SDAI) remission, as well as EuroQoL Quality-of-Life Questionnaire (EQ-5D) scores, were comparable. The incidence of all adverse events was lower in Indian patients. There were no deaths or cases of tuberculosis or malignancy reported in the patients from India at month 6. The efficacy and safety results with add-on golimumab were consistent between RA patients from India and outside India, despite high baseline disease activity in the Indian patients. © 2016 Asia Pacific League of Associations for Rheumatology and John Wiley & Sons Australia, Ltd.

  1. Treatment of Early-Age Mania: Outcomes for Partial and Nonresponders to Initial Treatment.

    PubMed

    Walkup, John T; Wagner, Karen Dineen; Miller, Leslie; Yenokyan, Gayane; Luby, Joan L; Joshi, Paramjit T; Axelson, David A; Robb, Adelaide; Salpekar, Jay A; Wolf, Dwight; Sanyal, Abanti; Birmaher, Boris; Vitiello, Benedetto; Riddle, Mark A

    2015-12-01

    The Treatment of Early Age Mania (TEAM) study evaluated lithium, risperidone, and divalproex sodium (divalproex) in children with bipolar I disorder who were naive to antimanic medication, or were partial or nonresponders to 1 of 3 study medications. This report evaluates the benefit of either an add-on or a switch of antimanic medications for an 8-week trial period in partial responders and nonresponders, respectively. TEAM is a randomized, controlled trial of individuals (N = 379) aged 6 to 15 years (mean ± SD = 10.2 ± 2.7 years) with DSM-IV bipolar I disorder (mixed or manic phase). Participants (n = 154) in this report were either nonresponders or partial responders to 1 of the 3 study medications. Nonresponders (n = 89) were randomly assigned to 1 of the other 2 antimanic medications and cross-tapered. Partial responders (n = 65) were randomly assigned to 1 of 2 other antimanic medications as an add-on to their initial medication. Adverse event (AE) rates are reported only for the add-on group. Response rate for children switched to risperidone (47.6%) was higher than for those switched to either lithium (12.8%; p = .005; number needed to treat [NNT] = 3; 95% CI = 1.71-9.09) or divalproex (17.2%; p = .03; NNT = 3; 95% CI = 1.79-20.10); response rate for partial responders who added risperidone (53.3%) was higher than for those who added divalproex (0%; p = .0002; NNT = 2; 95% CI = 1.27-3.56) and trended higher for lithium (26.7%; p = .07; NNT = 4). Reported AEs in the add-on group were largely consistent with the known AE profile for the second medication. Weight gain (kg) was observed for all add-on medications: lithium add-on (n = 29 of 30) = 1.66 ± 1.97; risperidone add-on (n = 15 of 15) = 2.8 ± 1.34; divalproex add-on (n = 19 of 20) = 1.42 ± 1.96. There was no evidence at the 5% significance level that the average weight gain was different by study medication for partial responders (p = .07, 1-way analysis of variance). Risperidone appears to be more useful than lithium or divalproex for children with bipolar I disorder and other comorbid conditions who are nonresponders or partial responders to an initial antimanic medication trial. Clinical trial registration information-Study of Outcome and Safety of Lithium, Divalproex and Risperidone for Mania in Children and Adolescents (TEAM); http://clinicaltrials.gov/; NCT00057681. Copyright © 2015. Published by Elsevier Inc.

  2. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data

    PubMed Central

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.

    2015-01-01

    Modern statistical methods for incomplete data have been increasingly applied to a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has seen increasing activity in both its development and its application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or of the assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Our results show that, provided the imputation model is coherent with the underlying data-generation mechanism, MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316
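
    A minimal sketch of MI inference for the area under the ROC curve, using scikit-learn (illustrative only: the paper's study is far broader, and a complete analysis would pool variances with Rubin's rules rather than averaging the point estimates alone):

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa
        from sklearn.impute import IterativeImputer
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        n = 400
        y = rng.integers(0, 2, n)                     # disease status
        marker = rng.normal(loc=1.0 * y, scale=1.0)   # diagnostic biomarker
        aux = marker + rng.normal(scale=0.5, size=n)  # auxiliary covariate

        # delete roughly a third of the marker values, missing at random
        # given the fully observed auxiliary variable
        miss = rng.uniform(size=n) < 0.45 / (1 + np.exp(-aux))
        data = np.column_stack([np.where(miss, np.nan, marker), aux, y])

        aucs = []
        for m in range(20):                           # 20 imputations
            imp = IterativeImputer(sample_posterior=True, random_state=m)
            completed = imp.fit_transform(data)
            aucs.append(roc_auc_score(y, completed[:, 0]))
        print("MI point estimate of AUC:", round(np.mean(aucs), 3))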

  3. Smoothing of the bivariate LOD score for non-normal quantitative traits.

    PubMed

    Buil, Alfonso; Dyer, Thomas D; Almasy, Laura; Blangero, John

    2005-12-30

    Variance component analysis provides an efficient method for performing linkage analysis for quantitative traits. However, type I error of variance components-based likelihood ratio testing may be affected when phenotypic data are non-normally distributed (especially with high values of kurtosis). This results in inflated LOD scores when the normality assumption does not hold. Even though different solutions have been proposed to deal with this problem with univariate phenotypes, little work has been done in the multivariate case. We present an empirical approach to adjust the inflated LOD scores obtained from a bivariate phenotype that violates the assumption of normality. Using the Collaborative Study on the Genetics of Alcoholism data available for the Genetic Analysis Workshop 14, we show how bivariate linkage analysis with leptokurtotic traits gives an inflated type I error. We perform a novel correction that achieves acceptable levels of type I error.

  4. An empirical comparison of statistical tests for assessing the proportional hazards assumption of Cox's model.

    PubMed

    Ng'andu, N H

    1997-03-30

    In the analysis of survival data using the Cox proportional hazard (PH) model, it is important to verify that the explanatory variables analysed satisfy the proportional hazard assumption of the model. This paper presents results of a simulation study that compares five test statistics for checking the proportional hazard assumption of Cox's model. The test statistics were evaluated under proportional hazards and under the following types of departure from the proportional hazard assumption: increasing relative hazards, decreasing relative hazards, crossing hazards, diverging hazards, and non-monotonic hazards. The test statistics compared include those based on partitioning of failure time and those that do not require partitioning of failure time. The simulation results demonstrate that the time-dependent covariate test, the weighted residuals score test, and the linear correlation test have equally good power for detecting the varieties of non-proportional hazards studied. Using illustrative data from the literature, these test statistics performed similarly.

  5. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
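
    The iterative loop can be summarized in a few lines. Everything below is a schematic reconstruction from the abstract; the helper names (check, trace_in, violates, learner.refine) are hypothetical placeholders, not the LTSA tool's API.

        def assume_guarantee(M1, M2, P, learner, check, trace_in, violates):
            A = learner.initial_assumption()
            while True:
                cex = check(M1, assumption=A, prop=P)  # premise <A> M1 <P>
                if cex is not None:
                    if trace_in(cex, M2):       # environment really does this
                        return False, cex       # genuine violation of P
                    A = learner.refine(A, cex)  # assumption too weak
                    continue
                cex = check(M2, prop=A)         # premise <true> M2 <A>
                if cex is None:
                    return True, None           # both premises hold: P holds
                if violates(M1, cex, P):        # M2's behavior breaks P in M1
                    return False, cex
                A = learner.refine(A, cex)      # assumption too strong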

  6. An Analysis of AAFES and Its Relevance to the Future of the Army and Air Force

    DTIC Science & Technology

    2009-06-12

    benefits of this organization and are there any viable alternatives? Background and Significance AAFES provides retail goods and services to a select...relative to cost, benefit, and alternative options. Assumptions This study is based on the assumptions that AAFES and the MWR programs of the Army...AAFES is a joint Army and Air Force non-appropriated fund instrumentality (NAFI) charged with operating retail and service activities for the benefit

  7. Iroquois Confederacy’s Experiences with Centrifugal and Centripetal Forces: A Historical Analysis

    DTIC Science & Technology

    2007-06-15

    Charles T. Gehring, and William A. Starna (Syracuse, NY: Syracuse University Press, 1996), 46. Pierre Millet, “Letter from Father Millet to Reverend...assumption of a high position involved the French Jesuit, Father Millet. His involvement and knowledge in the Oneida Nation led to his assumption of a...Father Millet, described the intricacies that involved such reciprocity rituals. In 1674 he observed the details of an embassy in Oneida Country

  8. Alternative Fuels Data Center: Seattle Rideshare Fleet Adds EVs, Enjoys Success

    Science.gov Websites

    A case study from the Alternative Fuels Data Center describing a Seattle rideshare fleet that added electric vehicles (EVs) and enjoyed success with them.

  9. The Source for ADD/ADHD: Attention Deficit Disorder and Attention Deficit/Hyperactivity Disorder.

    ERIC Educational Resources Information Center

    Richard, Gail J.; Russell, Joy L.

    This book is intended for professionals who are responsible for designing and implementing educational programs for children with attention deficit disorders and attention deficit/hyperactivity disorder (ADD/ADHD). Chapters address: (1) myths and realities about ADD/ADHD; (2) definitions, disorders associated with ADD/ADHD, and federal educational…

  10. [Isolation of children with attention deficit disorder in their classrooms].

    PubMed

    Mino, Y; Yasuda, N; Ohara, H; Nagamatsu, K; Yoshida, T

    1990-10-01

    The isolation of children with attention deficit disorder (ADD) among peers was examined by conducting sociometric tests in 28 classrooms to which ADD children belonged. The conclusions are as follows: 1. Among ADD children, 39.3% were classified as isolated children who received no Social Choice votes, a proportion significantly higher than among normal controls. 2. Results suggest that isolation was correlated with characteristics of ADD children such as higher grade in school, later first consultation, and lower IQ levels. 3. Classroom characteristics that correlated with isolation of ADD children were a larger number of children per class and stronger class cohesion. Reducing the number of children in each class is therefore thought likely to be effective in preventing isolation of ADD children. 4. A possible association between parents' assessments of ADD children's social relationships and the children's isolation exists and needs to be assessed. The authors emphasize the need for ADD children to have support in their interpersonal relationships, and discuss ways of intervening to prevent their being isolated in their classrooms.

  11. Stochastic models of the Social Security trust funds.

    PubMed

    Burdick, Clark; Manchester, Joyce

    Each year in March, the Board of Trustees of the Social Security trust funds reports on the current and projected financial condition of the Social Security programs. Those programs, which pay monthly benefits to retired workers and their families, to the survivors of deceased workers, and to disabled workers and their families, are financed through the Old-Age, Survivors, and Disability Insurance (OASDI) Trust Funds. In their 2003 report, the Trustees present, for the first time, results from a stochastic model of the combined OASDI trust funds. Stochastic modeling is an important new tool for Social Security policy analysis and offers the promise of valuable new insights into the financial status of the OASDI trust funds and the effects of policy changes. The results presented in this article demonstrate that several stochastic models deliver broadly consistent results even though they use very different approaches and assumptions. However, they also show that the variation in trust fund outcomes differs as the approach and assumptions are varied. Which approach and assumptions are best suited for Social Security policy analysis remains an open question. Further research is needed before the promise of stochastic modeling is fully realized. For example, neither parameter uncertainty nor variability in ultimate assumption values is recognized explicitly in the analyses. Despite this caveat, stochastic modeling results are already shedding new light on the range and distribution of trust fund outcomes that might occur in the future.

  12. Experimental analysis of decay biases in the fossil record of lobopodians

    NASA Astrophysics Data System (ADS)

    Murdock, Duncan; Gabbott, Sarah; Purnell, Mark

    2016-04-01

    If fossils are to realize their full potential in reconstructing the tree of life, we must understand how our view of ancient organisms is obscured by the taphonomic filters of decay and preservation. In most cases, processes of decay will leave behind either nothing or only the most decay-resistant body parts, and even in those rare instances where soft tissues are fossilized we cannot assume that the resulting fossil, however exquisite, is a faithful anatomical representation of the animal as it was in life. Recent experiments have shown that the biases introduced by decay can be far from random; in chordates, for example, the most phylogenetically informative characters are also the most decay-prone, resulting in 'stemward slippage'. But how widespread is this phenomenon, and are there other non-random biases linked to decay? Intuitively, we make assumptions about the likelihood of different kinds of characters surviving and being preserved, with knock-on effects for anatomical and phylogenetic interpretations. To what extent are these assumptions valid? We combine our understanding of the fossil record of lobopodians with insights from decay experiments on modern onychophorans (velvet worms) to test these assumptions. Our analysis demonstrates that taphonomically informed tests of character interpretations have the potential to improve phylogenetic resolution. This approach is widely applicable to the fossil record, allowing us to ground-truth some of the assumptions involved in describing exceptionally preserved fossil material.

  13. Uncertainty analysis of multi-rate kinetics of uranium desorption from sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

    2014-01-01

    A multi-rate expression for uranyl [U(VI)] surface complexation reactions has been proposed to describe diffusion-limited U(VI) sorption/desorption in heterogeneous subsurface sediments. An important assumption in the rate expression is that its rate constants follow a certain type of probability distribution. In this paper, a Bayes-based Differential Evolution Markov Chain method was used to assess the distribution assumption and to analyze parameter and model structure uncertainties. U(VI) desorption from a contaminated sediment at the US Hanford 300 Area, Washington, was used as an example for detailed analysis. The results indicated that: 1) the rate constants in the multi-rate expression carry uneven uncertainties, with slower rate constants having relatively larger uncertainties; 2) the lognormal distribution is an effective assumption for the rate constants in the multi-rate model for simulating U(VI) desorption; 3) however, long-term predictions and their uncertainty may be significantly biased by the lognormal assumption for the smaller rate constants; and 4) both parameter and model structure uncertainties can affect the extrapolation of the multi-rate model, with a larger uncertainty arising from the model structure. The results provide important insights into the factors contributing to the uncertainties of the multi-rate expression commonly used to describe the diffusion- or mixing-limited sorption/desorption of both organic and inorganic contaminants in subsurface sediments.

  14. Impact of viral load and the duration of primary infection on HIV transmission: systematic review and meta-analysis

    PubMed Central

    BLASER, Nello; WETTSTEIN, Celina; ESTILL, Janne; VIZCAYA, Luisa SALAZAR; WANDELER, Gilles; EGGER, Matthias; KEISER, Olivia

    2014-01-01

    Objectives HIV ‘treatment as prevention’ (TasP) describes early treatment of HIV-infected patients intended to reduce viral load (VL) and transmission. Crucial assumptions for estimating TasP's effectiveness are the underlying estimates of transmission risk. We aimed to determine the transmission risk during primary infection and the relation of HIV transmission risk to VL. Design Systematic review and meta-analysis. Methods We searched PubMed and Embase databases for studies that established a relationship between VL and transmission risk, or primary infection and transmission risk, in serodiscordant couples. We analyzed assumptions about the relationship between VL and transmission risk, and between duration of primary infection and transmission risk. Results We found 36 eligible articles, based on six different study populations. Studies consistently found that larger VLs lead to higher HIV transmission rates, but assumptions about the shape of this increase varied from exponential increase to saturation. The assumed duration of primary infection ranged from 1.5 to 12 months; for each additional month, the log10 transmission rate ratio between primary and asymptomatic infection decreased by 0.40. Conclusions Assumptions and estimates of the relationship between VL and transmission risk, and the relationship between primary infection and transmission risk, vary substantially, and predictions of TasP's effectiveness should take this uncertainty into account. PMID:24691205

  15. Identify temporal trend of air temperature and its impact on forest stream flow in Lower Mississippi River Alluvial Valley using wavelet analysis

    Treesearch

    Ying Ouyang; Prem B. Parajuli; Yide Li; Theodor D. Leininger; Gary Feng

    2017-01-01

    Characterization of stream flow is essential to water resource management, water supply planning, environmental protection, and ecological restoration; while air temperature variation due to climate change can exacerbate stream flow and add instability to the flow. In this study, the wavelet analysis technique was employed to identify temporal trend of air temperature...

  16. Missing data in trial-based cost-effectiveness analysis: An incomplete journey.

    PubMed

    Leurent, Baptiste; Gomes, Manuel; Carpenter, James R

    2018-06-01

    Cost-effectiveness analyses (CEA) conducted alongside randomised trials provide key evidence for informing healthcare decision making, but missing data pose substantive challenges. Recently, there have been a number of developments in methods and guidelines addressing missing data in trials. However, it is unclear whether these developments have permeated CEA practice. This paper critically reviews the extent of and methods used to address missing data in recently published trial-based CEA. Issues of the Health Technology Assessment journal from 2013 to 2015 were searched. Fifty-two eligible studies were identified. Missing data were very common; the median proportion of trial participants with complete cost-effectiveness data was 63% (interquartile range: 47%-81%). The most common approach for the primary analysis was to restrict analysis to those with complete data (43%), followed by multiple imputation (30%). Half of the studies conducted some sort of sensitivity analyses, but only 2 (4%) considered possible departures from the missing-at-random assumption. Further improvements are needed to address missing data in cost-effectiveness analyses conducted alongside randomised trials. These should focus on limiting the extent of missing data, choosing an appropriate method for the primary analysis that is valid under contextually plausible assumptions, and conducting sensitivity analyses to departures from the missing-at-random assumption. © 2018 The Authors Health Economics published by John Wiley & Sons Ltd.

  17. Response inhibition in Attention deficit disorder and neurofibromatosis type 1 – clinically similar, neurophysiologically different

    PubMed Central

    Bluschke, Annet; von der Hagen, Maja; Papenhagen, Katharina; Roessner, Veit; Beste, Christian

    2017-01-01

    There are large overlaps in the cognitive deficits occurring in attention deficit disorder (ADD) and neurodevelopmental disorders like neurofibromatosis type 1 (NF1). This overlap is mostly based on clinical measures and not on in-depth analyses of neuronal mechanisms. However, the consideration of such neuronal underpinnings is crucial when aiming to integrate measures that can lead to a better understanding of the underlying mechanisms. Inhibitory control deficits, for example, are a hallmark of ADD, but it is unclear to what extent similar deficits exist in NF1. We thus compared adolescent ADD and NF1 patients to healthy controls in a Go/Nogo task using behavioural and neurophysiological measures. Clinical measures of ADD symptoms did not differ between ADD and NF1. Only patients with ADD showed increased Nogo errors and reductions in components reflecting response inhibition (i.e. the Nogo-P3). Early perceptual processes (P1) were changed in both ADD and NF1. Clinically, patients with ADD and NF1 thus show strong similarities. This is not the case for the underlying cognitive control processes. This shows that in-depth analyses of neurophysiological processes are needed to determine whether the overlap between ADD and NF1 is as strong as assumed, and to develop appropriate treatment strategies. PMID:28262833

  18. Regression Analysis of a Disease Onset Distribution Using Diagnosis Data

    PubMed Central

    Young, Jessica G.; Jewell, Nicholas P.; Samuels, Steven J.

    2008-01-01

    Summary We consider methods for estimating the effect of a covariate on a disease onset distribution when the observed data structure consists of right-censored data on diagnosis times and current status data on onset times amongst individuals who have not yet been diagnosed. Dunson and Baird (2001, Biometrics 57, 306–403) approached this problem using maximum likelihood, under the assumption that the ratio of the diagnosis and onset distributions is monotonic nondecreasing. As an alternative, we propose a two-step estimator, an extension of the approach of van der Laan, Jewell, and Petersen (1997, Biometrika 84, 539–554) in the single sample setting, which is computationally much simpler and requires no assumptions on this ratio. A simulation study is performed comparing estimates obtained from these two approaches, as well as that from a standard current status analysis that ignores diagnosis data. Results indicate that the Dunson and Baird estimator outperforms the two-step estimator when the monotonicity assumption holds, but the reverse is true when the assumption fails. The simple current status estimator loses only a small amount of precision in comparison to the two-step procedure but requires monitoring time information for all individuals. In the data that motivated this work, a study of uterine fibroids and chemical exposure to dioxin, the monotonicity assumption is seen to fail. Here, the two-step and current status estimators both show no significant association between the level of dioxin exposure and the hazard for onset of uterine fibroids; the two-step estimator of the relative hazard associated with increasing levels of exposure has the least estimated variance amongst the three estimators considered. PMID:17680832

  19. Approximate Analysis for Interlaminar Stresses in Composite Structures with Thickness Discontinuities

    NASA Technical Reports Server (NTRS)

    Rose, Cheryl A.; Starnes, James H., Jr.

    1996-01-01

    An efficient, approximate analysis for calculating complete three-dimensional stress fields near regions of geometric discontinuities in laminated composite structures is presented. An approximate three-dimensional local analysis is used to determine the detailed local response due to far-field stresses obtained from a global two-dimensional analysis. The stress results from the global analysis are used as traction boundary conditions for the local analysis. A generalized plane deformation assumption is made in the local analysis to reduce the solution domain to two dimensions. This assumption allows out-of-plane deformation to occur. The local analysis is based on the principle of minimum complementary energy and uses statically admissible stress functions that have an assumed through-the-thickness distribution. Examples are presented to illustrate the accuracy and computational efficiency of the local analysis. Comparisons of the results of the present local analysis with the corresponding results obtained from a finite element analysis and from an elasticity solution are presented. These results indicate that the present local analysis predicts the stress field accurately. Computer execution-times are also presented. The demonstrated accuracy and computational efficiency of the analysis make it well suited for parametric and design studies.

  20. Robust Bayesian Factor Analysis

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Yuan, Ke-Hai

    2003-01-01

    Bayesian factor analysis (BFA) assumes the normal distribution of the current sample conditional on the parameters. Practical data in social and behavioral sciences typically have significant skewness and kurtosis. If the normality assumption is not attainable, the posterior analysis will be inaccurate, although the BFA depends less on the current…

  1. Recent advances in (soil moisture) triple collocation analysis

    USDA-ARS?s Scientific Manuscript database

    To date, triple collocation (TC) analysis is one of the most important methods for the global scale evaluation of remotely sensed soil moisture data sets. In this study we review existing implementations of soil moisture TC analysis as well as investigations of the assumptions underlying the method....
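
    In its covariance notation, the TC estimator of the three error variances takes only a few lines; a sketch under the textbook assumptions (zero-mean, mutually uncorrelated errors, and all three products linearly calibrated to the same truth):

        import numpy as np

        def tc_error_variances(x, y, z):
            c = np.cov(np.vstack([x, y, z]))
            ex2 = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
            ey2 = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
            ez2 = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
            return ex2, ey2, ez2

        # synthetic check: truth plus independent noise of known variance
        rng = np.random.default_rng(4)
        truth = rng.normal(0.25, 0.06, 3000)        # "soil moisture"
        x = truth + rng.normal(0, 0.02, truth.size)
        y = truth + rng.normal(0, 0.04, truth.size)
        z = truth + rng.normal(0, 0.03, truth.size)
        print([round(v, 5) for v in tc_error_variances(x, y, z)])
        # approx [0.0004, 0.0016, 0.0009]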

  2. The Use of Propensity Scores in Mediation Analysis

    ERIC Educational Resources Information Center

    Jo, Booil; Stuart, Elizabeth A.; MacKinnon, David P.; Vinokur, Amiram D.

    2011-01-01

    Mediation analysis uses measures of hypothesized mediating variables to test theory for how a treatment achieves effects on outcomes and to improve subsequent treatments by identifying the most efficient treatment components. Most current mediation analysis methods rely on untested distributional and functional form assumptions for valid…

  3. Impact of Anxiety and/or Depressive Disorders and Chronic Somatic Diseases on disability and work impairment.

    PubMed

    Bokma, Wicher A; Batelaan, Neeltje M; van Balkom, Anton J L M; Penninx, Brenda W J H

    2017-03-01

    Anxiety and/or Depressive Disorders (ADDs) and Chronic Somatic Diseases (CSDs) are associated with substantial levels of health-related disability and work impairment. However, it is unclear whether comorbid ADDs and CSDs additively affect functional outcomes. This paper examines the impact of ADDs, CSDs, and their comorbidity on disability, work absenteeism, and presenteeism. Baseline data from the Netherlands Study of Depression and Anxiety (n=2371) were used. We assessed the presence of current ADDs (using psychiatric interviews, CIDI) and the presence of self-reported CSDs. Outcome measures were disability scores (WHO-DAS II questionnaire, overall and domain-specific), work absenteeism (≤2 weeks and >2 weeks; TiC-P) and presenteeism (reduced and impaired work performance; TiC-P). We conducted multivariate regression analyses adjusted for socio-demographics. Both ADDs and CSDs significantly and independently impacted total disability, but the impact was substantially larger for ADDs (main effect unstandardized β=20.1, p<.001) than for CSDs (main effect unstandardized β=3.88, p<.001). There was a positive interaction between ADDs and CSDs on disability (unstandardized β interaction=4.06, p=.004). Although CSDs also induce absenteeism (OR for extended absenteeism=1.42, p=.015) and presenteeism (OR for impaired work performance=1.42, p=.013), associations with ADDs were stronger (OR for extended absenteeism=6.64, p<.001; OR for impaired work performance=7.51, p<.001). Both CSDs and ADDs cause substantial disability, work absenteeism and presenteeism, but the impact of ADDs far exceeds that of CSDs. CSDs and ADDs interact synergistically on disability, thereby bolstering the current view that patients with physical-mental comorbidity (PM-comorbidity) form a severe subgroup with an unfavourable prognosis. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Pregabalin as mono- or add-on therapy for patients with refractory chronic neuropathic pain: a post-marketing prescription-event monitoring study.

    PubMed

    Lampl, Christian; Schweiger, Christine; Haider, Bernhard; Lechner, Anita

    2010-08-01

    This observational study examined the outcome of two different therapeutic strategies in the treatment of chronic neuropathic pain by including pregabalin (PGB) as mono- or add-on therapy in one of two treatment options. Patients with a pain score of > or =4, refractory to usual care for neuropathic pain for at least 6 months, were allocated consecutively to one of two treatment strategies according to the decision of the physician: complete switch to a flexible-dosage, monotherapeutic or add-on therapy with pregabalin (PGB group), or change established doses and combinations of pre-existing mono- or combination therapy without pregabalin (non-PGB group). After 4 weeks (primary endpoint) a significant improvement in pain reduction was documented in both intention-to treat (ITT) analysis (PGB group, n = 85: mean pain score reduction of 3.53, SD 2.03, p < 0.001; non-PGB group, n = 102; mean pain score reduction of 2.83, SD 2.23, p < 0.001) and per-protocol (PP) analysis (PGB group, n = 79: mean pain score reduction 3.53 vs. 2.83, p < 0.05; non-PGB group, n = 81; 3.5 vs. 2.9, p < 0.05) compared to baseline. Comparison of the results observed in the two groups shows that patients in the PGB group achieved significantly greater pain reduction. These results demonstrate that PGB administered twice daily is superior to treatment regimes without PGB in reducing pain and pain-related interference in quality of life.

  5. US adolescents’ friendship networks and health risk behaviors: a systematic review of studies using social network analysis and Add Health data

    PubMed Central

    Goodson, Patricia

    2015-01-01

    Background. Documented trends in health-related risk behaviors among US adolescents have remained high over time. Studies indicate relationships among mutual friends are a major influence on adolescents’ risky behaviors. Social Network Analysis (SNA) can help understand friendship ties affecting individual adolescents’ engagement in these behaviors. Moreover, a systematic literature review can synthesize findings from a range of studies using SNA, as well as assess these studies’ methodological quality. Review findings also can help health educators and promoters develop more effective programs. Objective. This review systematically examined studies of the influence of friendship networks on adolescents’ risk behaviors, which utilized SNA and the Add Health data (a nationally representative sample). Methods. We employed the Matrix Method to synthesize and evaluate 15 published studies that met our inclusion and exclusion criteria, retrieved from the Add Health website and 3 major databases (Medline, Eric, and PsycINFO). Moreover, we assigned each study a methodological quality score (MQS). Results. In all studies, friendship networks among adolescents promoted their risky behaviors, including drinking alcohol, smoking, sexual intercourse, and marijuana use. The average MQS was 4.6, an indicator of methodological rigor (scale: 1–9). Conclusion. Better understanding of risky behaviors influenced by friends can be useful for health educators and promoters, as programs targeting friendships might be more effective. Additionally, the overall MQ of these reviewed studies was good, as average scores fell above the scale’s mid-point. PMID:26157622
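
    As a toy illustration of the kind of tie-level measure such studies compute (not code from any reviewed paper), the sketch below derives a simple peer-exposure measure, the fraction of an adolescent's friends engaging in a behavior, straight from the friendship graph.

        import networkx as nx

        # toy friendship network; the attribute marks a risk behavior
        G = nx.Graph([(1, 2), (1, 3), (2, 3), (3, 4), (4, 5)])
        smokes = {1: True, 2: False, 3: True, 4: False, 5: False}

        for node in G:
            friends = list(G.neighbors(node))
            exposure = sum(smokes[f] for f in friends) / len(friends)
            print(node, round(exposure, 2))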

  6. Agent-based Training: Facilitating Knowledge and Skill Acquisition in a Modern Space Operations Team

    DTIC Science & Technology

    2002-04-01

    face, and being careful to not add to existing problems such as limited display space. This required us to work closely with members of the SBIRS operational community and use research tools such as cognitive task analysis methods.

  7. Relating Data and Models to Characterize Parameter and Prediction Uncertainty

    EPA Science Inventory

    Applying PBPK models in risk analysis requires that we realistically assess the uncertainty of relevant model predictions in as quantitative a way as possible. The reality of human variability may add a confusing feature to the overall uncertainty assessment, as uncertainty and v...

  8. CHI during an ohmic discharge in HIT-II

    NASA Astrophysics Data System (ADS)

    Mueller, Dennis; Nelson, Brian A.; Redd, Aaron J.; Hamp, William T.

    2004-11-01

    Coaxial Helicity Injection (CHI) has been used on the National Spherical Torus Experiment (NSTX), the Helicity Injected Torus (HIT), and HIT-II to initiate plasma and to drive up to 400 kA of toroidal current. The primary goal of the CHI systems is to provide a start-up plasma with substantial toroidal current that can be heated and sustained by other methods. We have investigated the use of CHI systems to add current to an established, inductively driven plasma. This may be an attractive method of adding edge current to modify the stability characteristics of the discharge or the particle and energy transport in a spherical torus. For example, divertor biasing experiments have been successful in modifying particle and energy transport in the scrape-off layer of tokamaks. Use of IGBT power supplies to modulate the injector current makes analysis of current penetration feasible through EFIT comparisons of the discharge before and after CHI.

  9. PDS4 vs PDS3 - A Comparison of PDS Data for Two Mars Rovers - Existing Mars Curiosity Mission Mass Spectrometer (SAM) PDS3 Data vs Future ExoMars Rover Mass Spectrometer (MOMA) PDS4 Data

    NASA Astrophysics Data System (ADS)

    Lyness, E.; Franz, H. B.; Prats, B.

    2017-12-01

    The Sample Analysis at Mars (SAM) instrument is a suite of instruments aboard the Mars Science Laboratory rover on Mars. Centered on a mass spectrometer, SAM delivers its data to the PDS Atmospheres node in PDS3 format. Over five years on Mars, the process of operating SAM has evolved and extended significantly from the plan in place at the time the PDS3 delivery specification was written. For instance, SAM commonly receives double or even triple sample aliquots from the rover's drill. SAM also stores samples in spare cups for long periods of time for future analysis. These unanticipated operational changes mean that the PDS data deliveries lack some valuable metadata without which the data can be confusing. The Mars Organic Molecule Analyzer (MOMA) instrument is another suite of instruments centered on a mass spectrometer bound for Mars. MOMA is part of the European ExoMars rover mission, scheduled to arrive on Mars in 2021. While SAM and MOMA differ in some important scientific ways - MOMA uses a linear ion trap compared to the SAM quadrupole mass spectrometer, and MOMA has a laser desorption experiment that SAM lacks - the data content from the PDS point of view is comparable. Both instruments produce data containing mass spectra acquired from solid samples collected on the surface of Mars. The MOMA PDS delivery will make use of PDS4 improvements to provide a metadata context for the data. The MOMA PDS4 specification makes few assumptions about the operational processes. Instead, it provides a means for the MOMA operators to supply the important contextual metadata that was unanticipated during specification development. Further, the software tools being developed for instrument operators will provide a means for the operators to add this crucial metadata at the time it is best known - during operations.

  10. Resistivity method contribution in determining of fault zone and hydro-geophysical characteristics of carbonate aquifer, eastern desert, Egypt

    NASA Astrophysics Data System (ADS)

    Ammar, A. I.; Kamal, K. A.

    2018-03-01

    Determining the fault zones and hydro-geophysical characteristics of fractured aquifers is complicated, because their fractures are controlled by different factors. Therefore, 60 vertical electrical soundings (VESs) were carried out, together with data from 17 productive wells, to determine the locations of the fault zones and the characteristics of the carbonate aquifer in the eastern desert, Egypt. The general curve type of the recorded rock units was QKH. These curves were used to delineate the fault zones by applying the new assumptions. The main aquifer was located at the end of the K-type segment and the front of the H-type segment. The subsurface layers were classified into seven different geoelectric layers. The fractured shaly limestone and fractured limestone layers formed the main aquifer, and their resistivity ranged from low to medium (11-93 Ω m). The hydro-geophysical properties of this aquifer were determined, including the areas of very high, high, and intermediate fracture density with high groundwater accumulation, salinity, shale content, porosity distribution, and groundwater recharge and flow. Statistical analysis showed that the aquifer resistivity depends on water salinity (T.D.S.) and water resistivity, in addition to fracture density and shale content. Increases in T.D.S. were controlled by Na+, Cl-, Ca2+, Mg2+, and then (SO4)2-, respectively. The porosity was calculated, and its average value was 19%. Hydrochemical analysis showed that the groundwater is brackish, with cation concentrations ordered Na+ > Ca2+ > Mg2+ > K+ and anion concentrations ordered Cl- > (SO4)2- > HCO3- > (CO3)2-. The groundwater is characterized by sodium-bicarbonate and sodium-sulfate genetic water types and is meteoric in origin. Hence, the DC-resistivity method can be used to delineate fault zones and determine the hydro-geophysical characteristics of fractured aquifers, provided the quality of measurements and interpretation is taken into account.

  11. Integrating the dimensions of sex and gender into basic life sciences research: methodologic and ethical issues.

    PubMed

    Holdcroft, Anita

    2007-01-01

    The research process -- from study design and selecting a species and its husbandry, through the experiment, analysis, peer review, and publication -- is rarely subject to questions about sex or gender differences in mainstream life sciences research. However, the impact of sex and gender on these processes is important in explaining biological variations and presentation of symptoms and diseases. This review aims to challenge assumptions and to develop opportunities to mainstream sex and gender in basic scientific research. Questions about the mechanisms of sex and gender effects were reviewed in relation to biological, environmental, social, and psychological interactions. Gender variations, in respect to aging, socializing, and reproduction, that are present in human populations but are rarely featured in laboratory research were considered to more effectively translate animal research into clinical health care. Methodologic approaches to address the present lack of a gender dimension in research include actively reducing variations through attention to physical factors, biological rhythms, and experimental design. In addition, through genomic and acute nongenomic activity, hormones may compound effects through multiple small sex differences that occur during the course of an acute pathologic event. Furthermore, the many exogenous sex steroid hormones and their congeners used in medicine (eg, in contraception and cancer therapies) may add to these effects. The studies reviewed provide evidence that sex and gender are determinants of many outcomes in life science research. To embed the gender dimension into basic scientific research, a broad approach -- gender mainstreaming -- is warranted. One example is the use of review boards (eg, animal ethical review boards and journal peer-review boards) in which gender-related standardized questions can be asked about study design and analysis. A more fundamental approach is to question the relevance of present-day laboratory models to design methods to best represent the age-related changes, comorbidity, and variations experienced by each sex in clinical medicine.

  12. MODELING SNAKE MICROHABITAT FROM RADIOTELEMETRY STUDIES USING POLYTOMOUS LOGISTIC REGRESSION

    EPA Science Inventory

    Multivariate analysis of snake microhabitat has historically used techniques that were derived under assumptions of normality and common covariance structure (e.g., discriminant function analysis, MANOVA). In this study, polytomous logistic regression (PLR which does not require ...

  13. Application of the Bootstrap Methods in Factor Analysis.

    ERIC Educational Resources Information Center

    Ichikawa, Masanori; Konishi, Sadanori

    1995-01-01

    A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or unsatisfied. Problems arising with the use of bootstrap methods are highlighted. (SLD)

  14. Critical assessment of inverse gas chromatography as means of assessing surface free energy and acid-base interaction of pharmaceutical powders.

    PubMed

    Telko, Martin J; Hickey, Anthony J

    2007-10-01

    Inverse gas chromatography (IGC) has been employed as a research tool for decades. Despite this record of use and proven utility in a variety of applications, the technique is not routinely used in pharmaceutical research. In other fields the technique has flourished. IGC is experimentally relatively straightforward, but analysis requires that certain theoretical assumptions are satisfied. The assumptions made to acquire some of the recently reported data are somewhat modified compared to initial reports. Most publications in the pharmaceutical literature have made use of a simplified equation for the determination of acid/base surface properties resulting in parameter values that are inconsistent with prior methods. In comparing the surface properties of different batches of alpha-lactose monohydrate, new data has been generated and compared with literature to allow critical analysis of the theoretical assumptions and their importance to the interpretation of the data. The commonly used (simplified) approach was compared with the more rigorous approach originally outlined in the surface chemistry literature. (c) 2007 Wiley-Liss, Inc.
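
    One standard IGC calculation, the Schultz treatment of n-alkane retention data, recovers the dispersive surface energy from the slope of RT ln(Vn) against a(gamma_l^d)^(1/2). A sketch of that calculation (the probe constants are the commonly tabulated Schultz values; the retention volumes are hypothetical, not measured lactose data):

        import numpy as np

        R, T, NA = 8.314, 303.15, 6.022e23
        # a * sqrt(gamma_l^d) per probe (a in m^2, gamma in J/m^2)
        x = np.array([
            5.15e-19 * np.sqrt(18.4e-3),    # hexane
            5.73e-19 * np.sqrt(20.3e-3),    # heptane
            6.30e-19 * np.sqrt(21.3e-3),    # octane
            6.90e-19 * np.sqrt(22.7e-3),    # nonane
        ])
        vn = np.array([2.1, 4.9, 11.5, 26.0])    # hypothetical Vn (mL);
        # the volume unit shifts only the intercept, not the slope
        slope, _ = np.polyfit(x, R * T * np.log(vn), 1)
        gd_s = (slope / (2 * NA)) ** 2           # J/m^2
        print(f"dispersive surface energy ~ {gd_s * 1e3:.1f} mJ/m^2")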

  15. Critical frontier of the triangular Ising antiferromagnet in a field

    NASA Astrophysics Data System (ADS)

    Qian, Xiaofeng; Wegewijs, Maarten; Blöte, Henk W.

    2004-03-01

    We study the critical line of the triangular Ising antiferromagnet in an external magnetic field by means of a finite-size analysis of results obtained by transfer-matrix and Monte Carlo techniques. We compare the shape of the critical line with predictions of two different theoretical scenarios. Both scenarios, while plausible, involve assumptions. The first scenario is based on the generalization of the model to a vertex model, and on the assumption that the exact analytic form of the critical manifold of this vertex model is determined by the zeroes of an O(2) gauge-invariant polynomial in the vertex weights. However, it is not possible to fit the coefficients of such polynomials of orders up to 10 so as to reproduce the numerical data for the critical points. The second theoretical prediction is based on the assumption that a renormalization mapping of the Ising model onto the Coulomb gas exists, and on an analysis of the resulting renormalization equations. It leads to a shape of the critical line that is inconsistent with the first prediction, but consistent with the numerical data.

  16. Testing Mean Differences among Groups: Multivariate and Repeated Measures Analysis with Minimal Assumptions

    PubMed Central

    Bathke, Arne C.; Friedrich, Sarah; Pauly, Markus; Konietschke, Frank; Staffen, Wolfgang; Strobl, Nicolas; Höller, Yvonne

    2018-01-01

    To date, there is a lack of satisfactory inferential techniques for the analysis of multivariate data in factorial designs, when only minimal assumptions on the data can be made. Presently available methods are limited to very particular study designs or assume either multivariate normality or equal covariance matrices across groups, or they do not allow for an assessment of the interaction effects across within-subjects and between-subjects variables. We propose and methodologically validate a parametric bootstrap approach that does not suffer from any of the above limitations, and thus provides a rather general and comprehensive methodological route to inference for multivariate and repeated measures data. As an example application, we consider data from two different Alzheimer’s disease (AD) examination modalities that may be used for precise and early diagnosis, namely, single-photon emission computed tomography (SPECT) and electroencephalogram (EEG). These data violate the assumptions of classical multivariate methods, and indeed classical methods would not have yielded the same conclusions with regard to some of the factors involved. PMID:29565679
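
    The following is a minimal sketch of the parametric bootstrap idea for a two-group multivariate comparison without assuming equal covariances or normality-based critical values: the null distribution of a Wald-type statistic is approximated by resampling each group from a normal distribution with its own estimated covariance. The data and group sizes are hypothetical, and the full method covers general factorial and repeated measures designs.

```python
# Sketch: parametric bootstrap test for equality of multivariate group means,
# without assuming equal covariance matrices (simplified two-group version).
import numpy as np

rng = np.random.default_rng(0)

def wald_stat(x, y):
    """Wald-type statistic for H0: mu_x == mu_y with unequal covariances."""
    nx, ny = len(x), len(y)
    d = x.mean(axis=0) - y.mean(axis=0)
    v = np.cov(x, rowvar=False) / nx + np.cov(y, rowvar=False) / ny
    return d @ np.linalg.solve(v, d)

def pboot_test(x, y, n_boot=2000):
    """Parametric bootstrap: resample each group from N(0, S_g) to mimic H0."""
    t_obs = wald_stat(x, y)
    sx, sy = np.cov(x, rowvar=False), np.cov(y, rowvar=False)
    zero = np.zeros(x.shape[1])
    t_null = np.array([
        wald_stat(rng.multivariate_normal(zero, sx, size=len(x)),
                  rng.multivariate_normal(zero, sy, size=len(y)))
        for _ in range(n_boot)
    ])
    return t_obs, (t_null >= t_obs).mean()

# Hypothetical data: two groups, 3 correlated outcomes, unequal covariances.
x = rng.multivariate_normal([0, 0, 0], np.diag([1.0, 2.0, 0.5]), size=25)
y = rng.multivariate_normal([0.8, 0, 0], np.diag([3.0, 1.0, 1.5]), size=30)
t, p = pboot_test(x, y)
print(f"Wald-type statistic = {t:.2f}, bootstrap p = {p:.3f}")
```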

  17. A pattern-mixture model approach for handling missing continuous outcome data in longitudinal cluster randomized trials.

    PubMed

    Fiero, Mallorie H; Hsu, Chiu-Hsieh; Bell, Melanie L

    2017-11-20

    We extend the pattern-mixture approach to handle missing continuous outcome data in longitudinal cluster randomized trials, which randomize groups of individuals to treatment arms, rather than the individuals themselves. Individuals who drop out at the same time point are grouped into the same dropout pattern. We approach extrapolation of the pattern-mixture model by applying multilevel multiple imputation, which imputes missing values while appropriately accounting for the hierarchical data structure found in cluster randomized trials. To assess parameters of interest under various missing data assumptions, imputed values are multiplied by a sensitivity parameter, k, which increases or decreases imputed values. Using simulated data, we show that estimates of parameters of interest can vary widely under differing missing data assumptions. We conduct a sensitivity analysis using real data from a cluster randomized trial by increasing k until the treatment effect inference changes. By performing a sensitivity analysis for missing data, researchers can assess whether certain missing data assumptions are reasonable for their cluster randomized trial.
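
    A minimal sketch of the tipping-point logic follows, using single-level imputation for brevity (the paper's approach is multilevel, matching the clustered design): imputed outcomes are multiplied by a sensitivity parameter k and the treatment effect is re-estimated across a range of k. The data and the naive imputation model are hypothetical.

```python
# Sketch: tipping-point sensitivity analysis for missing outcomes.
# Imputed values are scaled by k; we re-test the treatment effect for each k.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200
treat = rng.integers(0, 2, n)
y = 1.0 + 0.5 * treat + rng.normal(0, 1, n)
missing = rng.random(n) < 0.25              # 25% dropout, hypothetical

for k in (1.5, 1.25, 1.0, 0.75, 0.5, 0.25):  # inflate or shrink imputed values
    y_imp = y.copy()
    for arm in (0, 1):
        idx = missing & (treat == arm)
        # naive imputation: observed arm mean plus noise, then scaled by k
        mu = y[~missing & (treat == arm)].mean()
        y_imp[idx] = k * (mu + rng.normal(0, 1, idx.sum()))
    t, p = stats.ttest_ind(y_imp[treat == 1], y_imp[treat == 0])
    print(f"k = {k:.2f}: treatment-effect p-value = {p:.3f}")
```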

  18. A capture-recapture survival analysis model for radio-tagged animals

    USGS Publications Warehouse

    Pollock, K.H.; Bunck, C.M.; Winterstein, S.R.; Chen, C.-L.; North, P.M.; Nichols, J.D.

    1995-01-01

    In recent years, survival analysis of radio-tagged animals has developed using methods based on the Kaplan-Meier method used in medical and engineering applications (Pollock et al., 1989a,b). An important assumption of this approach is that all tagged animals with a functioning radio can be relocated at each sampling time with probability 1. This assumption may not always be reasonable in practice. In this paper, we show how a general capture-recapture model can be derived which allows for some probability (less than one) for animals to be relocated. This model is not simply a Jolly-Seber model because it is possible to relocate both dead and live animals, unlike when traditional tagging is used. The model can also be viewed as a generalization of the Kaplan-Meier procedure, thus linking the Jolly-Seber and Kaplan-Meier approaches to survival estimation. We present maximum likelihood estimators and discuss testing between submodels. We also discuss model assumptions and their validity in practice. An example is presented based on canvasback data collected by G. M. Haramis of Patuxent Wildlife Research Center, Laurel, Maryland, USA.
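
    For reference, here is a minimal sketch of the classical Kaplan-Meier product-limit estimate under the assumption the paper relaxes (every live animal is relocated at each occasion); the times and events are hypothetical.

```python
# Sketch: Kaplan-Meier survival estimate for radio-tagged animals under the
# classical assumption that every live animal is relocated at each occasion
# (the assumption the capture-recapture generalization relaxes).
import numpy as np

# Hypothetical (time, event) pairs: event=1 death observed, event=0 censored
# (e.g., radio failure). Times are sampling occasions.
times  = np.array([2, 3, 3, 5, 6, 6, 7, 9, 9, 10])
events = np.array([1, 1, 0, 1, 1, 1, 0, 1, 0, 0])

s = 1.0
for t in np.unique(times[events == 1]):
    at_risk = (times >= t).sum()          # animals still monitored at t
    deaths = ((times == t) & (events == 1)).sum()
    s *= 1 - deaths / at_risk             # product-limit step
    print(f"t = {t}: at risk = {at_risk}, deaths = {deaths}, S(t) = {s:.3f}")
```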

  19. Causal Models for Mediation Analysis: An Introduction to Structural Mean Models.

    PubMed

    Zheng, Cheng; Atkins, David C; Zhou, Xiao-Hua; Rhew, Isaac C

    2015-01-01

    Mediation analyses are critical to understanding why behavioral interventions work. To yield a causal interpretation, common mediation approaches must make an assumption of "sequential ignorability." The current article describes an alternative approach to causal mediation called structural mean models (SMMs). A specific SMM called a rank-preserving model (RPM) is introduced in the context of an applied example. Particular attention is given to the assumptions of both approaches to mediation. Applying both mediation approaches to the college student drinking data yields notable differences in the magnitude of effects. Simulated examples reveal instances in which the traditional approach can yield strongly biased results, whereas the RPM approach remains unbiased in these cases. At the same time, the RPM approach has its own assumptions that must be met for correct inference, such as the existence of a covariate that strongly moderates the effect of the intervention on the mediator and no unmeasured confounders that also serve as a moderator of the effect of the intervention or the mediator on the outcome. The RPM approach to mediation offers an alternative way to perform mediation analysis when there may be unmeasured confounders.

  20. Propensity score estimation: machine learning and classification methods as alternatives to logistic regression

    PubMed Central

    Westreich, Daniel; Lessler, Justin; Funk, Michele Jonsson

    2010-01-01

    Objective: Propensity scores for the analysis of observational data are typically estimated using logistic regression. Our objective in this Review was to assess machine learning alternatives to logistic regression which may accomplish the same goals but with fewer assumptions or greater accuracy. Study Design and Setting: We identified alternative methods for propensity score estimation and/or classification from the public health, biostatistics, discrete mathematics, and computer science literature, and evaluated these algorithms for applicability to the problem of propensity score estimation, potential advantages over logistic regression, and ease of use. Results: We identified four techniques as alternatives to logistic regression: neural networks, support vector machines, decision trees (CART), and meta-classifiers (in particular, boosting). Conclusion: While the assumptions of logistic regression are well understood, those assumptions are frequently ignored. All four alternatives have advantages and disadvantages compared with logistic regression. Boosting (meta-classifiers) and to a lesser extent decision trees (particularly CART) appear to be most promising for use in the context of propensity score analysis, but extensive simulation studies are needed to establish their utility in practice. PMID:20630332
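
    A minimal sketch of the comparison follows, with hypothetical confounder data: propensity scores estimated by logistic regression and by boosting, then turned into inverse-probability-of-treatment weights. scikit-learn is assumed for the estimators.

```python
# Sketch: propensity score estimation with logistic regression versus boosting.
# The data-generating process and all values are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(2)
n = 1000
x = rng.normal(size=(n, 4))                 # measured confounders
# treatment depends nonlinearly on confounders, so a plain main-effects
# logistic model is misspecified
logit = 0.5 * x[:, 0] + x[:, 1] ** 2 - 0.8 * x[:, 2] * x[:, 3]
treated = rng.random(n) < 1 / (1 + np.exp(-logit))

ps_logit = LogisticRegression(max_iter=1000).fit(x, treated).predict_proba(x)[:, 1]
ps_boost = GradientBoostingClassifier().fit(x, treated).predict_proba(x)[:, 1]

# Inverse-probability-of-treatment weights from each model
w_logit = np.where(treated, 1 / ps_logit, 1 / (1 - ps_logit))
w_boost = np.where(treated, 1 / ps_boost, 1 / (1 - ps_boost))
print("max IPT weight (logistic):", w_logit.max().round(1))
print("max IPT weight (boosting):", w_boost.max().round(1))
```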

  1. A State Space Modeling Approach to Mediation Analysis

    ERIC Educational Resources Information Center

    Gu, Fei; Preacher, Kristopher J.; Ferrer, Emilio

    2014-01-01

    Mediation is a causal process that evolves over time. Thus, a study of mediation requires data collected throughout the process. However, most applications of mediation analysis use cross-sectional rather than longitudinal data. Another implicit assumption commonly made in longitudinal designs for mediation analysis is that the same mediation…

  2. Commentary: Using Potential Outcomes to Understand Causal Mediation Analysis

    ERIC Educational Resources Information Center

    Imai, Kosuke; Jo, Booil; Stuart, Elizabeth A.

    2011-01-01

    In this commentary, we demonstrate how the potential outcomes framework can help understand the key identification assumptions underlying causal mediation analysis. We show that this framework can lead to the development of alternative research design and statistical analysis strategies applicable to the longitudinal data settings considered by…

  3. Modelling income distribution impacts of water sector projects in Bangladesh.

    PubMed

    Ahmed, C S; Jones, S

    1991-09-01

    Dynamic analysis was conducted to assess the long-term impacts of water sector projects on agricultural income distribution, and sensitivity analysis was conducted to check the robustness of the 5 assumptions in this study of income distribution and water sector projects in Bangladesh. 7 transitions are analyzed for mutually exclusive irrigation and flooding projects: nonirrigation to 1) LLP irrigation, 2) STW irrigation, 3) DTW irrigation, 4) major gravity irrigation, and 5) manually operated shallow tubewell irrigation (MOSTI); and Flood Control Project (FCD) transitions of 6) medium flooded to shallow flooded, and 7) deeply flooded to shallow flooded. 5 analytical stages are involved: 1) farm budgets are derived with and without project cropping patterns for each transition. 2) Estimates are generated for value added/hectare from each transition. 3) Assumptions are made about the number of social classes, distribution of land ownership between classes, extent of tenancy for each social class, terms of tenancy contracts, and extent of hiring of labor for each social class. 4) Annual value added/hectare is distributed among social classes. 5) Using Gini coefficients and simple ratios, the distribution of income between classes is estimated with and without the transition. Assumption I is that there are 4 social classes defined by land acreage: large farmers (5 acres), medium farmers (1.5-5.0), small farmers (.01-1.49), and landless. Assumption II is that land distribution follows the 1978 Land Occupancy Survey (LOS). Biases, if any, are indicated. Assumption III is that large farmers sharecrop out 15% of land to small farmers. Assumption IV is that landlords provide nonirrigated crop land and take 50% of the crop, and, under irrigation, provide 50% of the fertilizer, pesticide, and irrigation costs and take 50% of the crop. Assumption V is that hired labor is assumed to be 40% for small farmers, 60% for medium farmers, and 80% for large farmers, with the remainder being family labor. It is understood that the analysis is partially complete, since there is no assessment of the impact on nonagricultural income and employment, or of secondary impacts such as demand for irrigation equipment, services for processing, manufacture and transport services, or investment of new agricultural surpluses. Few empirical studies have been done and the estimates apply only to individual project areas. The results show that inequality is greatest with major (gravity) irrigation, followed by STW, DTW and LLP, FCD (medium to shallow), FCD (deep to shallow), and the most equitable, MOSTI. In terms of absolute income accruing to the rural poor, major gravity irrigation ranks first in raising households above the poverty line, followed by MOSTI, minor irrigation (STW, DTW, and LLP), and FCD schemes.
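
    A minimal sketch of the stage-5 computation, assuming grouped data (class mean incomes and class population shares): the Gini coefficient is computed from the Lorenz curve with and without a hypothetical project transition. All figures are illustrative, not the study's estimates.

```python
# Sketch: Gini coefficient for grouped income data, the kind of summary used
# to compare income distribution "with" and "without" a project transition.
import numpy as np

def gini(incomes, weights):
    """Gini coefficient from (class mean income, class population share) pairs."""
    order = np.argsort(incomes)
    inc = np.asarray(incomes, float)[order]
    w = np.asarray(weights, float)[order]
    p = np.concatenate(([0.0], np.cumsum(w) / w.sum()))                # cumulative population
    L = np.concatenate(([0.0], np.cumsum(inc * w) / (inc * w).sum()))  # Lorenz curve
    area = np.sum((p[1:] - p[:-1]) * (L[1:] + L[:-1]) / 2)             # trapezoid rule
    return 1 - 2 * area

shares  = [0.30, 0.40, 0.20, 0.10]      # landless, small, medium, large (hypothetical)
without = [100, 300, 700, 2000]         # value added per class, pre-project (hypothetical)
with_p  = [130, 380, 1000, 3200]        # post-project (hypothetical)
print(f"Gini without project: {gini(without, shares):.3f}")
print(f"Gini with project:    {gini(with_p, shares):.3f}")
```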

  4. A priori assumptions about characters as a cause of incongruence between molecular and morphological hypotheses of primate interrelationships.

    PubMed

    Tornow, Matthew A; Skelton, Randall R

    2012-01-01

    When molecules and morphology produce incongruent hypotheses of primate interrelationships, the data are typically viewed as incompatible, and molecular hypotheses are often considered to be better indicators of phylogenetic history. However, it has been demonstrated that the choice of which taxa to include in cladistic analysis as well as assumptions about character weighting, character state transformation order, and outgroup choice all influence hypotheses of relationships and may positively influence tree topology, so that relationships between extant taxa are consistent with those found using molecular data. Thus, the source of incongruence between morphological and molecular trees may lie not in the morphological data themselves but in assumptions surrounding the ways characters evolve and their impact on cladistic analysis. In this study, we investigate the role that assumptions about character polarity and transformation order play in creating incongruence between primate phylogenies based on morphological data and those supported by multiple lines of molecular data. By releasing constraints imposed on published morphological analyses of primates from disparate clades and subjecting those data to parsimony analysis, we test the hypothesis that incongruence between morphology and molecules results from inherent flaws in morphological data. To quantify the difference between incongruent trees, we introduce a new method called branch slide distance (BSD). BSD mitigates many of the limitations attributed to other tree comparison methods, thus allowing for a more accurate measure of topological similarity. We find that releasing a priori constraints on character behavior often produces trees that are consistent with molecular trees. Case studies are presented that illustrate how congruence between molecules and unconstrained morphological data may provide insight into issues of polarity, transformation order, homology, and homoplasy.

  5. On Recruiting: A Multivariate Analysis of Marine Corps Recruiters and the Market

    DTIC Science & Technology

    Three recommendations result from this study. The quantitative recommendation developed in this thesis is to add approximately three missioned...canvassing recruiters per Recruiting Station, or 144 total, where the marginal cost of the 1,400 potentially gained contracts is the most economical manpower

  6. Medicating for ADD/ADHD: Personal and Social Issues

    ERIC Educational Resources Information Center

    Davis-Berman, Jennifer L.; Pestello, Frances G.

    2010-01-01

    Twenty college students from a private Midwestern university were interviewed about their past and present experiences with taking medication for Attention Deficit Disorder. Analysis of respondent interviews suggested the following themes that were discussed and analyzed: recruitment of the young, little personal stigma, societal issues, side…

  7. Encouraging Gender Analysis in Research Practice

    ERIC Educational Resources Information Center

    Thien, Deborah

    2009-01-01

    Few resources for practical teaching or fieldwork exercises exist which address gender in geographical contexts. This paper adds to teaching and fieldwork resources by describing an experience with designing and implementing a "gender intervention" for a large-scale, multi-university, bilingual research project that brought together a group of…

  8. 40 CFR Appendix A to Part 136 - Methods for Organic Chemical Analysis of Municipal and Industrial Wastewater

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the nearest 0.1 mg. 6.5.2 Add the assayed reference material: 6.5.2.1 Liquid—Using a 100 µL syringe... chromatograph and all required accessories including syringes, analytical columns, and gases. The injection port...

  9. 40 CFR Appendix A to Part 136 - Methods for Organic Chemical Analysis of Municipal and Industrial Wastewater

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the nearest 0.1 mg. 6.5.2 Add the assayed reference material: 6.5.2.1 Liquid—Using a 100 µL syringe... chromatograph and all required accessories including syringes, analytical columns, and gases. The injection port...

  10. Estimation of kinematic parameters in CALIFA galaxies: no-assumption on internal dynamics

    NASA Astrophysics Data System (ADS)

    García-Lorenzo, B.; Barrera-Ballesteros, J.; CALIFA Team

    2016-06-01

    We propose a simple approach to homogeneously estimate kinematic parameters of a broad variety of galaxies (elliptical, spirals, irregulars or interacting systems). This methodology avoids the use of any kinematical model or any assumption on internal dynamics. This simple but novel approach allows us to determine: the frequency of kinematic distortions, systemic velocity, kinematic center, and kinematic position angles, which are directly measured from the two-dimensional distributions of radial velocities. We test our analysis tools using the CALIFA Survey

  11. Methodology for estimating helicopter performance and weights using limited data

    NASA Technical Reports Server (NTRS)

    Baserga, Claudio; Ingalls, Charles; Lee, Henry; Peyran, Richard

    1990-01-01

    Methodology is developed and described for estimating the flight performance and weights of a helicopter for which limited data are available. The methodology is based on assumptions which couple knowledge of the technology of the helicopter under study with detailed data from well documented helicopters thought to be of similar technology. The approach, analysis assumptions, technology modeling, and the use of reference helicopter data are discussed. Application of the methodology is illustrated with an investigation of the Agusta A129 Mangusta.

  12. 77 FR 56896 - Self-Regulatory Organizations; EDGA Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-14

    ... the relevant flags, as described below, for orders that add liquidity to the EDGA book. Specifically... the following flags: Flag B for orders that add liquidity to the EDGA book in Tape B securities; Flag V for orders that add liquidity to the EDGA book in Tape A securities; Flag Y for orders that add...

  13. 40 CFR 63.4167 - How do I establish the emission capture system and add-on control device operating limits during...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... system and add-on control device operating limits during the performance test? 63.4167 Section 63.4167... Emission Rate with Add-on Controls Option § 63.4167 How do I establish the emission capture system and add-on control device operating limits during the performance test? During the performance test required...

  14. Finite element techniques in computational time series analysis of turbulent flows

    NASA Astrophysics Data System (ADS)

    Horenko, I.

    2009-04-01

    In recent years there has been a considerable increase of interest in the mathematical modeling and analysis of complex systems that undergo transitions between several phases or regimes. Such systems can be found, e.g., in weather forecasting (transitions between weather conditions), climate research (ice ages and warm ages), computational drug design (conformational transitions) and in econometrics (e.g., transitions between different phases of the market). In all cases, the accumulation of sufficiently detailed time series has led to the formation of huge databases, containing enormous but still undiscovered treasures of information. However, the extraction of essential dynamics and identification of the phases is usually hindered by the multidimensional nature of the signal, i.e., the information is "hidden" in the time series. The standard filtering approaches (e.g., wavelet-based spectral methods) have in general infeasible numerical complexity in high dimensions; other standard methods (e.g., Kalman filter, MVAR, ARCH/GARCH, etc.) impose strong assumptions about the type of the underlying dynamics. An approach based on optimization of a specially constructed regularized functional (describing the quality of the data description in terms of a certain number of specified models) will be introduced. Based on this approach, several new adaptive mathematical methods for simultaneous EOF/SSA-like data-based dimension reduction and identification of hidden phases in high-dimensional time series will be presented. The methods exploit the topological structure of the analysed data and do not impose severe assumptions on the underlying dynamics. Special emphasis will be placed on the mathematical assumptions and numerical cost of the constructed methods. The application of the presented methods will first be demonstrated on a toy example and the results will be compared with the ones obtained by standard approaches. The importance of accounting for the mathematical assumptions used in the analysis will be pointed out in this example. Finally, applications to the analysis of meteorological and climate data will be presented.
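
    A toy sketch of the core idea, stripped of the finite element discretization and the regularization term: alternate between fitting K local AR(1) models and reassigning each time step to the model that fits it best, so that the hidden regimes emerge from minimizing the model-fit functional. The series and regime structure are synthetic.

```python
# Sketch: toy regime identification by alternating minimization of a
# model-fit functional (no persistence regularization, unlike the full
# FEM-based approach described above).
import numpy as np

rng = np.random.default_rng(3)
# Synthetic series with a switch in AR(1) dynamics at t = 200
a_true = [0.9, -0.5]
x = np.zeros(400)
for t in range(1, 400):
    a = a_true[0] if t < 200 else a_true[1]
    x[t] = a * x[t - 1] + rng.normal(0, 0.5)

K = 2
labels = rng.integers(0, K, 399)            # regime affiliation of each step
for _ in range(20):
    # fit step: least-squares AR(1) coefficient per regime
    coef = np.array([
        (x[:-1][labels == k] @ x[1:][labels == k]) /
        max(x[:-1][labels == k] @ x[:-1][labels == k], 1e-12)
        for k in range(K)
    ])
    # assignment step: each step goes to the regime with smallest residual
    resid = (x[1:][None, :] - coef[:, None] * x[:-1][None, :]) ** 2
    labels = resid.argmin(axis=0)
print("estimated AR coefficients per regime:", coef.round(2))
```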

  15. Can organizations benefit from worksite health promotion?

    PubMed Central

    Leviton, L C

    1989-01-01

    A decision-analytic model was developed to project the future effects of selected worksite health promotion activities on employees' likelihood of chronic disease and injury and on employer costs due to illness. The model employed a conservative set of assumptions and a limited five-year time frame. Under these assumptions, hypertension control and seat belt campaigns prevent a substantial amount of illness, injury, and death. Sensitivity analysis indicates that these two programs pay for themselves and under some conditions show a modest savings to the employer. Under some conditions, smoking cessation programs pay for themselves, preventing a modest amount of illness and death. Cholesterol reduction by behavioral means does not pay for itself under these assumptions. These findings imply priorities in prevention for employer and employee alike. PMID:2499556
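
    A back-of-envelope sketch of the decision-analytic arithmetic: expected cases averted times employer cost per case, compared against program cost. Every number below is a hypothetical placeholder, not a figure from the model.

```python
# Sketch: expected annual net benefit of a worksite program under one set of
# hypothetical assumptions (the model above varies these in sensitivity analysis).
employees = 1000
program_cost = 25_000            # per year (hypothetical)
baseline_risk = 0.02             # annual risk of a costly illness event (hypothetical)
relative_risk_reduction = 0.15   # program effect (hypothetical)
cost_per_event = 12_000          # employer cost of one event (hypothetical)

cases_averted = employees * baseline_risk * relative_risk_reduction
net_benefit = cases_averted * cost_per_event - program_cost
print(f"cases averted/yr: {cases_averted:.1f}, net benefit: ${net_benefit:,.0f}")
```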

  16. The Weberian Legacy of Thom Greenfield.

    ERIC Educational Resources Information Center

    Samier, Eugenie

    1996-01-01

    Traces through Thomas Greenfield's work his use of Max Weber's interpretive social analysis, including Weber's view of the individual unit of analysis, value typologies, comparative history methods, and analytical ideal typologies. Compares Greenfield's and Weber's metaphysical assumptions, ontological perspectives, and epistemological frameworks.…

  17. Adding biofuel/bioproduct capacity to existing U.S. mills. Part 1, Options : Agenda 2020 analysis charts a course.

    Treesearch

    Tom Belin; Craig Brown; Eric Connor; Jim Frederick; Peter Ince; Ryan Katofsky; Gerard Closset

    2008-01-01

    The chief technology officers of the American Forest & Paper Association’s Agenda 2020 Technology Alliance recently conducted an analysis of the most feasible and effective routes for forest products facilities in this country to add energy, biofuels and bio-based chemicals to their existing product streams. Considering that at least 21 billion gallons of the...

  18. Rotor systems research aircraft predesign study. Volume 3: Predesign report

    NASA Technical Reports Server (NTRS)

    Schmidt, S. A.; Linden, A. W.

    1972-01-01

    The features of two aircraft designs were selected to be included in the single RSRA configuration. A study was conducted for further preliminary design and a more detailed analysis of development plans and costs. An analysis was also made of foreseeable technical problems and risks, identification of parallel research which would reduce risks and/or add to the basic capability of the aircraft, and a draft aircraft specification.

  19. GEO Collisional Risk Assessment Based on Analysis of NASA-WISE Data and Modeling

    DTIC Science & Technology

    2015-10-18

    GEO Collisional Risk Assessment Based on Analysis of NASA-WISE Data and Modeling. Jeremy Murray Krezan, Samantha Howard, Phan D. Dao (AFRL Space Vehicles Directorate); Derek...Surka (Applied Technology Associates Incorporated). From December 2009 through 2011 the NASA Wide-Field Infrared...of known debris. The NASA-WISE GEO belt debris population adds potentially thousands of previously uncataloged objects. This paper describes

  20. The Teacher, the Physician and the Person: Exploring Causal Connections between Teaching Performance and Role Model Types Using Directed Acyclic Graphs

    PubMed Central

    Boerebach, Benjamin C. M.; Lombarts, Kiki M. J. M. H.; Scherpbier, Albert J. J.; Arah, Onyebuchi A.

    2013-01-01

    Background: In fledgling areas of research, evidence supporting causal assumptions is often scarce due to the small number of empirical studies conducted. In many studies it remains unclear what impact explicit and implicit causal assumptions have on the research findings; only the primary assumptions of the researchers are often presented. This is particularly true for research on the effect of faculty’s teaching performance on their role modeling. Therefore, there is a need for robust frameworks and methods for transparent formal presentation of the underlying causal assumptions used in assessing the causal effects of teaching performance on role modeling. This study explores the effects of different (plausible) causal assumptions on research outcomes. Methods: This study revisits a previously published study about the influence of faculty’s teaching performance on their role modeling (as teacher-supervisor, physician and person). We drew eight directed acyclic graphs (DAGs) to visually represent different plausible causal relationships between the variables under study. These DAGs were subsequently translated into corresponding statistical models, and regression analyses were performed to estimate the associations between teaching performance and role modeling. Results: The different causal models were compatible with major differences in the magnitude of the relationship between faculty’s teaching performance and their role modeling. Odds ratios for the associations between teaching performance and the three role model types ranged from 31.1 to 73.6 for the teacher-supervisor role, from 3.7 to 15.5 for the physician role, and from 2.8 to 13.8 for the person role. Conclusions: Different sets of assumptions about causal relationships in role modeling research can be visually depicted using DAGs, which are then used to guide both statistical analysis and interpretation of results. Since study conclusions can be sensitive to different causal assumptions, results should be interpreted in the light of causal assumptions made in each study. PMID:23936020
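
    A minimal sketch of how DAG-implied adjustment sets change an estimated association, using simulated data; the variable names are illustrative stand-ins, not the study's measures, and statsmodels is assumed for the regressions.

```python
# Sketch: the same exposure-outcome association estimated under three
# DAG-implied adjustment sets, on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 2000
confounder = rng.normal(size=n)                    # e.g., clinical workload (hypothetical)
teaching = 0.8 * confounder + rng.normal(size=n)   # exposure
mediator = 0.6 * teaching + rng.normal(size=n)     # e.g., learner contact (hypothetical)
role_model = (0.7 * teaching + 0.5 * mediator + 0.9 * confounder
              + rng.normal(size=n)) > 1.0          # binary outcome

for name, cols in [("unadjusted", [teaching]),
                   ("adjust confounder (DAG A)", [teaching, confounder]),
                   ("adjust mediator too (DAG B)", [teaching, confounder, mediator])]:
    X = sm.add_constant(np.column_stack(cols))
    fit = sm.Logit(role_model.astype(float), X).fit(disp=0)
    print(f"{name}: OR for teaching = {np.exp(fit.params[1]):.2f}")
```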

  1. Public-private partnerships to improve primary healthcare surgeries: clarifying assumptions about the role of private provider activities.

    PubMed

    Mudyarabikwa, Oliver; Tobi, Patrick; Regmi, Krishna

    2017-07-01

    Aim: To examine assumptions about public-private partnership (PPP) activities and their role in improving public procurement of primary healthcare surgeries. PPPs were developed to improve the quality of care and patient satisfaction. However, evidence of their effectiveness in delivering health benefits is limited. A qualitative study design was employed. A total of 25 interviews with public sector staff (n=23) and private sector managers (n=2) were conducted to understand their interpretations of assumptions in the activities of private investors and service contractors participating in Local Improvement Finance Trust (LIFT) partnerships. Realist evaluation principles were applied in the data analysis to interpret the findings. Six thematic areas of assumed health benefits were identified: (i) quality improvement; (ii) improved risk management; (iii) reduced procurement costs; (iv) increased efficiency; (v) community involvement; and (vi) sustainable investment. Primary Care Trusts that chose to procure their surgeries through LIFT were expected to support its implementation by providing an environment conducive for the private participants to achieve these benefits. Private participant activities were found to be based on a range of explicit and tacit assumptions perceived helpful in achieving government objectives for LIFT. The success of PPPs depended upon private participants' (i) capacity to assess how PPP assumptions added value to their activities, (ii) effectiveness in interpreting assumptions in their expected activities, and (iii) preparedness to align their business principles to government objectives for PPPs. They risked missing some of the expected benefits because certain factors constrained realization of the assumptions. The ways in which private participants preferred to carry out their activities also influenced the extent to which expected benefits were achieved. Giving more discretion to public than private participants over critical decisions may help in ensuring that assumptions in PPP activities result in outcomes that match the anticipated health benefits.

  2. Experiments on Nucleation in Different Flow Regimes

    NASA Technical Reports Server (NTRS)

    Bayuzick, Robert J.

    1999-01-01

    The vast majority of metallic engineering materials are solidified from the liquid phase. Understanding the solidification process is essential to control microstructure, which in turn, determines the properties of materials. The genesis of solidification is nucleation, where the first stable solid forms from the liquid phase. Nucleation kinetics determine the degree of undercooling and phase selection. As such, it is important to understand nucleation phenomena in order to control solidification or glass formation in metals and alloys. Early experiments in nucleation kinetics were accomplished by droplet dispersion methods [1-6]. Dilatometry was used by Turnbull and others, and more recently differential thermal analysis and differential scanning calorimetry have been used for kinetic studies. These techniques have enjoyed success; however, there are difficulties with these experiments. Since materials are dispersed in a medium, the character of the emulsion/metal interface affects the nucleation behavior. Statistics are derived from the large number of particles observed in a single experiment, but dispersions have a finite size distribution which adds to the uncertainty of the kinetic determinations. Even though temperature can be controlled quite well before the onset of nucleation, the release of the latent heat of fusion during nucleation of particles complicates the assumption of isothermality during these experiments. Containerless processing has enabled another approach to the study of nucleation kinetics [7]. With levitation techniques it is possible to undercool one sample to nucleation repeatedly in a controlled manner, such that the statistics of the nucleation process can be derived from multiple experiments on a single sample. The authors have fully developed the analysis of nucleation experiments on single samples following the suggestions of Skripov [8]. The advantage of these experiments is that the samples are directly observable. The nucleation temperature can be measured by noncontact optical pyrometry, the mass of the sample is known, and post processing analysis can be conducted on the sample. The disadvantages are that temperature measurement must have exceptionally high precision, and it is not possible to isolate specific heterogeneous sites as in droplet dispersions.

  3. MicroRNA Expression in Alpha and Beta Cells of Human Pancreatic Islets

    PubMed Central

    Vargas, Nancy; Rosero, Samuel; Piroso, Julieta; Ichii, Hirohito; Umland, Oliver; Zhijie, Jiang; Tsinoremas, Nicholas; Ricordi, Camillo; Inverardi, Luca; Domínguez-Bendala, Juan; Pastori, Ricardo L.

    2013-01-01

    microRNAs (miRNAs) play an important role in pancreatic development and adult β-cell physiology. Our hypothesis is based on the assumption that each islet cell type has a specific pattern of miRNA expression. We sought to determine the profile of miRNA expression in α- and β-cells, the main components of pancreatic islets, because this analysis may lead to a better understanding of islet gene regulatory pathways. Highly enriched (>98%) subsets of human α- and β-cells were obtained by flow cytometric sorting after intracellular staining with c-peptide and glucagon antibody. The method of sorting based on intracellular staining is possible because miRNAs are stable after fixation. MiRNA expression levels were determined by quantitative high-throughput PCR-based miRNA array platform screening. Most of the miRNAs were preferentially expressed in β-cells. From the total of 667 miRNAs screened, the Significance Analysis of Microarrays identified 141 miRNAs, of which only 7 were expressed more in α-cells (α-miRNAs) and 134 were expressed more in β-cells (β-miRNAs). Bioinformatic analysis identified potential targets of β-miRNAs by analyzing the Beta Cell Gene Atlas, described in T1Dbase, the web platform supporting the type 1 diabetes (T1D) community. cMaf, a transcription factor regulating glucagon expression that is expressed selectively in α-cells (TFα), is targeted by the β-miRNAs miR-200c, miR-125b, and miR-182. Min6 cells treated with inhibitors of these miRNAs show an increased expression of cMaf RNA. Conversely, overexpression of miR-200c, miR-125b or miR-182 in the mouse alpha cell line αTC6 decreases the level of cMAF mRNA and protein. MiR-200c also inhibits the expression of Zfpm2, a TFα that inhibits the PI3K signaling pathway, at both RNA and protein levels. In conclusion, we identified miRNAs differentially expressed in pancreatic α- and β-cells and their potential transcription factor targets that could add new insights into different aspects of islet biology and pathophysiology. PMID:23383059

  4. MicroRNA expression in alpha and beta cells of human pancreatic islets.

    PubMed

    Klein, Dagmar; Misawa, Ryosuke; Bravo-Egana, Valia; Vargas, Nancy; Rosero, Samuel; Piroso, Julieta; Ichii, Hirohito; Umland, Oliver; Zhijie, Jiang; Tsinoremas, Nicholas; Ricordi, Camillo; Inverardi, Luca; Domínguez-Bendala, Juan; Pastori, Ricardo L

    2013-01-01

    microRNAs (miRNAs) play an important role in pancreatic development and adult β-cell physiology. Our hypothesis is based on the assumption that each islet cell type has a specific pattern of miRNA expression. We sought to determine the profile of miRNA expression in α- and β-cells, the main components of pancreatic islets, because this analysis may lead to a better understanding of islet gene regulatory pathways. Highly enriched (>98%) subsets of human α- and β-cells were obtained by flow cytometric sorting after intracellular staining with c-peptide and glucagon antibody. The method of sorting based on intracellular staining is possible because miRNAs are stable after fixation. MiRNA expression levels were determined by quantitative high-throughput PCR-based miRNA array platform screening. Most of the miRNAs were preferentially expressed in β-cells. From the total of 667 miRNAs screened, the Significance Analysis of Microarrays identified 141 miRNAs, of which only 7 were expressed more in α-cells (α-miRNAs) and 134 were expressed more in β-cells (β-miRNAs). Bioinformatic analysis identified potential targets of β-miRNAs by analyzing the Beta Cell Gene Atlas, described in T1Dbase, the web platform supporting the type 1 diabetes (T1D) community. cMaf, a transcription factor regulating glucagon expression that is expressed selectively in α-cells (TFα), is targeted by the β-miRNAs miR-200c, miR-125b, and miR-182. Min6 cells treated with inhibitors of these miRNAs show an increased expression of cMaf RNA. Conversely, overexpression of miR-200c, miR-125b or miR-182 in the mouse alpha cell line αTC6 decreases the level of cMAF mRNA and protein. MiR-200c also inhibits the expression of Zfpm2, a TFα that inhibits the PI3K signaling pathway, at both RNA and protein levels. In conclusion, we identified miRNAs differentially expressed in pancreatic α- and β-cells and their potential transcription factor targets that could add new insights into different aspects of islet biology and pathophysiology.

  5. Consistency tests for the extraction of the Boer-Mulders and Sivers functions

    NASA Astrophysics Data System (ADS)

    Christova, E.; Leader, E.; Stoilov, M.

    2018-03-01

    At present, the Boer-Mulders (BM) function for a given quark flavor is extracted from data on semi-inclusive deep inelastic scattering (SIDIS) using the simplifying assumption that it is proportional to the Sivers function for that flavor. In a recent paper, we suggested that the consistency of this assumption could be tested using information on so-called difference asymmetries, i.e., the difference between the asymmetries in the production of particles and their antiparticles. In this paper, using the SIDIS COMPASS deuteron data on the ⟨cos ϕh⟩, ⟨cos 2 ϕh⟩ and Sivers difference asymmetries, we carry out two independent consistency tests of the assumption of proportionality, but here applied to the sum of the valence-quark contributions. We find that such an assumption is compatible with the data. We also show that the proportionality assumptions made in the existing parametrizations of the BM functions are not compatible with our analysis, which suggests that the published results for the Boer-Mulders functions for individual flavors are unreliable. The ⟨cos ϕh⟩ and ⟨cos 2 ϕh⟩ asymmetries receive contributions also from the, in principle, calculable Cahn effect. We succeed in extracting the Cahn contributions from experiment (we believe for the first time) and compare them with their calculated values, with interesting implications.

  6. Alternative Fuels Data Center: Ozinga Adds 14 Natural Gas Concrete Mixers

    Science.gov Websites


  7. 40 CFR 63.4767 - How do I establish the emission capture system and add-on control device operating limits during...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system and add-on control device operating limits during the performance test? 63.4767 Section 63.4767... Rate with Add-on Controls Option § 63.4767 How do I establish the emission capture system and add-on control device operating limits during the performance test? During the performance test required by § 63...

  8. 40 CFR 63.3546 - How do I establish the emission capture system and add-on control device operating limits during...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... system and add-on control device operating limits during the performance test? 63.3546 Section 63.3546... device or system of multiple capture devices. The average duct static pressure is the maximum operating... Add-on Controls Option § 63.3546 How do I establish the emission capture system and add-on control...

  9. 40 CFR 63.4966 - How do I establish the emission capture system and add-on control device operating limits during...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... system and add-on control device operating limits during the performance test? 63.4966 Section 63.4966... outlet gas temperature is the maximum operating limit for your condenser. (e) Emission capture system... Emission Rate with Add-on Controls Option § 63.4966 How do I establish the emission capture system and add...

  10. 40 CFR 63.3546 - How do I establish the emission capture system and add-on control device operating limits during...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system and add-on control device operating limits during the performance test? 63.3546 Section 63.3546... device or system of multiple capture devices. The average duct static pressure is the maximum operating... Add-on Controls Option § 63.3546 How do I establish the emission capture system and add-on control...

  11. 40 CFR 63.4767 - How do I establish the emission capture system and add-on control device operating limits during...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... system and add-on control device operating limits during the performance test? 63.4767 Section 63.4767... Rate with Add-on Controls Option § 63.4767 How do I establish the emission capture system and add-on control device operating limits during the performance test? During the performance test required by § 63...

  12. 40 CFR 63.4966 - How do I establish the emission capture system and add-on control device operating limits during...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system and add-on control device operating limits during the performance test? 63.4966 Section 63.4966... outlet gas temperature is the maximum operating limit for your condenser. (e) Emission capture system... with Add-on Controls Option § 63.4966 How do I establish the emission capture system and add-on control...

  13. 40 CFR 63.4966 - How do I establish the emission capture system and add-on control device operating limits during...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... system and add-on control device operating limits during the performance test? 63.4966 Section 63.4966... outlet gas temperature is the maximum operating limit for your condenser. (e) Emission capture system... with Add-on Controls Option § 63.4966 How do I establish the emission capture system and add-on control...

  14. 40 CFR 63.4167 - How do I establish the emission capture system and add-on control device operating limits during...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... system and add-on control device operating limits during the performance test? 63.4167 Section 63.4167... with Add-on Controls Option § 63.4167 How do I establish the emission capture system and add-on control device operating limits during the performance test? During the performance test required by § 63.4160...

  15. 40 CFR 63.4167 - How do I establish the emission capture system and add-on control device operating limits during...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system and add-on control device operating limits during the performance test? 63.4167 Section 63.4167... with Add-on Controls Option § 63.4167 How do I establish the emission capture system and add-on control device operating limits during the performance test? During the performance test required by § 63.4160...

  16. 40 CFR 63.3169 - What are the requirements for a capture system or add-on control device which is not taken into...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... systems or add-on control devices which you choose not to take into account when demonstrating compliance... system or add-on control device which is not taken into account when demonstrating compliance with the....3169 What are the requirements for a capture system or add-on control device which is not taken into...

  17. Pricing and components analysis of some key essential pediatric medicine in Odisha state.

    PubMed

    Samal, Satyajit; Swain, Trupti Rekha

    2017-01-01

    Studies highlighting the prices that patients actually pay at ground level are important for interventions such as alternate procurement schemes or for expediting regulatory assessment of essential medicines for children. The present study was undertaken to analyze the pricing and price components of a few key essential medicines in Odisha state. Six child-specific medicines of different formulations were selected based on use in different disease conditions and on having the widest pricing variation. Data were collected, entered, and analyzed in the price components data collection form of the World Health Organization-Health Action International (WHO-HAI) 2007 Workbook version 5 - Part II, provided as part of the WHO/HAI methodology. The analysis includes the cumulative percent markup, total cumulative percent markup, and percent contribution of individual components to the final medicine price in both the public and private sectors of Odisha state. Add-on costs such as taxes, wholesale markups, and retail markups contribute substantially to the final price of medicines in the private sector, particularly for branded-generic products. The largest contributor to add-on costs is at the retail level. Policy should be framed to achieve greater transparency and uniformity of medicine pricing across the different health sectors of Odisha.
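
    A minimal sketch of the cumulative percent markup computation at the heart of a WHO/HAI price components analysis; the stage markups below are hypothetical placeholders, not Odisha figures.

```python
# Sketch: cumulative percent markup along a medicine supply chain.
# Each stage multiplies the running price by (1 + markup); all values illustrative.
msp = 10.00                                  # manufacturer's selling price
stages = [("insurance & freight", 0.02),
          ("importer margin",     0.05),
          ("taxes",               0.12),
          ("wholesale markup",    0.10),
          ("retail markup",       0.20)]

price = msp
for stage, markup in stages:
    price *= 1 + markup
    print(f"after {stage:<20} price = {price:6.2f}")
print(f"total cumulative markup = {(price / msp - 1) * 100:.1f}%")
```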

  18. Air concentrations of PBDEs on in-flight airplanes and assessment of flight crew inhalation exposure.

    PubMed

    Allen, Joseph G; Sumner, Ann Louise; Nishioka, Marcia G; Vallarino, Jose; Turner, Douglas J; Saltman, Hannah K; Spengler, John D

    2013-07-01

    To address the knowledge gaps regarding inhalation exposure of flight crew to polybrominated diphenyl ethers (PBDEs) on airplanes, we measured PBDE concentrations in air samples collected in the cabin at cruising altitudes and used Bayesian Decision Analysis (BDA) to evaluate the likelihood that inhalation exposure would cause the average daily dose (ADD) of a member of the flight crew to exceed EPA Reference Doses (RfDs), accounting for all other aircraft and non-aircraft exposures. A total of 59 air samples were collected from different aircraft and analyzed for four PBDE congeners: BDE 47, 99, 100, and 209 (a subset was also analyzed for BDE 183). For congeners with a published RfD, high estimates of ADD were calculated for all non-aircraft exposure pathways and for non-inhalation exposure onboard aircraft; inhalation exposure limits were then derived from the difference between the RfD and the ADDs for all other exposure pathways. The 95th percentile measured concentrations of PBDEs in aircraft air were <1% of the derived inhalation exposure limits. Likelihood probabilities that 95th percentile exposure concentrations exceed 1% of the defined exposure limit were zero for all congeners with published RfDs.
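
    A minimal sketch of the exposure arithmetic, assuming the generic average-daily-dose formulation (concentration × inhalation rate × exposure time / body weight) and hypothetical parameter values throughout:

```python
# Sketch: inhalation ADD versus the inhalation "budget" left under an RfD
# after other pathways are accounted for. All values are hypothetical placeholders.
c_air = 0.5e-6        # PBDE air concentration, mg/m^3 (hypothetical)
inh_rate = 1.3        # in-flight inhalation rate, m^3/h (hypothetical)
hours_per_day = 4.5   # average daily flight hours for crew (hypothetical)
body_weight = 70.0    # kg

add_inhalation = c_air * inh_rate * hours_per_day / body_weight  # mg/kg-day
rfd = 1e-4                    # reference dose, mg/kg-day (placeholder)
other_pathways = 0.6 * rfd    # high estimate of all non-inhalation ADDs (hypothetical)

limit = rfd - other_pathways  # inhalation exposure limit under the RfD
print(f"inhalation ADD   = {add_inhalation:.2e} mg/kg-day")
print(f"inhalation limit = {limit:.2e}; fraction used = {add_inhalation / limit:.1%}")
```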

  19. Depression requiring anti-depressant drug therapy in adult congenital heart disease: prevalence, risk factors, and prognostic value.

    PubMed

    Diller, Gerhard-Paul; Bräutigam, Andrea; Kempny, Aleksander; Uebing, Anselm; Alonso-Gonzalez, Rafael; Swan, Lorna; Babu-Narayan, Sonya V; Baumgartner, Helmut; Dimopoulos, Konstantinos; Gatzoulis, Michael A

    2016-03-01

    Depression is prevalent in adults with congenital heart disease (ACHD), but limited data on the frequency of anti-depressant drug (ADD) therapy and its impact on outcome are available. We identified all ACHD patients treated with ADDs between 2000 and 2011 at our centre. Of 6162 patients under follow-up, 204 (3.3%) patients were on ADD therapy. The majority of patients were treated with selective serotonin-reuptake inhibitors (67.4%), while only 17.0% of patients received tricyclic anti-depressants. Twice as many female patients used ADDs compared with males (4.4 vs. 2.2%, P < 0.0001). The percentage of patients on ADDs increased with disease complexity (P < 0.0001) and patient age (P < 0.0001). Over a median follow-up of 11.1 years, 507 (8.2%) patients died. After propensity score matching, ADD use was found to be significantly associated with worse outcome in male ACHD patients [hazard ratio 1.44 (95% confidence interval 1.17-1.84)]. There was no evidence that this excess mortality was directly related to ADD therapy, QT-prolongation, or malignant arrhythmias. However, males taking ADDs were also more likely to miss scheduled follow-up appointments compared with untreated counterparts, while no such difference in clinic attendance was seen in females. The use of ADD therapy in ACHD relates to gender, age, and disease complexity. Although twice as many female patients were on ADDs, it was their male counterparts who were at increased mortality risk on therapy. Furthermore, males on ADDs had worse adherence to scheduled appointments, suggesting the need for special medical attention and possibly psychosocial intervention for this group of patients.

  20. Willingness To Pay for Information: An Analyst's Guide.

    ERIC Educational Resources Information Center

    Lee, Kyung Hee; Hatcher, Charles B.

    2001-01-01

    Compares methods for estimating consumer willingness to pay for information: contingent valuation, experimental auction, conjoint analysis, and hedonic price equations. Shows how, in the case of food dating, measurement of willingness is complicated by the question of whether the information adds to the product's value. (Contains 31 references.)…
