Sample records for model includes multiple

  1. Predicting Upwelling Radiance on the West Florida Shelf

    DTIC Science & Technology

    2006-03-31

    National Science Foundation. The chemical and biological model includes the ability to simulate multiple groups of phytoplankton, multiple limiting nutrients, spectral light harvesting by phytoplankton, multiple particulate and dissolved degradational pools of organic material, and non-stoichiometric carbon, nitrogen, phosphorus, silica, and iron dynamics. It also includes a complete spectral light model for the prediction of Inherent Optical Properties (IOPs). The coupling of the predicted IOP model (Ecosim 2.0) with a robust radiative transfer model (Ecolight

  2. Treatment of Fragile X Syndrome with a Neuroactive Steroid

    DTIC Science & Technology

    2015-08-01

    in the fragile X mouse model and the Drosophila (fruit fly) models of FXS that the GABAA system, including multiple receptors, is dramatically down-regulated. Ganaxolone is a drug that…

  3. A Cognitive Diagnosis Model for Cognitively Based Multiple-Choice Options

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2009-01-01

    Cognitive or skills diagnosis models are discrete latent variable models developed specifically for the purpose of identifying the presence or absence of multiple fine-grained skills. However, applications of these models typically involve dichotomous or dichotomized data, including data from multiple-choice (MC) assessments that are scored as…

  4. A Multiple Deficit Model of Reading Disability and Attention-Deficit/Hyperactivity Disorder: Searching for Shared Cognitive Deficits

    ERIC Educational Resources Information Center

    McGrath, Lauren M.; Pennington, Bruce F.; Shanahan, Michelle A.; Santerre-Lemmon, Laura E.; Barnard, Holly D.; Willcutt, Erik G.; DeFries, John C.; Olson, Richard K.

    2011-01-01

    Background: This study tests a multiple cognitive deficit model of reading disability (RD), attention-deficit/hyperactivity disorder (ADHD), and their comorbidity. Methods: A structural equation model (SEM) of multiple cognitive risk factors and symptom outcome variables was constructed. The model included phonological awareness as a unique…

  5. Extension of the ADC Charge-Collection Model to Include Multiple Junctions

    NASA Technical Reports Server (NTRS)

    Edmonds, Larry D.

    2011-01-01

    The ADC model is a charge-collection model derived for simple p-n junction silicon diodes having a single reverse-biased p-n junction at one end and an ideal substrate contact at the other end. The present paper extends the model to include multiple junctions, and the goal is to estimate how collected charge is shared by the different junctions.

  6. Petri net modelling of buffers in automated manufacturing systems.

    PubMed

    Zhou, M; Dicesare, F

    1996-01-01

    This paper presents Petri net models of buffers and a methodology by which buffers can be included in a system without introducing deadlocks or overflows. The context is automated manufacturing. The buffers and models are classified as random order or order preserved (first-in-first-out or last-in-first-out), single-input-single-output or multiple-input-multiple-output, part type and/or space distinguishable or indistinguishable, and bounded or safe. Theoretical results for the development of Petri net models which include buffer modules are developed. This theory provides the conditions under which the system properties of boundedness, liveness, and reversibility are preserved. The results are illustrated through two manufacturing system examples: a multiple machine and multiple buffer production line and an automatic storage and retrieval system in the context of flexible manufacturing.
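
    The sketch below is a minimal token-game illustration of the idea in this record, assuming a bounded FIFO buffer modelled as two places ("free slots" and "items") whose total token count is invariant; the class and method names are invented for illustration and are not the paper's formalism.

```python
from collections import deque

# Minimal token-game sketch (not the paper's formalism): a bounded FIFO buffer
# modelled as a Petri net with two places, "slots" (free capacity) and "items".
# Firing "put" moves a token from slots to items; "take" does the reverse.
# Boundedness holds because slots + items is invariant (= capacity).

class BoundedBufferNet:
    def __init__(self, capacity):
        self.marking = {"slots": capacity, "items": 0}   # initial marking
        self.fifo = deque()                              # order-preserved variant

    def enabled(self, transition):
        # "put" needs a free-slot token; "take" needs an item token
        return self.marking["slots" if transition == "put" else "items"] > 0

    def fire(self, transition, part=None):
        if not self.enabled(transition):
            raise RuntimeError(f"{transition} not enabled (would deadlock/overflow)")
        if transition == "put":
            self.marking["slots"] -= 1
            self.marking["items"] += 1
            self.fifo.append(part)
            return None
        self.marking["items"] -= 1
        self.marking["slots"] += 1
        return self.fifo.popleft()

buf = BoundedBufferNet(capacity=2)
buf.fire("put", "part-A")
buf.fire("put", "part-B")
print(buf.fire("take"))   # part-A (FIFO); the marking stays bounded by construction
```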

  7. Parallel Computing Using Web Servers and "Servlets".

    ERIC Educational Resources Information Center

    Lo, Alfred; Bloor, Chris; Choi, Y. K.

    2000-01-01

    Describes parallel computing and presents inexpensive ways to implement a virtual parallel computer with multiple Web servers. Highlights include performance measurement of parallel systems; models for using Java and intranet technology including single server, multiple clients and multiple servers, single client; and a comparison of CGI (common…

  8. Integral Methodological Pluralism in Science Education Research: Valuing Multiple Perspectives

    ERIC Educational Resources Information Center

    Davis, Nancy T.; Callihan, Laurie P.

    2013-01-01

    This article examines the multiple methodologies used in educational research and proposes a model that includes all of them as contributing to understanding educational contexts and research from multiple perspectives. The model, based on integral theory (Wilber in a theory of everything. Shambhala, Boston, 2000) values all forms of research as…

  9. Specifying and Refining a Measurement Model for a Computer-Based Interactive Assessment

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.

    2004-01-01

    The challenges of modeling students' performance in computer-based interactive assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance. This article describes a Bayesian approach to modeling and estimating cognitive models…

  10. Flexible Language Constructs for Large Parallel Programs

    DOE PAGES

    Rosing, Matt; Schnabel, Robert

    1994-01-01

    The goal of the research described in this article is to develop flexible language constructs for writing large data parallel numerical programs for distributed memory (multiple instruction multiple data [MIMD]) multiprocessors. Previously, several models have been developed to support synchronization and communication. Models for global synchronization include single instruction multiple data (SIMD), single program multiple data (SPMD), and sequential programs annotated with data distribution statements. The two primary models for communication include implicit communication based on shared memory and explicit communication based on messages. None of these models by themselves seem sufficient to permit the natural and efficient expression of the variety of algorithms that occur in large scientific computations. In this article, we give an overview of a new language that combines many of these programming models in a clean manner. This is done in a modular fashion such that different models can be combined to support large programs. Within a module, the selection of a model depends on the algorithm and its efficiency requirements. In this article, we give an overview of the language and discuss some of the critical implementation details.
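
    As a concrete illustration of the SPMD model mentioned in this record, the sketch below uses present-day mpi4py (an assumption; the article predates it and describes its own language constructs): every rank runs the same program on its own slice of the data and combines partial results with explicit message passing.

```python
# Minimal SPMD illustration (not the language described in the article): every
# rank runs the same program and operates on its own share of the data, with an
# explicit communication step to combine partial results. Requires mpi4py and an
# MPI launcher, e.g. `mpiexec -n 4 python spmd_sum.py`.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 1_000_000
local = np.arange(rank, n, size, dtype=np.float64)  # this rank's share of the data
local_sum = local.sum()

total = comm.reduce(local_sum, op=MPI.SUM, root=0)   # explicit message-based communication
if rank == 0:
    print(f"sum over {size} ranks: {total:.0f}")
```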

  11. Hierarchical, parallel computing strategies using component object model for process modelling responses of forest plantations to interacting multiple stresses

    Treesearch

    J. G. Isebrands; G. E. Host; K. Lenz; G. Wu; H. W. Stech

    2000-01-01

    Process models are powerful research tools for assessing the effects of multiple environmental stresses on forest plantations. These models are driven by interacting environmental variables and often include genetic factors necessary for assessing forest plantation growth over a range of different site, climate, and silvicultural conditions. However, process models are...

  12. Applications of active adaptive noise control to jet engines

    NASA Technical Reports Server (NTRS)

    Shoureshi, Rahmat; Brackney, Larry

    1993-01-01

    During phase 2 research on the application of active noise control to jet engines, the development of multiple-input/multiple-output (MIMO) active adaptive noise control algorithms and acoustic/controls models for turbofan engines was considered. Specific goals for this research phase included: (1) implementation of a MIMO adaptive minimum variance active noise controller; and (2) turbofan engine model development. A minimum variance control law for adaptive active noise control has been developed, simulated, and implemented for single-input/single-output (SISO) systems. Since acoustic systems tend to be distributed, multiple sensors and actuators are more appropriate. As such, the SISO minimum variance controller was extended to the MIMO case. Simulation and experimental results are presented. A state-space model of a simplified gas turbine engine is developed using the bond graph technique. The model retains important system behavior, yet is of low enough order to be useful for controller design. Expansion of the model to include multiple stages and spools is also discussed.

  13. The Mediated MIMIC Model for Understanding the Underlying Mechanism of DIF

    ERIC Educational Resources Information Center

    Cheng, Ying; Shao, Can; Lathrop, Quinn N.

    2016-01-01

    Due to its flexibility, the multiple-indicator, multiple-causes (MIMIC) model has become an increasingly popular method for the detection of differential item functioning (DIF). In this article, we propose the mediated MIMIC model method to uncover the underlying mechanism of DIF. This method extends the usual MIMIC model by including one variable…

  14. Specifying and Refining a Measurement Model for a Simulation-Based Assessment. CSE Report 619.

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.

    2004-01-01

    The challenges of modeling students' performance in simulation-based assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance in a complex assessment. This paper describes a Bayesian approach to modeling and estimating…

  15. Behavioral Modeling of Adversaries with Multiple Objectives in Counterterrorism.

    PubMed

    Mazicioglu, Dogucan; Merrick, Jason R W

    2018-05-01

    Attacker/defender models have primarily assumed that each decisionmaker optimizes the cost of the damage inflicted and its economic repercussions from their own perspective. Two streams of recent research have sought to extend such models. One stream suggests that it is more realistic to consider attackers with multiple objectives, but this research has not included the adaptation of the terrorist with multiple objectives to defender actions. The other stream builds off experimental studies that show that decisionmakers deviate from optimal rational behavior. In this article, we extend attacker/defender models to incorporate multiple objectives that a terrorist might consider in planning an attack. This includes the tradeoffs that a terrorist might consider and their adaptation to defender actions. However, we must also consider experimental evidence of deviations from the rationality assumed in the commonly used expected utility model in determining such adaptation. Thus, we model the attacker's behavior using multiattribute prospect theory to account for the attacker's multiple objectives and deviations from rationality. We evaluate our approach by considering an attacker with multiple objectives who wishes to smuggle radioactive material into the United States and a defender who has the option to implement a screening process to hinder the attacker. We discuss the problems with implementing such an approach, but argue that research in this area must continue to avoid misrepresenting terrorist behavior in determining optimal defensive actions. © 2017 Society for Risk Analysis.
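
    A minimal sketch of the prospect-theory ingredients named in this record, using the standard Tversky-Kahneman value and weighting forms; the parameter values, attribute names, and lottery numbers are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

# Illustrative prospect-theory building blocks (Tversky & Kahneman 1992 forms);
# the paper's actual multiattribute model and calibration are not reproduced here.
ALPHA, BETA, LAMBDA, GAMMA = 0.88, 0.88, 2.25, 0.61

def value(x):
    # S-shaped value function: concave for gains, convex and steeper for losses
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, np.abs(x) ** ALPHA, -LAMBDA * np.abs(x) ** BETA)

def weight(p):
    # Inverse-S probability weighting: small probabilities are overweighted
    p = np.asarray(p, dtype=float)
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

# Hypothetical attacker lottery over two attributes (casualties, publicity),
# combined additively with attribute weights w; outcomes are relative to a
# reference point, and both rows are purely invented numbers.
w = np.array([0.6, 0.4])
outcomes = np.array([[10.0, 5.0],    # attack succeeds
                     [-2.0, 1.0]])   # attack interdicted by screening
probs = np.array([0.3, 0.7])

prospect_value = np.sum(weight(probs) * (value(outcomes) @ w))
print(f"prospect value of this attack option: {prospect_value:.2f}")
```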

  16. A geospatial modelling approach to predict seagrass habitat recovery under multiple stressor regimes

    EPA Science Inventory

    Restoration of estuarine seagrass habitats requires a clear understanding of the modes of action of multiple interacting stressors including nutrients, climate change, coastal land-use change, and habitat modification. We have developed and demonstrated a geospatial modeling a...

  17. The Multiple Component Alternative for Gifted Education.

    ERIC Educational Resources Information Center

    Swassing, Ray

    1984-01-01

    The Multiple Component Model (MCM) of gifted education includes instruction which may overlap in literature, history, art, enrichment, languages, science, physics, math, music, and dance. The model rests on multifactored identification and requires systematic development and selection of components with ongoing feedback and evaluation. (CL)

  18. Development of a Finite Element Model of the Human Shoulder to Investigate the Mechanical Responses and Injuries in Side Impact

    NASA Astrophysics Data System (ADS)

    Iwamoto, Masami; Miki, Kazuo; Yang, King H.

    Previous studies in both fields of automotive safety and orthopedic surgery have hypothesized that immobilization of the shoulder caused by the shoulder injury could be related to multiple rib fractures, which are frequently life threatening. Therefore, for more effective occupant protection, it is important to understand the relationship between shoulder injury and multiple rib fractures in side impact. The purpose of this study is to develop a finite element model of the human shoulder in order to understand this relationship. The shoulder model included three bones (the humerus, scapula and clavicle) and major ligaments and muscles around the shoulder. The model also included approaches to represent bone fractures and joint dislocations. The relationships between shoulder injury and immobilization of the shoulder are discussed using model responses for lateral shoulder impact. It is also discussed how the injury can be related to multiple rib fractures.

  19. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.

  20. Extending Data Worth Analyses to Select Multiple Observations Targeting Multiple Forecasts.

    PubMed

    Vilhelmsen, Troels N; Ferré, Ty P A

    2018-05-01

    Hydrological models are often set up to provide specific forecasts of interest. Owing to the inherent uncertainty in data used to derive model structure and used to constrain parameter variations, the model forecasts will be uncertain. Additional data collection is often performed to minimize this forecast uncertainty. Given our common financial restrictions, it is critical that we identify data with maximal information content with respect to the forecast of interest. In practice, this often devolves to qualitative decisions based on expert opinion. However, there is no assurance that this will lead to optimal design, especially for complex hydrogeological problems. Specifically, these complexities include considerations of multiple forecasts, shared information among potential observations, information content of existing data, and the assumptions and simplifications underlying model construction. In the present study, we extend previous data worth analyses to include: simultaneous selection of multiple new measurements and consideration of multiple forecasts of interest. We show how the suggested approach can be used to optimize data collection. This can be used in a manner that suggests specific measurement sets or that produces probability maps indicating areas likely to be informative for specific forecasts. Moreover, we provide examples documenting that sequential measurement selection approaches often lead to suboptimal designs and that estimates of data covariance should be included when selecting future measurement sets. © 2017, National Ground Water Association.
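
    A minimal linear-Gaussian sketch of the data-worth idea in this record, assuming a prior parameter covariance, candidate observation sensitivities, and a forecast sensitivity vector (all invented); it evaluates candidate measurement pairs jointly by the forecast-variance reduction they buy, which is the point made above about sequential selection being suboptimal.

```python
import numpy as np
from itertools import combinations

# Linear-Gaussian data-worth sketch (a simplification, not the paper's code):
# prior parameter covariance Cp, candidate observation sensitivities H (rows),
# observation noise covariance R, and a forecast sensitivity vector g.
rng = np.random.default_rng(0)
n_par, n_obs = 8, 5
Cp = np.diag(rng.uniform(0.5, 2.0, n_par))        # prior parameter uncertainty
H = rng.normal(size=(n_obs, n_par))               # candidate measurements
R = 0.1 * np.eye(n_obs)
g = rng.normal(size=n_par)                        # d(forecast)/d(parameters)

def forecast_variance(obs_idx):
    """Posterior forecast variance after collecting the observations in obs_idx."""
    if not obs_idx:
        return g @ Cp @ g
    Hs, Rs = H[list(obs_idx)], R[np.ix_(obs_idx, obs_idx)]
    gain = Cp @ Hs.T @ np.linalg.inv(Hs @ Cp @ Hs.T + Rs)
    C_post = Cp - gain @ Hs @ Cp
    return g @ C_post @ g

prior_var = forecast_variance([])
# Jointly evaluate *pairs* of new measurements rather than picking them one at a
# time, since shared information makes greedy sequential selection suboptimal.
best = min(combinations(range(n_obs), 2), key=forecast_variance)
print(f"prior forecast variance:  {prior_var:.3f}")
print(f"best pair {best} reduces it to {forecast_variance(best):.3f}")
```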

  1. System and method for optimal load and source scheduling in context aware homes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shetty, Pradeep; Foslien Graber, Wendy; Mangsuli, Purnaprajna R.

    A controller for controlling energy consumption in a home includes a constraints engine to define variables for multiple appliances in the home corresponding to various home modes and persona of an occupant of the home. A modeling engine models multiple paths of energy utilization of the multiple appliances to place the home into a desired state from a current context. An optimal scheduler receives the multiple paths of energy utilization and generates a schedule as a function of the multiple paths and a selected persona to place the home in a desired state.

  2. Experimental models of demyelination and remyelination.

    PubMed

    Torre-Fuentes, L; Moreno-Jiménez, L; Pytel, V; Matías-Guiu, J A; Gómez-Pinedo, U; Matías-Guiu, J

    2017-08-29

    Experimental animal models constitute a useful tool to deepen our knowledge of central nervous system disorders. In the case of multiple sclerosis, however, there is no such specific model able to provide an overview of the disease; multiple models covering the different pathophysiological features of the disease are therefore necessary. We reviewed the different in vitro and in vivo experimental models used in multiple sclerosis research. Concerning in vitro models, we analysed cell cultures and slice models. As for in vivo models, we examined such models of autoimmunity and inflammation as experimental allergic encephalitis in different animals and virus-induced demyelinating diseases. Furthermore, we analysed models of demyelination and remyelination, including chemical lesions caused by cuprizone, lysolecithin, and ethidium bromide; zebrafish; and transgenic models. Experimental models provide a deeper understanding of the different pathogenic mechanisms involved in multiple sclerosis. Choosing one model or another depends on the specific aims of the study. Copyright © 2017 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.

  3. A Model for Communications Satellite System Architecture Assessment

    DTIC Science & Technology

    2011-09-01

    This is shown in Equation 4. The total system cost includes all development, acquisition, fielding, operations, maintenance and upgrades, and system...protection. A mathematical model was implemented to enable the analysis of communications satellite system architectures based on multiple system attributes. Utilization of the model in

  4. [Prediction model of health workforce and beds in county hospitals of Hunan by multiple linear regression].

    PubMed

    Ling, Ru; Liu, Jiawang

    2011-12-01

    To construct a prediction model for health workforce and hospital beds in county hospitals of Hunan by multiple linear regression. We surveyed 16 counties in Hunan with stratified random sampling according to uniform questionnaires, and multiple linear regression analysis with 20 quotas selected by literature review was done. Independent variables in the multiple linear regression model on medical personnel in county hospitals included the counties' urban residents' income, crude death rate, medical beds, business occupancy, professional equipment value, the number of devices valued above 10 000 yuan, fixed assets, long-term debt, medical income, medical expenses, outpatient and emergency visits, hospital visits, actual available bed days, and utilization rate of hospital beds. Independent variables in the multiple linear regression model on county hospital beds included the population aged 65 and above in the counties, disposable income of urban residents, medical personnel of medical institutions in the county area, business occupancy, the total value of professional equipment, fixed assets, long-term debt, medical income, medical expenses, outpatient and emergency visits, hospital visits, actual available bed days, utilization rate of hospital beds, and length of hospitalization. The prediction models show good explanatory power and fit, and may be used for short- and mid-term forecasting.
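
    A minimal ordinary-least-squares sketch of the kind of prediction model described here, on simulated data; the predictor names and coefficients are illustrative assumptions and none of the surveyed Hunan data are reproduced.

```python
import numpy as np

# OLS sketch of a multiple-linear-regression staffing model (illustrative data
# and predictors only). y = number of medical personnel per county hospital.
rng = np.random.default_rng(1)
n = 16                                           # 16 surveyed counties
X = np.column_stack([
    rng.uniform(15_000, 40_000, n),              # urban residents' income
    rng.uniform(100, 600, n),                    # medical beds
    rng.uniform(50_000, 300_000, n),             # annual outpatient/emergency visits
])
beta_true = np.array([0.002, 0.8, 0.001])
y = 50 + X @ beta_true + rng.normal(0, 20, n)

A = np.column_stack([np.ones(n), X])             # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

y_hat = A @ coef
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print("intercept and coefficients:", np.round(coef, 4))
print(f"R^2 = {r2:.3f}")

# Short-term forecast for a hypothetical county
new_county = np.array([1.0, 28_000, 350, 120_000])
print(f"predicted staffing: {new_county @ coef:.0f} medical personnel")
```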

  5. From bench to patient: model systems in drug discovery

    PubMed Central

    Breyer, Matthew D.; Look, A. Thomas; Cifra, Alessandra

    2015-01-01

    ABSTRACT Model systems, including laboratory animals, microorganisms, and cell- and tissue-based systems, are central to the discovery and development of new and better drugs for the treatment of human disease. In this issue, Disease Models & Mechanisms launches a Special Collection that illustrates the contribution of model systems to drug discovery and optimisation across multiple disease areas. This collection includes reviews, Editorials, interviews with leading scientists with a foot in both academia and industry, and original research articles reporting new and important insights into disease therapeutics. This Editorial provides a summary of the collection's current contents, highlighting the impact of multiple model systems in moving new discoveries from the laboratory bench to the patients' bedsides. PMID:26438689

  6. Design and modeling of sustainable bioethanol supply chain by minimizing the total ecological footprint in life cycle perspective.

    PubMed

    Ren, Jingzheng; Manzardo, Alessandro; Toniolo, Sara; Scipioni, Antonio; Tan, Shiyu; Dong, Lichun; Gao, Suzhao

    2013-10-01

    The purpose of this paper is to develop a model for designing the most sustainable bioethanol supply chain. Taking into consideration the possibility of multiple feedstocks, multiple transportation modes, multiple alternative technologies, multiple transport patterns and multiple waste disposal manners in bioethanol systems, this study developed a model for designing the most sustainable bioethanol supply chain by minimizing the total ecological footprint under some prerequisite constraints, including satisfying the goals of the stakeholders, the limitations of resources and energy, the capacity of warehouses, the market demand and some technological constraints. An illustrative case of a multiple-feedstock bioethanol system has been studied with the proposed method, and a global best solution, for which the total ecological footprint is minimal, has been obtained. Copyright © 2013 Elsevier Ltd. All rights reserved.
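
    A toy linear program in the spirit of the footprint-minimising design problem described above, using scipy.optimize.linprog; the feedstocks, footprint coefficients, demand, and capacity limits are all invented for illustration.

```python
from scipy.optimize import linprog

# Toy version of the footprint-minimising design problem (all numbers invented):
# choose how much ethanol (kt) to produce from two feedstocks, x = [corn, straw],
# minimising the total ecological footprint subject to demand and resource limits.
footprint = [2.5, 1.1]        # gha per kt of ethanol, by feedstock (hypothetical)

# Demand: total production must meet 100 kt  ->  -(x0 + x1) <= -100
A_ub = [[-1.0, -1.0],
        [ 1.0,  0.0],         # corn-based capacity limit
        [ 0.0,  1.0]]         # straw collection limit
b_ub = [-100.0, 70.0, 60.0]

res = linprog(c=footprint, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print("optimal production plan (kt):", res.x)
print("minimum total footprint (gha):", res.fun)
```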

  7. A multiple deficit model of reading disability and attention-deficit/hyperactivity disorder: searching for shared cognitive deficits.

    PubMed

    McGrath, Lauren M; Pennington, Bruce F; Shanahan, Michelle A; Santerre-Lemmon, Laura E; Barnard, Holly D; Willcutt, Erik G; Defries, John C; Olson, Richard K

    2011-05-01

    This study tests a multiple cognitive deficit model of reading disability (RD), attention-deficit/hyperactivity disorder (ADHD), and their comorbidity. A structural equation model (SEM) of multiple cognitive risk factors and symptom outcome variables was constructed. The model included phonological awareness as a unique predictor of RD and response inhibition as a unique predictor of ADHD. Processing speed, naming speed, and verbal working memory were modeled as potential shared cognitive deficits. Model fit indices from the SEM indicated satisfactory fit. Closer inspection of the path weights revealed that processing speed was the only cognitive variable with significant unique relationships to RD and ADHD dimensions, particularly inattention. Moreover, the significant correlation between reading and inattention was reduced to non-significance when processing speed was included in the model, suggesting that processing speed primarily accounted for the phenotypic correlation (or comorbidity) between reading and inattention. This study illustrates the power of a multiple deficit approach to complex developmental disorders and psychopathologies, particularly for exploring comorbidities. The theoretical role of processing speed in the developmental pathways of RD and ADHD and directions for future research are discussed. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.

  8. Multiple imputation of covariates by fully conditional specification: Accommodating the substantive model

    PubMed Central

    Seaman, Shaun R; White, Ian R; Carpenter, James R

    2015-01-01

    Missing covariate data commonly occur in epidemiological and clinical research, and are often dealt with using multiple imputation. Imputation of partially observed covariates is complicated if the substantive model is non-linear (e.g. Cox proportional hazards model), or contains non-linear (e.g. squared) or interaction terms, and standard software implementations of multiple imputation may impute covariates from models that are incompatible with such substantive models. We show how imputation by fully conditional specification, a popular approach for performing multiple imputation, can be modified so that covariates are imputed from models which are compatible with the substantive model. We investigate through simulation the performance of this proposal, and compare it with existing approaches. Simulation results suggest our proposal gives consistent estimates for a range of common substantive models, including models which contain non-linear covariate effects or interactions, provided data are missing at random and the assumed imputation models are correctly specified and mutually compatible. Stata software implementing the approach is freely available. PMID:24525487
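
    A bare-bones sketch of the fully conditional specification (chained equations) loop that the proposal above modifies; it shows only the cycle-over-conditional-models structure on simulated data and does not reproduce the substantive-model-compatible step or the authors' Stata implementation.

```python
import numpy as np

# Bare-bones fully conditional specification (chained equations) sketch for two
# continuous covariates with missing values; only the iterate-over-conditionals
# structure is shown, not the compatible-with-the-substantive-model modification.
rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(scale=0.8, size=n)
X = np.column_stack([x1, x2])
miss = rng.random(X.shape) < 0.2                 # 20% missing completely at random
X_obs = np.where(miss, np.nan, X)

def impute_once(X_obs, n_iter=10):
    # start from mean imputation, then cycle over univariate conditional models
    X_imp = np.where(np.isnan(X_obs), np.nanmean(X_obs, axis=0), X_obs)
    for _ in range(n_iter):
        for j in (0, 1):
            other = 1 - j
            obs = ~np.isnan(X_obs[:, j])
            A = np.column_stack([np.ones(obs.sum()), X_imp[obs, other]])
            coef, *_ = np.linalg.lstsq(A, X_obs[obs, j], rcond=None)
            resid_sd = np.std(X_obs[obs, j] - A @ coef)
            mis = ~obs
            mean = coef[0] + coef[1] * X_imp[mis, other]
            X_imp[mis, j] = mean + rng.normal(scale=resid_sd, size=mis.sum())
    return X_imp

imputations = [impute_once(X_obs) for _ in range(5)]   # m = 5 completed data sets
print("imputed mean of x2 across 5 imputations:",
      np.round([imp[:, 1].mean() for imp in imputations], 3))
```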

  9. Controlled Ecological Life Support System (CELSS) modeling

    NASA Technical Reports Server (NTRS)

    Drysdale, Alan; Thomas, Mark; Fresa, Mark; Wheeler, Ray

    1992-01-01

    Attention is given to CELSS, a critical technology for the Space Exploration Initiative. OCAM (object-oriented CELSS analysis and modeling) models carbon, hydrogen, and oxygen recycling. Multiple crops and plant types can be simulated. Resource recovery options from inedible biomass include leaching, enzyme treatment, aerobic digestion, and mushroom and fish growth. The benefit of using many small crops overlapping in time, instead of a single large crop, is demonstrated. Unanticipated results include startup transients which reduce the benefit of multiple small crops. The relative contributions of mass, energy, and manpower to system cost are analyzed in order to determine appropriate research directions.

  10. Conjoint Analysis: A Study of the Effects of Using Person Variables.

    ERIC Educational Resources Information Center

    Fraas, John W.; Newman, Isadore

    Three statistical techniques--conjoint analysis, a multiple linear regression model, and a multiple linear regression model with a surrogate person variable--were used to estimate the relative importance of five university attributes for students in the process of selecting a college. The five attributes include: availability and variety of…

  11. Using FUN3D for Aeroelastic, Sonic Boom, and AeroPropulsoServoElastic (APSE) Analyses of a Supersonic Configuration

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Connolly, Joseph; Kopasakis, George

    2016-01-01

    An overview of recent applications of the FUN3D CFD code to computational aeroelastic, sonic boom, and aeropropulsoservoelasticity (APSE) analyses of a low-boom supersonic configuration is presented. The overview includes details of the computational models developed, including multiple unstructured CFD grids suitable for aeroelastic and sonic boom analyses. In addition, aeroelastic Reduced-Order Models (ROMs) are generated and used to rapidly compute the aeroelastic response and flutter boundaries at multiple flight conditions.

  12. Sources of Variability in Physical Activity Among Inactive People with Multiple Sclerosis.

    PubMed

    Uszynski, Marcin K; Herring, Matthew P; Casey, Blathin; Hayes, Sara; Gallagher, Stephen; Motl, Robert W; Coote, Susan

    2018-04-01

    Evidence supports that physical activity (PA) improves symptoms of multiple sclerosis (MS). Although application of principles from Social Cognitive Theory (SCT) may facilitate positive changes in PA behaviour among people with multiple sclerosis (pwMS), the constructs often explain limited variance in PA. This study investigated the extent to which MS symptoms, including fatigue, depression, and walking limitations, combined with the SCT constructs, explained more variance in PA than SCT constructs alone among pwMS. Baseline data, including objectively assessed PA, exercise self-efficacy, goal setting, outcome expectations, 6-min walk test, fatigue and depression, from 65 participants of the Step It Up randomized controlled trial completed in Ireland (2016), were included. Multiple regression models quantified variance explained in PA and independent associations of (1) SCT constructs, (2) symptoms and (3) SCT constructs and symptoms. Model 1 included exercise self-efficacy, exercise goal setting and multidimensional outcome expectations for exercise and explained ~14% of the variance in PA (R² = 0.144, p < 0.05). Model 2 included walking limitations, fatigue and depression and explained ~20% of the variance in PA (R² = 0.196, p < 0.01). Model 3 combined Models 1 and 2, and the explained variance increased to ~29% (R² = 0.288, p < 0.01). In Model 3, exercise self-efficacy (β = 0.30, p < 0.05), walking limitations (β = 0.32, p < 0.01), fatigue (β = -0.41, p < 0.01) and depression (β = 0.34, p < 0.05) were significantly and independently associated with PA. Findings suggest that relevant MS symptoms improved by PA, including fatigue, depression and walking limitations, together with SCT constructs explained more variance in PA than SCT constructs alone, providing support for targeting both SCT constructs and these symptoms in the multifactorial promotion of PA among pwMS.
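
    A short sketch of the hierarchical R-squared comparison reported above, on simulated data; the sample size mirrors the abstract but all values and effect sizes are invented.

```python
import numpy as np

# Sketch of the hierarchical-regression comparison: three nested-style model
# specifications fitted by OLS and compared on R^2 (n = 65 invented records;
# variable names mirror the abstract, values do not).
rng = np.random.default_rng(3)
n = 65
sct = rng.normal(size=(n, 3))        # self-efficacy, goal setting, outcome expectations
sym = rng.normal(size=(n, 3))        # walking limitations, fatigue, depression
pa = (0.3 * sct[:, 0] + 0.3 * sym[:, 0] - 0.4 * sym[:, 1] + 0.3 * sym[:, 2]
      + rng.normal(size=n))          # objectively assessed physical activity

def r_squared(X, y):
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

print(f"Model 1 (SCT only):        R^2 = {r_squared(sct, pa):.3f}")
print(f"Model 2 (symptoms only):   R^2 = {r_squared(sym, pa):.3f}")
print(f"Model 3 (SCT + symptoms):  R^2 = {r_squared(np.column_stack([sct, sym]), pa):.3f}")
```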

  13. Estimating comparable English healthcare costs for multiple diseases and unrelated future costs for use in health and public health economic modelling.

    PubMed

    Briggs, Adam D M; Scarborough, Peter; Wolstenholme, Jane

    2018-01-01

    Healthcare interventions, and particularly those in public health, may affect multiple diseases and significantly prolong life. No consensus currently exists for how to estimate comparable healthcare costs across multiple diseases for use in health and public health cost-effectiveness models. We aim to describe a method for estimating comparable disease-specific English healthcare costs as well as future healthcare costs from diseases unrelated to those modelled. We use routine national datasets including programme budgeting data and cost curves from NHS England to estimate annual per person costs for diseases included in the PRIMEtime model as well as age- and sex-specific costs due to unrelated diseases. The 2013/14 annual cost to NHS England per prevalent case varied between £3,074 for pancreatic cancer and £314 for liver disease. Costs due to unrelated diseases increase with age except for a secondary peak at 30-34 years for women reflecting maternity resource use. The methodology described allows health and public health economic modellers to estimate comparable English healthcare costs for multiple diseases. This facilitates the direct comparison of different health and public health interventions enabling better decision making.

  14. Pursuing the method of multiple working hypotheses to understand differences in process-based snow models

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Essery, Richard

    2017-04-01

    When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., what approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., is there sufficient data to force and evaluate models). Consequently the research community, comprising modelers of diverse background, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the different models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template, and include a wide variety of process parameterizations and spatial configurations that are used in existing models. Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented applying multiple hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.

  15. From bench to patient: model systems in drug discovery.

    PubMed

    Breyer, Matthew D; Look, A Thomas; Cifra, Alessandra

    2015-10-01

    Model systems, including laboratory animals, microorganisms, and cell- and tissue-based systems, are central to the discovery and development of new and better drugs for the treatment of human disease. In this issue, Disease Models & Mechanisms launches a Special Collection that illustrates the contribution of model systems to drug discovery and optimisation across multiple disease areas. This collection includes reviews, Editorials, interviews with leading scientists with a foot in both academia and industry, and original research articles reporting new and important insights into disease therapeutics. This Editorial provides a summary of the collection's current contents, highlighting the impact of multiple model systems in moving new discoveries from the laboratory bench to the patients' bedsides. © 2015. Published by The Company of Biologists Ltd.

  16. An Advanced Multiple Alternatives Modeling Formulation for Determining Graduated Fiscal Support Strategies for Operational and Planned Educational Programs.

    ERIC Educational Resources Information Center

    Wholeben, Brent Edward

    A rationale is presented for viewing the decision-making process inherent in determining budget reductions for educational programs as most effectively modeled by a graduated funding approach. The major tenets of the graduated budget reduction approach to educational fiscal policy include the development of multiple alternative reduction plans, or…

  17. Model of visual contrast gain control and pattern masking

    NASA Technical Reports Server (NTRS)

    Watson, A. B.; Solomon, J. A.

    1997-01-01

    We have implemented a model of contrast gain control in human vision that incorporates a number of key features, including a contrast sensitivity function, multiple oriented bandpass channels, accelerating nonlinearities, and a divisive inhibitory gain control pool. The parameters of this model have been optimized through a fit to the recent data that describe masking of a Gabor function by cosine and Gabor masks [J. M. Foley, "Human luminance pattern mechanisms: masking experiments require a new model," J. Opt. Soc. Am. A 11, 1710 (1994)]. The model achieves a good fit to the data. We also demonstrate how the concept of recruitment may accommodate a variant of this model in which excitatory and inhibitory paths have a common accelerating nonlinearity, but which include multiple channels tuned to different levels of contrast.
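
    A minimal response-function sketch in the spirit of the divisive gain-control model described above: excitation raised to an accelerating power, divided by a semi-saturation constant plus a pooled inhibitory term; the exponents and constants are illustrative, not the fitted parameters from the paper.

```python
import numpy as np

# Minimal divisive gain-control response (illustrative parameters only): target
# excitation is raised to an accelerating power and divided by a semi-saturation
# constant plus an inhibitory pool collecting target and mask components.
p, q, sigma = 2.4, 2.0, 0.05

def channel_response(target_contrast, mask_contrasts, sensitivity=1.0):
    excitation = (sensitivity * target_contrast) ** p
    components = np.append(np.atleast_1d(mask_contrasts), target_contrast)
    pool = np.sum((sensitivity * components) ** q)     # divisive inhibitory pool
    return excitation / (sigma ** q + pool)

# Masking demo: the same 5% target evokes a smaller response as mask contrast
# grows, which is what raises detection thresholds in pattern-masking data.
for mask in (0.0, 0.05, 0.2, 0.5):
    print(f"mask contrast {mask:4.2f} -> response {channel_response(0.05, mask):8.4f}")
```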

  18. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    PubMed

    Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago

    2016-01-01

    Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process. Simple model averaging is used to integrate across the results and produce a single assessment that considers the multiple sources of uncertainty.
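
    A sketch of the final integration step only: simple model averaging of spawning stock biomass estimates across a suite of assessment runs, with the across-model spread reported as multi-model uncertainty; all numbers are invented and the conditioning and fitting steps are not shown.

```python
import numpy as np

# Combine spawning stock biomass (SSB) estimates from a suite of assessment
# model runs by simple (equal-weight) model averaging, and summarise the spread
# across models as multi-model uncertainty. All numbers are invented.
rng = np.random.default_rng(4)
n_models, n_years = 12, 5

# Each row: one model variant (different natural mortality, growth,
# catchability or stock-recruitment assumptions)
ssb = rng.normal(loc=40_000, scale=5_000, size=(n_models, n_years))
weights = np.full(n_models, 1.0 / n_models)

ssb_avg = weights @ ssb
ssb_lo, ssb_hi = np.percentile(ssb, [5, 95], axis=0)
for year, (avg, lo, hi) in enumerate(zip(ssb_avg, ssb_lo, ssb_hi), start=1):
    print(f"year {year}: SSB ~ {avg:8.0f} t  (5-95% across models: {lo:7.0f} to {hi:7.0f})")
```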

  19. NACP Synthesis: Evaluating modeled carbon state and flux variables against multiple observational constraints (Invited)

    NASA Astrophysics Data System (ADS)

    Thornton, P. E.; Nacp Site Synthesis Participants

    2010-12-01

    The North American Carbon Program (NACP) synthesis effort includes an extensive intercomparison of modeled and observed ecosystem states and fluxes performed with multiple models across multiple sites. The participating models span a range of complexity and intended application, while the participating sites cover a broad range of natural and managed ecosystems in North America, from the subtropics to arctic tundra, and coastal to interior climates. A unique characteristic of this collaborative effort is that multiple independent observations are available at all sites: fluxes are measured with the eddy covariance technique, and standard biometric and field sampling methods provide estimates of standing stock and annual production in multiple categories. In addition, multiple modeling approaches are employed to make predictions at each site, varying, for example, in the use of diagnostic vs. prognostic leaf area index. Given multiple independent observational constraints and multiple classes of model, we evaluate the internal consistency of observations at each site, and use this information to extend previously derived estimates of uncertainty in the flux observations. Model results are then compared with all available observations and models are ranked according to their consistency with each type of observation (high frequency flux measurement, carbon stock, annual production). We demonstrate a range of internal consistency across the sites, and show that some models which perform well against one observational metric perform poorly against others. We use this analysis to construct a hypothesis for combining eddy covariance, biometrics, and other standard physiological and ecological measurements which, as data collection proceeded over several years, would present an increasingly challenging target for next generation models.

  20. Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.

    PubMed

    Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U

    2015-05-01

    The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English, and assessed relevant clinical endpoints, and summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
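
    A minimal three-state Markov cohort sketch (progression-free, progressed, dead) of the state-transition approach mentioned above; the transition probabilities, utilities, and horizon are invented for illustration, not taken from any of the reviewed studies.

```python
import numpy as np

# Three-state Markov cohort sketch of a state-transition decision-analytic model.
# Transition probabilities, QALY weights, and the 20-cycle horizon are invented.
P = np.array([[0.85, 0.10, 0.05],    # from progression-free
              [0.00, 0.80, 0.20],    # from progressed
              [0.00, 0.00, 1.00]])   # dead is absorbing
utility = np.array([0.80, 0.55, 0.0])   # QALY weight per cycle in each state

cohort = np.array([1.0, 0.0, 0.0])       # everyone starts progression-free
qalys, life_years = 0.0, 0.0
for cycle in range(20):                  # e.g. 20 one-year cycles
    qalys += cohort @ utility
    life_years += cohort[:2].sum()
    cohort = cohort @ P                  # advance the cohort one cycle

print(f"undiscounted life expectancy: {life_years:.2f} years")
print(f"undiscounted QALYs:           {qalys:.2f}")
```

    Running the same loop with a second transition matrix and subtracting costs/QALYs is the usual way such a model compares two treatment strategies; discounting and half-cycle correction are omitted here for brevity.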

  1. The relational database model and multiple multicenter clinical trials.

    PubMed

    Blumenstein, B A

    1989-12-01

    The Southwest Oncology Group (SWOG) chose to use a relational database management system (RDBMS) for the management of data from multiple clinical trials because of the underlying relational model's inherent flexibility and the natural way multiple entity types (patients, studies, and participants) can be accommodated. The tradeoffs to using the relational model as compared to using the hierarchical model include added computing cycles due to deferred data linkages and added procedural complexity due to the necessity of implementing protections against referential integrity violations. The SWOG uses its RDBMS as a platform on which to build data operations software. This data operations software, which is written in a compiled computer language, allows multiple users to simultaneously update the database and is interactive with respect to the detection of conditions requiring action and the presentation of options for dealing with those conditions. The relational model facilitates the development and maintenance of data operations software.

  2. The North American Regional Climate Change Assessment Program (NARCCAP): Status and results

    NASA Astrophysics Data System (ADS)

    Arritt, R.

    2009-04-01

    NARCCAP is an international program that is generating projections of climate change for the U.S., Canada, and northern Mexico at decision-relevant regional scales. NARCCAP uses multiple limited-area regional climate models (RCMs) nested within multiple atmosphere-ocean general circulation models (AOGCMs). The use of multiple regional and global models allows us to investigate the uncertainty in model responses to future emissions (here, the A2 SRES scenario). The project also includes global time-slice experiments at the same discretization (50 km) using the GFDL atmospheric model (AM2.1) and the NCAR atmospheric model (CAM3). Phase I of the experiment uses the regional models nested within reanalysis in order to establish uncertainty attributable to the RCMs themselves. Phase II of the project then nests the RCMs within results from the current and future runs of the AOGCMs to explore the cascade of uncertainty from the global to the regional models. Phase I has been completed and the results to be shown include findings that spectral nudging is beneficial in some regions but not in others. Phase II is nearing completion and some preliminary results will be shown.

  3. Modeling a Single SEP Event from Multiple Vantage Points Using the iPATH Model

    NASA Astrophysics Data System (ADS)

    Hu, Junxiang; Li, Gang; Fu, Shuai; Zank, Gary; Ao, Xianzhi

    2018-02-01

    Using the recently extended 2D improved Particle Acceleration and Transport in the Heliosphere (iPATH) model, we model an example gradual solar energetic particle event as observed at multiple locations. Protons and ions that are energized via the diffusive shock acceleration mechanism are followed at a 2D coronal mass ejection-driven shock where the shock geometry varies across the shock front. The subsequent transport of energetic particles, including cross-field diffusion, is modeled by a Monte Carlo code that is based on a stochastic differential equation method. Time intensity profiles and particle spectra at multiple locations and different radial distances, separated in longitudes, are presented. The results shown here are relevant to the upcoming Parker Solar Probe mission.
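
    A toy Monte Carlo transport sketch loosely in the spirit of the SDE method mentioned above, reduced to 1-D radial advection plus constant-coefficient diffusion; it omits pitch-angle transport, adiabatic cooling, and cross-field diffusion, and every parameter is an illustrative assumption rather than an iPATH setting.

```python
import numpy as np

# 1-D radial advection-diffusion SDE for energetic protons released near the Sun,
# integrated with Euler-Maruyama steps; a drastic simplification for illustration.
AU = 1.496e8                 # km
V_sw = 400.0                 # solar wind speed, km/s
kappa = 1.0e8                # radial diffusion coefficient, km^2/s (assumed constant)
dt = 60.0                    # s
n_particles, n_steps = 5_000, 5_000

rng = np.random.default_rng(5)
r = np.full(n_particles, 0.05 * AU)              # injected at the shock
for _ in range(n_steps):
    dW = rng.normal(scale=np.sqrt(dt), size=n_particles)
    r += V_sw * dt + np.sqrt(2.0 * kappa) * dW   # advection + diffusion step
    r = np.abs(r - 0.05 * AU) + 0.05 * AU        # reflect at the inner boundary

# Crude "observer" count near 1 AU after ~3.5 days of simulated transport
near_1AU = np.abs(r - AU) < 0.05 * AU
print(f"fraction of particles within 0.05 AU of 1 AU: {near_1AU.mean():.3f}")
```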

  4. Multiple commodities in statistical microeconomics: Model and market

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Yu, Miao; Du, Xin

    2016-11-01

    A statistical generalization of microeconomics has been made in Baaquie (2013). In Baaquie et al. (2015), the market behavior of single commodities was analyzed and it was shown that market data provides strong support for the statistical microeconomic description of commodity prices. The case of multiple commodities is studied and a parsimonious generalization of the single commodity model is made for the multiple commodities case. Market data shows that the generalization can accurately model the simultaneous correlation functions of up to four commodities. To accurately model five or more commodities, further terms have to be included in the model. This study shows that the statistical microeconomics approach is a comprehensive and complete formulation of microeconomics that is independent of the mainstream formulation of microeconomics.

  5. A quantitative model of application slow-down in multi-resource shared systems

    DOE PAGES

    Lim, Seung-Hwan; Kim, Youngjae

    2016-12-26

    Scheduling multiple jobs onto a platform enhances system utilization by sharing resources. The benefits from higher resource utilization include reduced cost to construct, operate, and maintain a system, which often includes energy consumption. Maximizing these benefits comes at a price: resource contention among jobs increases job completion time. In this study, we analyze slow-downs of jobs due to contention for multiple resources in a system, referred to as the dilation factor. We observe that multiple-resource contention creates non-linear dilation factors of jobs. From this observation, we establish a general quantitative model for dilation factors of jobs in multi-resource systems. A job is characterized by vector-valued loading statistics and dilation factors of a job set are given by a quadratic function of their loading vectors. We demonstrate how to systematically characterize a job, maintain the data structure to calculate the dilation factor (loading matrix), and calculate the dilation factor of each job. We validate the accuracy of the model with multiple processes running on a native Linux server, virtualized servers, and with multiple MapReduce workloads co-scheduled in a cluster. Evaluation with measured data shows that the D-factor model has an error margin of less than 16%. We extended the D-factor model to capture the slow-down of applications when multiple identical resources exist such as multi-core environments and multi-disk environments. Finally, validation results of the extended D-factor model with HPC checkpoint applications on parallel file systems show that D-factor accurately captures the slow-down of concurrent applications in such environments.
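
    A small sketch of the "quadratic in loading vectors" idea as read from this abstract; the exact functional form (1 + l_i^T M l_j), the resource list, and the matrix values are assumptions for illustration, not the paper's calibrated D-factor model.

```python
import numpy as np

# Illustrative reading of the abstract: each job i has a loading vector l_i over
# resources (CPU, disk, network), and the slow-down of job i when co-scheduled
# with job j is modelled as 1 + l_i^T M l_j, where M is a calibrated interference
# ("loading") matrix. The form and numbers below are assumptions, not the paper's.
resources = ["cpu", "disk", "net"]
M = np.array([[0.6, 0.2, 0.1],
              [0.2, 0.9, 0.1],
              [0.1, 0.1, 0.4]])                    # hypothetical calibration

jobs = {
    "cpu-bound":  np.array([0.9, 0.1, 0.1]),
    "disk-bound": np.array([0.1, 0.8, 0.2]),
}

def dilation(i, j):
    """Predicted completion-time dilation of job i when co-run with job j."""
    return 1.0 + jobs[i] @ M @ jobs[j]

for i in jobs:
    for j in jobs:
        if i != j:
            print(f"{i} co-run with {j}: D-factor ~ {dilation(i, j):.2f}")
```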

  6. A quantitative model of application slow-down in multi-resource shared systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Kim, Youngjae

    Scheduling multiple jobs onto a platform enhances system utilization by sharing resources. The benefits from higher resource utilization include reduced cost to construct, operate, and maintain a system, which often includes energy consumption. Maximizing these benefits comes at a price: resource contention among jobs increases job completion time. In this study, we analyze slow-downs of jobs due to contention for multiple resources in a system, referred to as the dilation factor. We observe that multiple-resource contention creates non-linear dilation factors of jobs. From this observation, we establish a general quantitative model for dilation factors of jobs in multi-resource systems. A job is characterized by vector-valued loading statistics and dilation factors of a job set are given by a quadratic function of their loading vectors. We demonstrate how to systematically characterize a job, maintain the data structure to calculate the dilation factor (loading matrix), and calculate the dilation factor of each job. We validate the accuracy of the model with multiple processes running on a native Linux server, virtualized servers, and with multiple MapReduce workloads co-scheduled in a cluster. Evaluation with measured data shows that the D-factor model has an error margin of less than 16%. We extended the D-factor model to capture the slow-down of applications when multiple identical resources exist such as multi-core environments and multi-disk environments. Finally, validation results of the extended D-factor model with HPC checkpoint applications on parallel file systems show that D-factor accurately captures the slow-down of concurrent applications in such environments.

  7. Numerical implementation of multiple peeling theory and its application to spider web anchorages.

    PubMed

    Brely, Lucas; Bosia, Federico; Pugno, Nicola M

    2015-02-06

    Adhesion of spider web anchorages has been studied in recent years, including the specific functionalities achieved through different architectures. To better understand the delamination mechanisms of these and other biological or artificial fibrillar adhesives, and how their adhesion can be optimized, we develop a novel numerical model to simulate the multiple peeling of structures with arbitrary branching and adhesion angles, including complex architectures. The numerical model is based on a recently developed multiple peeling theory, which extends the energy-based single peeling theory of Kendall, and can be applied to arbitrarily complex structures. In particular, we numerically show that a multiple peeling problem can be treated as the superposition of single peeling configurations even for complex structures. Finally, we apply the developed numerical approach to study spider web anchorages, showing how their function is achieved through optimal geometrical configurations.
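
    A heavily simplified sketch of the ideas above: Kendall's single-peel energy balance solved for the peel force, and a symmetric two-branch anchor treated as a superposition of single-peel configurations; material numbers are illustrative and the paper's general numerical model is not reproduced.

```python
import numpy as np

# Kendall single-peel force and a symmetric two-branch superposition; the paper's
# numerical model handles arbitrary branching and delamination explicitly, which
# this sketch does not. Material values are illustrative.
E = 1e9        # elastic modulus of the tape/thread, Pa
t = 1e-6       # tape thickness, m
gamma = 0.1    # adhesion energy per unit area, J/m^2

def kendall_peel_force(theta):
    """Peel force per unit width (N/m) from Kendall's energy balance:
       F^2/(2*E*t) + F*(1 - cos(theta)) - gamma = 0, taking the positive root."""
    a = 1.0 - np.cos(theta)
    return E * t * (-a + np.sqrt(a ** 2 + 2.0 * gamma / (E * t)))

def symmetric_double_peel_pulloff(theta):
    """Vertical pull-off force per unit width for two symmetric branches at +/- theta,
       treated as superposed single-peel configurations."""
    return 2.0 * kendall_peel_force(theta) * np.sin(theta)

for deg in (10, 30, 50, 70, 90):
    theta = np.radians(deg)
    print(f"theta = {deg:2d} deg: pull-off ~ {symmetric_double_peel_pulloff(theta):8.3f} N/m")
```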

  8. Numerical implementation of multiple peeling theory and its application to spider web anchorages

    PubMed Central

    Brely, Lucas; Bosia, Federico; Pugno, Nicola M.

    2015-01-01

    Adhesion of spider web anchorages has been studied in recent years, including the specific functionalities achieved through different architectures. To better understand the delamination mechanisms of these and other biological or artificial fibrillar adhesives, and how their adhesion can be optimized, we develop a novel numerical model to simulate the multiple peeling of structures with arbitrary branching and adhesion angles, including complex architectures. The numerical model is based on a recently developed multiple peeling theory, which extends the energy-based single peeling theory of Kendall, and can be applied to arbitrarily complex structures. In particular, we numerically show that a multiple peeling problem can be treated as the superposition of single peeling configurations even for complex structures. Finally, we apply the developed numerical approach to study spider web anchorages, showing how their function is achieved through optimal geometrical configurations. PMID:25657835

  9. High resolution crustal image of South California Continental Borderland: Reverse time imaging including multiples

    NASA Astrophysics Data System (ADS)

    Bian, A.; Gantela, C.

    2014-12-01

    Strong multiples were observed in marine seismic data of the Los Angeles Regional Seismic Experiment (LARSE). It is crucial to eliminate these multiples in conventional ray-based or one-way wave-equation based depth image methods. Because multiples contain information about the target zone along their travel paths, it is possible to use them as signal to improve the illumination coverage and thus enhance the image quality of structural boundaries. Reverse time migration including multiples is a two-way wave-equation based prestack depth image method that uses both primaries and multiples to map structural boundaries. Several factors, including the source wavelet, velocity model, background noise, data acquisition geometry and preprocessing workflow, may influence the quality of the image. The source wavelet is estimated from the direct arrival of the marine seismic data. The migration velocity model is derived from an integrated model building workflow, and the sharp velocity interfaces near the sea bottom need to be preserved in order to generate multiples in the forward and backward propagation steps. The strong-amplitude, low-frequency marine background noise needs to be removed before the final imaging process. High resolution reverse time image sections of LARSE Line 1 and Line 2 show five interfaces: the depth of the sea bottom, the base of the sedimentary basins, the top of the Catalina Schist, a deep layer and a possible pluton boundary. The Catalina Schist shows highs in the San Clemente Ridge, Emery Knoll and Catalina Ridge, under Catalina Basin on both lines, and a minor high under Avalon Knoll. The high of the anticlinal fold in Line 1 is under the north edge of Emery Knoll and under the San Clemente fault zone. An area devoid of any reflection features is interpreted as the sides of an igneous plume.

  10. A Cognitive System Model for Human/Automation Dynamics in Airspace Management

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Pisanich, Gregory; Lebacqz, J. Victor (Technical Monitor)

    1997-01-01

    NASA has initiated a significant thrust of research and development focused on providing the flight crew and air traffic managers with automation aids to increase capacity in en route and terminal area operations through the use of flexible, more fuel-efficient routing, while improving the level of safety in commercial carrier operations. In that system development, definition of cognitive requirements for integrated multi-operator dynamic aiding systems is fundamental. In order to support that cognitive function definition, we have extended the Man Machine Integrated Design and Analysis System (MIDAS) to include representation of multiple cognitive agents (both human operators and intelligent aiding systems) operating aircraft, airline operations centers, and air traffic control centers in the evolving airspace. The demands of this application require representation of many intelligent agents sharing world-models and coordinating action/intention with cooperative scheduling of goals and actions in a potentially unpredictable world of operations. The MIDAS operator models have undergone significant development in order to understand the requirements for operator aiding and the impact of that aiding in the complex, nondeterminate system of national airspace operations. The operator model's structure has been modified to include attention functions, action priority, and situation assessment. The cognitive function model has been expanded to include working memory operations, including retrieval from long-term store, interference, visual-motor and verbal articulatory loop functions, and time-based losses. The operator's activity structures have been developed to include prioritization and interruption of multiple parallel activities among multiple operators, to provide for anticipation (knowledge of the intention and action of remote operators), and to respond to failures of the system and of other operators in the system in situation-specific paradigms. The model's internal representation has been modified so that multiple, autonomous sets of equipment function in a scenario as single equipment sets do now. In order to support the analysis requirements with multiple items of equipment, it is necessary for equipment to access the state of other equipment objects at initialization time (a radar object may need to access the position and speed of aircraft in its area, for example) and as a function of perception and sensor system interaction. The model has been improved to include multiple world-states as a function of equipment and operator interaction. The model has been used to predict the impact of warning and alert zones in aircraft operation and, more critically, the interaction of flight-deck-based warning mechanisms and air traffic controller action in response to ground-based conflict prediction and alerting systems. In this operation, two operating systems provide alerting to two autonomous but linked sets of operators, whose views of the system and whose dynamics in response are radically different. System stability and operator actions were predicted using the MIDAS model.

  11. Development of Physics and Control of Multiple Forcing Mechanisms for the Alaska Tsunami Forecast Model

    NASA Astrophysics Data System (ADS)

    Bahng, B.; Whitmore, P.; Macpherson, K. A.; Knight, W. R.

    2016-12-01

    The Alaska Tsunami Forecast Model (ATFM) is a numerical model used to forecast propagation and inundation of tsunamis generated by earthquakes or other mechanisms in the Pacific Ocean, Atlantic Ocean, or Gulf of Mexico. At the U.S. National Tsunami Warning Center (NTWC), the model has been used mainly for tsunami pre-computation due to earthquakes. That is, results for hundreds of hypothetical events are computed before alerts, and are accessed and calibrated with observations during tsunamis to immediately produce forecasts. The model has also been used for tsunami hindcasting due to submarine landslides and due to atmospheric pressure jumps, but in a very case-specific and somewhat limited manner. ATFM uses the non-linear, depth-averaged, shallow-water equations of motion with multiply nested grids in two-way communication between the domains of each parent-child pair as waves approach coastal waters. The shallow-water wave physics is readily applicable to all of the above tsunamis as well as to tides. Recently, the model has been expanded to include multiple forcing mechanisms in a systematic fashion and to enhance the model physics for non-earthquake events. ATFM is now able to handle multiple source mechanisms, either individually or jointly, which include earthquake, submarine landslide, meteo-tsunami, and tidal forcing. As for earthquakes, the source can be a single unit source or multiple, interacting source blocks, and a horizontal slip contribution can be added to the sea-floor displacement. The model now includes submarine landslide physics, modeling the source either as a rigid slump or as a viscous fluid, with additional shallow-water physics implemented for the viscous case; with rigid slumping, any trajectory can be followed. As for meteo-tsunami, the forcing mechanism is likewise capable of following any trajectory shape, and wind stress physics has been implemented for the meteo-tsunami case where required. As an example of multiple sources, a near-field model of the July 17, 1998 Papua New Guinea tsunami, produced by a combination of earthquake and submarine landslide forcing, is provided.
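
    For orientation, the sketch below (not ATFM code; the depth, grid spacing, and initial sea-surface displacement are hypothetical) advances the linearized 1-D shallow-water equations on a staggered grid, the same class of depth-averaged wave physics that ATFM solves in nonlinear form on nested grids.

      # Minimal sketch (assumed parameters): forward-backward time stepping of
      # the linearized 1-D shallow-water equations on a staggered grid.
      import numpy as np

      g, H = 9.81, 4000.0          # gravity (m/s^2), uniform ocean depth (m)
      dx, dt = 10_000.0, 10.0      # grid spacing (m), time step (s); CFL ~ 0.2
      nx = 400

      eta = np.zeros(nx)           # free-surface elevation at cell centers (m)
      u = np.zeros(nx + 1)         # depth-averaged velocity at cell faces (m/s)
      eta[190:210] = 1.0           # hypothetical initial sea-surface uplift

      for _ in range(3600):        # 10 h of model time
          u[1:-1] -= dt * g * (eta[1:] - eta[:-1]) / dx   # momentum equation
          eta -= dt * H * (u[1:] - u[:-1]) / dx           # continuity equation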

  12. PTSD's factor structure and measurement invariance across subgroups with differing count of trauma types.

    PubMed

    Contractor, Ateka A; Caldas, Stephanie V; Dolan, Megan; Lagdon, Susan; Armour, Chérie

    2018-06-01

    To investigate the effect of the count of traumatizing event (TE) types on post-trauma mental health, several studies have compared posttraumatic stress disorder (PTSD) severity between individuals experiencing one versus multiple TE types. However, the validity of these studies depends on the establishment of measurement invariance of the construct(s) of interest. The current study examined the stability of the optimal PTSD model symptom cluster constructs (assessed by the PTSD Checklist for DSM-5 [PCL-5]) across subgroups experiencing one versus multiple TE types. The sample included university students (n = 556) endorsing at least one TE (Stressful Life Events Screening Questionnaire). Using data from the entire sample, results suggest that the PCL-5-assessed Hybrid Model provided a significantly better fit compared to other models. Results also indicated invariance of factor loadings (metric) and intercepts (scalar) for the PCL-5-assessed Hybrid Model factors across subgroups endorsing one (n = 191) versus multiple TE types (n = 365). Our findings thus support the stability, applicability, and meaningful comparison of the PCL-5-assessed Hybrid Model factor structure (including subscale severity scores) across subgroups experiencing one versus multiple TE types. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Scalability Analysis and Use of Compression at the Goddard DAAC and End-to-End MODIS Transfers

    NASA Technical Reports Server (NTRS)

    Menasce, Daniel A.

    1998-01-01

    The goal of this task is to analyze the performance of single and multiple FTP transfers between SCFs and the Goddard DAAC. We developed an analytic model to compute the performance of FTP sessions as a function of various key parameters, implemented the model as a program called FTP Analyzer, and carried out validations with real data obtained by running single and multiple FTP transfers between GSFC and the Miami SCF. The input parameters to the model include the mix of FTP sessions (scenario) and, for each FTP session, the file size. The network parameters include the round trip time, packet loss rate, the limiting bandwidth of the network connecting the SCF to a DAAC, TCP's basic timeout, TCP's Maximum Segment Size, and TCP's Maximum Receiver Window Size. The modeling approach consisted of modeling TCP's overall throughput, computing TCP's delay per FTP transfer, and then solving a queuing network model that includes the FTP clients and servers.
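
    The abstract does not give the model's equations; as a hedged illustration of the kind of throughput calculation such an analytic FTP model can build on, the sketch below uses the well-known Mathis et al. approximation for steady-state TCP throughput, capped by the receiver-window limit. All parameter values are hypothetical.

      # Minimal sketch (assumed formula and parameters): Mathis approximation
      # for loss-limited TCP throughput, capped by the receiver-window limit.
      def tcp_throughput_bps(mss_bytes, rtt_s, loss_rate, max_window_bytes=None):
          bw = (mss_bytes * 8) * 1.22 / (rtt_s * loss_rate ** 0.5) if loss_rate > 0 else float("inf")
          if max_window_bytes is not None:
              bw = min(bw, max_window_bytes * 8 / rtt_s)   # window-limited regime
          return bw

      def ftp_transfer_time_s(file_bytes, throughput_bps):
          return file_bytes * 8 / throughput_bps

      bw = tcp_throughput_bps(mss_bytes=1460, rtt_s=0.06, loss_rate=1e-3, max_window_bytes=65535)
      print(bw / 1e6, "Mbit/s;", ftp_transfer_time_s(50e6, bw), "s for a 50 MB file")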

  14. Documentation for the MODFLOW 6 framework

    USGS Publications Warehouse

    Hughes, Joseph D.; Langevin, Christian D.; Banta, Edward R.

    2017-08-10

    MODFLOW is a popular open-source groundwater flow model distributed by the U.S. Geological Survey. Growing interest in surface-water and groundwater interactions, local refinement with nested and unstructured grids, karst groundwater flow, solute transport, and saltwater intrusion has led to the development of numerous MODFLOW versions, and there are often incompatibilities between these versions. This report describes a new MODFLOW framework called MODFLOW 6 that is designed to support multiple models and multiple types of models. The framework is written in Fortran using a modular object-oriented design. The primary framework components include the simulation (or main program), Timing Module, Solutions, Models, Exchanges, and Utilities. The first version of the framework focuses on numerical solutions, numerical models, and numerical exchanges. This focus allows multiple numerical models to be tightly coupled at the matrix level.

  15. Model assessment using a multi-metric ranking technique

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

    Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics and identifying adeptness in extreme events, while maintaining simplicity for management decisions; flexibility for operations is also an asset. This work proposes a weighted tally-and-consolidation technique that ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's tau, reliability index, multiplicative gross error, and root-mean-squared differences. Other metrics, such as root-mean-square difference and rank correlation, were also explored but were removed when their information was found to be largely duplicative of the other metrics. While equal weights are applied here, the weights could be altered to favor preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts, instead of distance, along-track, and cross-track errors, is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process but were found useful in an independent context and will be briefly reported.
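
    As a hedged sketch of the weighted tally-and-rank consolidation described above (the metric values, model names, and equal weights below are hypothetical, not taken from the study), each model is ranked per metric and the weighted rank sums are compared; a lower sum indicates better overall performance.

      # Minimal sketch (assumed data): weighted rank consolidation across metrics.
      import pandas as pd

      scores = pd.DataFrame(
          {"abs_error": [1.2, 0.9, 1.1], "bias": [0.3, -0.5, 0.1], "pearson_r": [0.82, 0.79, 0.88]},
          index=["model_A", "model_B", "model_C"],
      )
      lower_is_better = {"abs_error": True, "bias": True, "pearson_r": False}
      weights = {"abs_error": 1.0, "bias": 1.0, "pearson_r": 1.0}   # equal weights

      ranks = pd.DataFrame(index=scores.index)
      for metric, col in scores.items():
          vals = col.abs() if metric == "bias" else col             # rank bias by magnitude
          ranks[metric] = vals.rank(ascending=lower_is_better[metric]) * weights[metric]

      print(ranks.sum(axis=1).sort_values())                        # consolidated ranking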

  16. LLNL-G3Dv3: Global P wave tomography model for improved regional and teleseismic travel time prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmons, N. A.; Myers, S. C.; Johannesson, G.

    We develop a global-scale P wave velocity model (LLNL-G3Dv3) designed to accurately predict seismic travel times at regional and teleseismic distances simultaneously. The model provides a new image of Earth's interior, but the underlying practical purpose of the model is to provide enhanced seismic event location capabilities. The LLNL-G3Dv3 model is based on ~2.8 million P and Pn arrivals that are re-processed using our global multiple-event locator called Bayesloc. We construct LLNL-G3Dv3 within a spherical-tessellation-based framework, allowing for explicit representation of undulating and discontinuous layers, including the crust and transition zone layers. Using a multiscale inversion technique, regional trends as well as fine details are captured where the data allow. LLNL-G3Dv3 exhibits large-scale structures including cratons and superplumes as well as numerous complex details in the upper mantle, including within the transition zone. In particular, the model reveals new details of a vast network of subducted slabs trapped within the transition zone beneath much of Eurasia, including beneath the Tibetan Plateau. We demonstrate the impact of Bayesloc multiple-event location on the resulting tomographic images through comparison with images produced without the benefit of multiple-event constraints (single-event locations). We find that the multiple-event locations allow for better reconciliation of the large set of direct P phases recorded at 0–97° distance and yield a smoother and more continuous image relative to the single-event locations. Travel times predicted from a 3-D model are also found to be strongly influenced by the initial locations of the input data, even when an iterative inversion/relocation technique is employed.

  17. A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base

    NASA Technical Reports Server (NTRS)

    Kautzmann, Frank N., III

    1988-01-01

    Expert systems that support knowledge representation by qualitative modeling techniques experience problems when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are included. A series of prototypes are being developed to demonstrate the feasibility of automating the processes of systems engineering, design and configuration, and diagnosis and fault management. Such a study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems, and it will involve models of description and explanation for each level. This multiple-model feature requires the development of control methods between rule systems and heuristics at a meta-level for each expert system involved in an integrated, larger class of expert systems. The broadest possible category of interacting expert systems is described, along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.

  18. Mindfulness facets, trait emotional intelligence, emotional distress, and multiple health behaviors: A serial two-mediator model.

    PubMed

    Jacobs, Ingo; Wollny, Anna; Sim, Chu-Won; Horsch, Antje

    2016-06-01

    In the present study, we tested a serial mindfulness facets-trait emotional intelligence (TEI)-emotional distress-multiple health behaviors mediation model in a sample of N = 427 German-speaking occupational therapists. The mindfulness facets-TEI-emotional distress section of the mediation model revealed partial mediation for the mindfulness facets Act with awareness (Act/Aware) and Accept without judgment (Accept); inconsistent mediation was found for the Describe facet. The serial two-mediator model included three mediational pathways that may link each of the four mindfulness facets with multiple health behaviors. Eight of the 12 indirect effects reached significance and fully mediated the links from Act/Aware and Describe to multiple health behaviors; partial mediation was found for Accept. The mindfulness facet Observe was most relevant for multiple health behaviors, but its relation was not amenable to mediation. Implications of the findings will be discussed. © 2016 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  19. Analysis of in vitro fertilization data with multiple outcomes using discrete time-to-event analysis

    PubMed Central

    Maity, Arnab; Williams, Paige; Ryan, Louise; Missmer, Stacey; Coull, Brent; Hauser, Russ

    2014-01-01

    In vitro fertilization (IVF) is an increasingly common method of assisted reproductive technology. Because of the careful observation and follow-up required as part of the procedure, IVF studies provide an ideal opportunity to identify and assess clinical and demographic factors, along with environmental exposures, that may impact successful reproduction. A major challenge in analyzing data from IVF studies is handling the complexity and multiplicity of outcomes, which result from multiple opportunities for pregnancy loss within a single IVF cycle as well as from multiple IVF cycles. To date, most evaluations of IVF studies do not make use of the full data because of their complex structure. In this paper, we develop statistical methodology for the analysis of IVF data with multiple cycles and possibly multiple failure types observed for each individual. We develop a general analysis framework based on a generalized linear modeling formulation that allows implementation of various types of models, including shared frailty models, failure-specific frailty models, and transitional models, using standard software. We apply our methodology to data from an IVF study conducted at the Brigham and Women’s Hospital, Massachusetts. We also summarize the performance of our proposed methods based on a simulation study. PMID:24317880
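
    As a hedged illustration of the generalized linear modeling formulation mentioned above (the column names and data are hypothetical, and frailty terms are omitted), a discrete-time hazard model can be fit as a logistic GLM on one record per failure opportunity within each cycle.

      # Minimal sketch (assumed data layout): discrete-time hazard as a logistic GLM.
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # One row per failure opportunity within a cycle; fail = 1 if the attempt
      # ended at that opportunity.
      records = pd.DataFrame({
          "fail":  [0, 1, 0, 0, 1, 0, 1, 0],
          "stage": ["implantation", "clinical"] * 4,
          "cycle": [1, 1, 1, 1, 2, 2, 2, 2],
      })

      fit = smf.glm("fail ~ C(stage) + cycle", data=records,
                    family=sm.families.Binomial()).fit()
      print(fit.params)   # a real analysis would add covariates and frailty terms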

  20. Multiple parton interactions and production of charged particles up to the intermediate-pT range in high-multiplicity p p events at the LHC

    NASA Astrophysics Data System (ADS)

    Kar, Somnath; Choudhury, Subikash; Muhuri, Sanjib; Ghosh, Premomoy

    2017-01-01

    The satisfactory description of data by hydrodynamics-motivated models, as reported recently by experimental collaborations at the LHC, confirms "collectivity" in high-multiplicity proton-proton (p p ) collisions. Notwithstanding this, a detailed study of high-multiplicity p p data in other approaches or models is essential for a better understanding of the specific phenomenon. In this study, the focus is on a pQCD-inspired multiparton interaction (MPI) model, including a color reconnection (CR) scheme, as implemented in the Monte Carlo code PYTHIA8, tune 4C. MPI with color reconnection reproduces the dependence of the mean transverse momentum ⟨pT⟩ on the charged particle multiplicity Nch in p p collisions at the LHC, providing an alternative explanation for the signature of "hydrodynamic collectivity" in p p data. It is therefore worth exploring how this model responds to other related features of high-multiplicity p p events. This comparative study with recent experimental results demonstrates the limitations of the model in explaining some of the prominent features of the final-state charged particles up to the intermediate-pT (pT < 2.0 GeV/c) range in high-multiplicity p p events.

  1. The need and approach for characterization - U.S. air force perspectives on materials state awareness

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Lindgren, Eric A.

    2018-04-01

    This paper expands on the objective and motivation for NDE-based characterization and includes a discussion of the current approach using model-assisted inversion being pursued within the Air Force Research Laboratory (AFRL). This includes a discussion of the multiple model-based methods that can be used, including physics-based models, deep machine learning, and heuristic approaches. The benefits and drawbacks of each method are reviewed, and the potential to integrate multiple methods is discussed. Initial successes are included to highlight the ability to obtain quantitative values of damage. The additional steps remaining to realize this capability with statistical metrics of accuracy are discussed, and the use of these results to enable probabilistic life management is addressed. The outcome of this initiative will realize the long-term desired capability of NDE methods to provide quantitative characterization to accelerate certification of new materials and enhance life management of engineered systems.

  2. Modeling month-season of birth as a risk factor in mouse models of chronic disease: from multiple sclerosis to autoimmune encephalomyelitis.

    PubMed

    Reynolds, Jacob D; Case, Laure K; Krementsov, Dimitry N; Raza, Abbas; Bartiss, Rose; Teuscher, Cory

    2017-06-01

    Month-season of birth (M-SOB) is a risk factor in multiple chronic diseases, including multiple sclerosis (MS), where the lowest and greatest risk of developing MS coincide with the lowest and highest birth rates, respectively. To determine whether M-SOB effects in such chronic diseases as MS can be experimentally modeled, we examined the effect of M-SOB on susceptibility of C57BL/6J mice to experimental autoimmune encephalomyelitis (EAE). As in MS, mice that were born during the M-SOB with the lowest birth rate were less susceptible to EAE than mice born during the M-SOB with the highest birth rate. We also show that the M-SOB effect on EAE susceptibility is associated with differential production of multiple cytokines/chemokines by neuroantigen-specific T cells that are known to play a role in EAE pathogenesis. Taken together, these results support the existence of an M-SOB effect that may reflect seasonally dependent developmental differences in adaptive immune responses to self-antigens independent of external stimuli, including exposure to sunlight and vitamin D. Moreover, our documentation of an M-SOB effect on EAE susceptibility in mice allows for modeling and detailed analysis of mechanisms that underlie the M-SOB effect in not only MS but in numerous other diseases in which M-SOB impacts susceptibility.-Reynolds, J. D., Case, L. K., Krementsov, D. N., Raza, A., Bartiss, R., Teuscher, C. Modeling month-season of birth as a risk factor in mouse models of chronic disease: from multiple sclerosis to autoimmune encephalomyelitis. © FASEB.

  3. Information Retrieval: A Sequential Learning Process.

    ERIC Educational Resources Information Center

    Bookstein, Abraham

    1983-01-01

    Presents decision-theoretic models which intrinsically include retrieval of multiple documents whereby system responds to request by presenting documents to patron in sequence, gathering feedback, and using information to modify future retrievals. Document independence model, set retrieval model, sequential retrieval model, learning model,…

  4. Multisite EPR oximetry from multiple quadrature harmonics.

    PubMed

    Ahmad, R; Som, S; Johnson, D H; Zweier, J L; Kuppusamy, P; Potter, L C

    2012-01-01

    Multisite continuous wave (CW) electron paramagnetic resonance (EPR) oximetry using multiple quadrature field modulation harmonics is presented. First, a recently developed digital receiver is used to extract multiple harmonics of field modulated projection data. Second, a forward model is presented that relates the projection data to unknown parameters, including linewidth at each site. Third, a maximum likelihood estimator of unknown parameters is reported using an iterative algorithm capable of jointly processing multiple quadrature harmonics. The data modeling and processing are applicable for parametric lineshapes under nonsaturating conditions. Joint processing of multiple harmonics leads to 2-3-fold acceleration of EPR data acquisition. For demonstration in two spatial dimensions, both simulations and phantom studies on an L-band system are reported. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Modeling complex effects of multiple environmental stresses on carbon dynamics of Mid-Atlantic temperate forests

    Treesearch

    Yude Pan; Richard Birdsey; John Hom; Kevin McCullough

    2007-01-01

    We used our GIS variant of the PnET-CN model to investigate changes of forest carbon stocks and fluxes in Mid-Atlantic temperate forests over the last century (1900-2000). Forests in this region are affected by multiple environmental changes including climate, atmospheric CO2 concentration, N deposition and tropospheric ozone, and extensive land disturbances. Our...

  6. Inherited genetic variants associated with occurrence of multiple primary melanoma.

    PubMed

    Gibbs, David C; Orlow, Irene; Kanetsky, Peter A; Luo, Li; Kricker, Anne; Armstrong, Bruce K; Anton-Culver, Hoda; Gruber, Stephen B; Marrett, Loraine D; Gallagher, Richard P; Zanetti, Roberto; Rosso, Stefano; Dwyer, Terence; Sharma, Ajay; La Pilla, Emily; From, Lynn; Busam, Klaus J; Cust, Anne E; Ollila, David W; Begg, Colin B; Berwick, Marianne; Thomas, Nancy E

    2015-06-01

    Recent studies, including genome-wide association studies, have identified several putative low-penetrance susceptibility loci for melanoma. We sought to determine their generalizability to genetic predisposition for multiple primary melanoma in the international population-based Genes, Environment, and Melanoma (GEM) Study. GEM is a case-control study of 1,206 incident cases of multiple primary melanoma and 2,469 incident first primary melanoma participants as the control group. We investigated the odds of developing multiple primary melanoma for 47 SNPs from 21 distinct genetic regions previously reported to be associated with melanoma. ORs and 95% confidence intervals were determined using logistic regression models adjusted for baseline features (age, sex, age by sex interaction, and study center). We investigated univariable models and built multivariable models to assess independent effects of SNPs. Eleven SNPs in 6 gene neighborhoods (TERT/CLPTM1L, TYRP1, MTAP, TYR, NCOA6, and MX2) and a PARP1 haplotype were associated with multiple primary melanoma. In a multivariable model that included only the most statistically significant findings from univariable modeling and adjusted for pigmentary phenotype, back nevi, and baseline features, we found TERT/CLPTM1L rs401681 (P = 0.004), TYRP1 rs2733832 (P = 0.006), MTAP rs1335510 (P = 0.0005), TYR rs10830253 (P = 0.003), and MX2 rs45430 (P = 0.008) to be significantly associated with multiple primary melanoma, while NCOA6 rs4911442 approached significance (P = 0.06). The GEM Study provides additional evidence for the relevance of these genetic regions to melanoma risk and estimates the magnitude of the observed genetic effect on development of subsequent primary melanoma. ©2015 American Association for Cancer Research.

  7. Inherited genetic variants associated with occurrence of multiple primary melanoma

    PubMed Central

    Gibbs, David C.; Orlow, Irene; Kanetsky, Peter A.; Luo, Li; Kricker, Anne; Armstrong, Bruce K.; Anton-Culver, Hoda; Gruber, Stephen B.; Marrett, Loraine D.; Gallagher, Richard P.; Zanetti, Roberto; Rosso, Stefano; Dwyer, Terence; Sharma, Ajay; La Pilla, Emily; From, Lynn; Busam, Klaus J.; Cust, Anne E.; Ollila, David W.; Begg, Colin B.; Berwick, Marianne; Thomas, Nancy E.

    2015-01-01

    Recent studies including genome-wide association studies have identified several putative low-penetrance susceptibility loci for melanoma. We sought to determine their generalizability to genetic predisposition for multiple primary melanoma in the international population-based Genes, Environment, and Melanoma (GEM) Study. GEM is a case-control study of 1,206 incident cases of multiple primary melanoma and 2,469 incident first primary melanoma participants as the control group. We investigated the odds of developing multiple primary melanoma for 47 single nucleotide polymorphisms (SNP) from 21 distinct genetic regions previously reported to be associated with melanoma. ORs and 95% CIs were determined using logistic regression models adjusted for baseline features (age, sex, age by sex interaction, and study center). We investigated univariable models and built multivariable models to assess independent effects of SNPs. Eleven SNPs in 6 gene neighborhoods (TERT/CLPTM1L, TYRP1, MTAP, TYR, NCOA6, and MX2) and a PARP1 haplotype were associated with multiple primary melanoma. In a multivariable model that included only the most statistically significant findings from univariable modeling and adjusted for pigmentary phenotype, back nevi, and baseline features, we found TERT/CLPTM1L rs401681 (P = 0.004), TYRP1 rs2733832 (P = 0.006), MTAP rs1335510 (P = 0.0005), TYR rs10830253 (P = 0.003), and MX2 rs45430 (P = 0.008) to be significantly associated with multiple primary melanoma while NCOA6 rs4911442 approached significance (P = 0.06). The GEM study provides additional evidence for the relevance of these genetic regions to melanoma risk and estimates the magnitude of the observed genetic effect on development of subsequent primary melanoma. PMID:25837821

  8. Improving homology modeling of G-protein coupled receptors through multiple-template derived conserved inter-residue interactions

    NASA Astrophysics Data System (ADS)

    Chaudhari, Rajan; Heim, Andrew J.; Li, Zhijun

    2015-05-01

    As evidenced by the three rounds of G-protein coupled receptor (GPCR) Dock competitions, improving homology modeling methods for helical transmembrane proteins, including the GPCRs, based on templates of low sequence identity remains an outstanding challenge. Current approaches addressing this challenge adopt the philosophy of "modeling first, refinement next". In the present work, we developed an alternative modeling approach through the novel application of available multiple templates. First, conserved inter-residue interactions are derived from each additional template through conservation analysis of each template-target pairwise alignment. Then, these interactions are converted into distance restraints and incorporated in the homology modeling process. This approach was applied to modeling of the human β2 adrenergic receptor using bovine rhodopsin and the human protease-activated receptor 1 as templates, and improved model quality was demonstrated compared to the homology models generated by standard single-template and multiple-template methods. This method of "refined restraints first, modeling next" provides a fast and complementary alternative to current modeling approaches. It allows rational identification and implementation of additional conserved distance restraints extracted from multiple templates and/or experimental data, and has the potential to be applicable to modeling of all helical transmembrane proteins.

  9. AgMIP Training in Multiple Crop Models and Tools

    NASA Technical Reports Server (NTRS)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used by the various models. Two activities were undertaken to address these shortcomings and enable the RRTs to use multiple models to evaluate climate impacts on crop production and food security. We designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model with which they had the least experience. In a second activity, the AgMIP IT group created templates for entering data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing the entry and translation tools are reviewed in this chapter.

  10. Fallon, Nevada FORGE Distinct Element Reservoir Modeling

    DOE Data Explorer

    Blankenship, Doug; Pettitt, Will; Riahi, Azadeh; Hazzard, Jim; Blanksma, Derrick

    2018-03-12

    Archive containing input/output data for distinct element reservoir modeling for Fallon FORGE. Models created using 3DEC, InSite, and in-house Python algorithms (ITASCA). List of archived files follows; please see 'Modeling Metadata.pdf' (included as a resource below) for additional file descriptions. Data sources include regional geochemical model, well positions and geometry, principal stress field, capability for hydraulic fractures, capability for hydro-shearing, reservoir geomechanical model-stimulation into multiple zones, modeled thermal behavior during circulation, and microseismicity.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Yang; Liu, Zhiqiang; Yi, Xiaoyan

    To evaluate electron leakage in InGaN/GaN multiple quantum well (MQW) light emitting diodes (LEDs), analytic models of ballistic and quasi-ballistic transport are developed. With this model, the impact of critical variables affecting electron leakage, including the electron blocking layer (EBL), the structure of the MQWs, the polarization field, and temperature, is explored. The simulated results based on this model shed light on previously reported experimental observations and provide basic criteria for suppressing electron leakage, advancing the design of InGaN/GaN LEDs.

  12. INTERDISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Synchronization in Complex Networks with Multiple Connections

    NASA Astrophysics Data System (ADS)

    Wu, Qing-Chu; Fu, Xin-Chu; Sun, Wei-Gang

    2010-01-01

    In this paper a class of networks with multiple connections is discussed. The multiple connections include two different types of links between nodes in complex networks. For this new model, we give a simple generating procedure. Furthermore, we investigate dynamical synchronization behavior in a delayed two-layer network, giving the corresponding theoretical analysis and numerical examples.

  13. Multiple frequency bioelectrical impedance analysis: a cross-validation study of the inductor circuit and Cole models.

    PubMed

    Ward, L; Cornish, B H; Paton, N I; Thomas, B J

    1999-11-01

    It has been proposed that multiple frequency bioelectrical impedance models of the human body should include an inductive property for the circulatory system, the inductor circuit model (ICM), and that such a model, when coupled with a new method of data analysis, can improve the predictive power of multiple frequency bioelectrical impedance analysis (MFBIA). This hypothesis was tested using MFBIA measurements and gold standard measures of total body water (TBW) and extracellular water (ECW) volumes in a cross-validation study in two subject groups (viz. controls and HIV). The MFBIA measurements were analysed using the current, widely accepted Cole model and the alternative ICM model, which includes an inductive component. Correlations in the range 0.75 to 0.92 (for TBW) and 0.46 to 0.79 (for ECW) for impedance quotients versus gold standard measures within the subject groups were observed. These decreased, to as low as r = 0.50 for TBW and r = 0.29 for ECW, when the derived algorithms were applied to the alternative subject group. These results suggest that the lack of portability of MFBIA algorithms between subject groups is not due to an inadequacy of the analogue circuit model per se but is possibly due more to fundamental flaws in the principles associated with its application. These include assuming a constant proportionality of body segment geometries and tissue fluid resistivities. This study has also demonstrated that this inadequacy cannot be overcome by simply introducing an inductive component into the analogue electrical circuit.
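
    For reference, the sketch below evaluates the Cole impedance model referred to above, Z(f) = Rinf + (R0 - Rinf) / (1 + (j f/fc)^(1 - alpha)); the parameter values are hypothetical but physiologically plausible, with the low-frequency limit R0 commonly related to ECW and the high-frequency limit Rinf to TBW.

      # Minimal sketch (assumed parameters): Cole impedance model used in MFBIA.
      import numpy as np

      def cole_impedance(f_hz, R0=750.0, Rinf=500.0, fc=50e3, alpha=0.1):
          jw = 1j * f_hz / fc
          return Rinf + (R0 - Rinf) / (1.0 + jw ** (1.0 - alpha))

      for f in np.logspace(3, 6, 7):                     # 1 kHz to 1 MHz
          z = cole_impedance(f)
          print(f"{f/1e3:8.1f} kHz  R = {z.real:6.1f} ohm  X = {z.imag:6.1f} ohm")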

  14. Cloud E-Learning Service Strategies for Improving E-Learning Innovation Performance in a Fuzzy Environment by Using a New Hybrid Fuzzy Multiple Attribute Decision-Making Model

    ERIC Educational Resources Information Center

    Su, Chiu Hung; Tzeng, Gwo-Hshiung; Hu, Shu-Kung

    2016-01-01

    The purpose of this study was to address this problem by applying a new hybrid fuzzy multiple criteria decision-making model including (a) using the fuzzy decision-making trial and evaluation laboratory (DEMATEL) technique to construct the fuzzy scope influential network relationship map (FSINRM) and determine the fuzzy influential weights of the…

  15. Using Structural Equation Modeling To Fit Models Incorporating Principal Components.

    ERIC Educational Resources Information Center

    Dolan, Conor; Bechger, Timo; Molenaar, Peter

    1999-01-01

    Considers models incorporating principal components from the perspectives of structural-equation modeling. These models include the following: (1) the principal-component analysis of patterned matrices; (2) multiple analysis of variance based on principal components; and (3) multigroup principal-components analysis. Discusses fitting these models…

  16. Adaptive control of a jet turboshaft engine driving a variable pitch propeller using multiple models

    NASA Astrophysics Data System (ADS)

    Ahmadian, Narjes; Khosravi, Alireza; Sarhadi, Pouria

    2017-08-01

    In this paper, a multiple model adaptive control (MMAC) method is proposed for a gas turbine engine. The model of a twin-spool turboshaft engine driving a variable pitch propeller includes various operating points. Variations in fuel flow and propeller pitch inputs produce different operating conditions, which force the controller to adapt rapidly. The important operating points are the idle, cruise, and full-thrust cases for the entire flight envelope. A multi-input multi-output (MIMO) version of second-level adaptation using multiple models is developed, and a stability analysis using the Lyapunov method is presented. The proposed method is compared with two conventional techniques, first-level adaptation and model reference adaptive control. Simulation results for the JetCat SPT5 turboshaft engine demonstrate the performance and fidelity of the proposed method.

  17. The Mediated MIMIC Model for Understanding the Underlying Mechanism of DIF.

    PubMed

    Cheng, Ying; Shao, Can; Lathrop, Quinn N

    2016-02-01

    Due to its flexibility, the multiple-indicator, multiple-causes (MIMIC) model has become an increasingly popular method for the detection of differential item functioning (DIF). In this article, we propose the mediated MIMIC model method to uncover the underlying mechanism of DIF. This method extends the usual MIMIC model by including one variable or multiple variables that may completely or partially mediate the DIF effect. If complete mediation effect is found, the DIF effect is fully accounted for. Through our simulation study, we find that the mediated MIMIC model is very successful in detecting the mediation effect that completely or partially accounts for DIF, while keeping the Type I error rate well controlled for both balanced and unbalanced sample sizes between focal and reference groups. Because it is successful in detecting such mediation effects, the mediated MIMIC model may help explain DIF and give guidance in the revision of a DIF item.

  18. Design of Xen Hybrid Multiple Policy Model

    NASA Astrophysics Data System (ADS)

    Sun, Lei; Lin, Renhao; Zhu, Xianwei

    2017-10-01

    Virtualization technology has attracted more and more attention, and as a popular open-source virtualization tool, Xen is used increasingly often. XSM, the Xen security model, has accordingly drawn widespread concern. XSM does not establish a security status classification, and it treats the virtual machine as the managed object, making Dom0 a single administrative domain that violates the principle of least privilege. To address these problems, we design a hybrid multiple-policy model named SV_HMPMD that organically integrates several single security policy models, including DTE, RBAC, and BLP. It can fulfill the confidentiality and integrity requirements of a security model and apply different granularities to different domains. To improve BLP's practicability, the model introduces multi-level security labels; to divide privileges in detail, it combines DTE with RBAC; and to avoid excessive privilege, it limits the privileges of Dom0.

  19. The Mediated MIMIC Model for Understanding the Underlying Mechanism of DIF

    PubMed Central

    Cheng, Ying; Shao, Can; Lathrop, Quinn N.

    2015-01-01

    Due to its flexibility, the multiple-indicator, multiple-causes (MIMIC) model has become an increasingly popular method for the detection of differential item functioning (DIF). In this article, we propose the mediated MIMIC model method to uncover the underlying mechanism of DIF. This method extends the usual MIMIC model by including one variable or multiple variables that may completely or partially mediate the DIF effect. If complete mediation effect is found, the DIF effect is fully accounted for. Through our simulation study, we find that the mediated MIMIC model is very successful in detecting the mediation effect that completely or partially accounts for DIF, while keeping the Type I error rate well controlled for both balanced and unbalanced sample sizes between focal and reference groups. Because it is successful in detecting such mediation effects, the mediated MIMIC model may help explain DIF and give guidance in the revision of a DIF item.

  20. A Quasi-2D Delta-growth Model Accounting for Multiple Avulsion Events, Validated by Robust Data from the Yellow River Delta, China

    NASA Astrophysics Data System (ADS)

    Moodie, A. J.; Nittrouer, J. A.; Ma, H.; Carlson, B.; Parker, G.

    2016-12-01

    The autogenic "life cycle" of a lowland fluvial channel building a deltaic lobe typically follows a temporal sequence that includes: channel initiation, progradation and aggradation, and abandonment via avulsion. In terms of modeling these processes, it is possible to use a one-dimensional (1D) morphodynamic scheme to capture the magnitude of the prograding and aggrading processes. These models can include algorithms to predict the timing and location of avulsions for a channel lobe. However, this framework falls short in its ability to evaluate the deltaic system beyond the time scale of a single channel, and assess sedimentation processes occurring on the floodplain, which is important for lobe building. Herein, we adapt a 1D model to explicitly account for multiple avulsions and therefore replicate a deltaic system that includes many lobe cycles. Following an avulsion, sediment on the floodplain and beyond the radially-averaged shoreline is redistributed across the delta topset and along the shoreline, respectively, simultaneously prograding and aggrading the delta. Over time this framework produces net shoreline progradation and forward-stepping of subsequent avulsions. Testing this model using modern systems is inherently difficult due to a lack of data: most modern delta lobes are active for timescales of centuries to millennia, and so observing multiple iterations of the channel-lobe cycle is impossible. However, the Yellow River delta (China) is unique because the lobe cycles here occur within years to decades. Therefore it is possible to measure shoreline evolution through multiple lobe cycles, based on satellite imagery and historical records. These data are used to validate the model outcomes. Our findings confirm that the explicit accounting of avulsion processes in a quasi-2D model framework is capable of capturing shoreline development patterns that otherwise are not resolvable based on previously published delta building models.

  1. A robust and flexible Geospatial Modeling Interface (GMI) for environmental model deployment and evaluation

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the GMI (Geospatial Modeling Interface) simulation framework for environmental model deployment and assessment. GMI currently provides access to multiple environmental models including AgroEcoSystem-Watershed (AgES-W), Nitrate Leaching and Economic Analysis 2 (NLEA...

  2. Calculation of individual isotope equilibrium constants for implementation in geochemical models

    USGS Publications Warehouse

    Thorstenson, Donald C.; Parkhurst, David L.

    2002-01-01

    Theory is derived from the work of Urey to calculate equilibrium constants, commonly used in geochemical equilibrium and reaction-transport models, for reactions of individual isotopic species. Urey showed that the equilibrium constants of isotope exchange reactions for molecules that contain two or more atoms of the same element in equivalent positions are related to the isotope fractionation factors by K = α^n, where n is the number of atoms exchanged. This relation is extended to include species containing multiple isotopes and to include the effects of nonideality. The equilibrium constants of the isotope exchange reactions provide a basis for calculating the individual isotope equilibrium constants for the geochemical modeling reactions. The temperature dependence of the individual isotope equilibrium constants can be calculated from the temperature dependence of the fractionation factors. Equilibrium constants are calculated for the individual isotopic species of the relevant molecules and ion pairs in the gas, aqueous, liquid, and solid phases. These equilibrium constants are used in the geochemical model PHREEQC to produce an equilibrium and reaction-transport model that includes these isotopic species. Methods are presented for calculation of the individual isotope equilibrium constants for the asymmetric bicarbonate ion. An example calculates the equilibrium of multiple isotopes among multiple species and phases.
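
    As a small worked illustration of the Urey relation reconstructed above (the fractionation factor and n below are hypothetical), the individual isotope equilibrium constant follows directly from K = α^n.

      # Minimal sketch (assumed values): equilibrium constant of an isotope
      # exchange reaction from a fractionation factor via K = alpha**n.
      def individual_isotope_K(alpha: float, n: int) -> float:
          return alpha ** n

      # Hypothetical exchange with alpha = 1.0412 and n = 2 equivalent atoms.
      print(individual_isotope_K(1.0412, 2))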

  3. Multiple player tracking in sports video: a dual-mode two-way bayesian inference approach with progressive observation modeling.

    PubMed

    Xing, Junliang; Ai, Haizhou; Liu, Liwei; Lao, Shihong

    2011-06-01

    Multiple object tracking (MOT) is a very challenging task, yet it is of fundamental importance for many practical applications. In this paper, we focus on the problem of tracking multiple players in sports video, which is even more difficult due to the abrupt movements of players and their complex interactions. To handle the difficulties in this problem, we present a new MOT algorithm that contributes at both the observation modeling level and the tracking strategy level. For the observation modeling, we develop a progressive observation modeling process that is able to provide strong tracking observations and greatly facilitate the tracking task. For the tracking strategy, we propose a dual-mode two-way Bayesian inference approach that dynamically switches between an offline general model and an online dedicated model to handle single isolated object tracking and multiple occluded object tracking integrally, by forward filtering and backward smoothing. Extensive experiments on different kinds of sports videos, including football, basketball, and hockey, demonstrate the effectiveness and efficiency of the proposed method.

  4. Performance Dependences of Multiplication Layer Thickness for InP/InGaAs Avalanche Photodiodes Based on Time Domain Modeling

    NASA Technical Reports Server (NTRS)

    Xiao, Yegao; Bhat, Ishwara; Abedin, M. Nurul

    2005-01-01

    InP/InGaAs avalanche photodiodes (APDs) are widely utilized in optical receivers for modern long-haul and high bit-rate optical fiber communication systems. The separate absorption, grading, charge, and multiplication (SAGCM) structure is an important design consideration for APDs with high performance characteristics. Time domain modeling techniques have been developed previously to provide better understanding and to optimize designs while saving time and cost in APD research and development. In this work, the dependence of performance on multiplication layer thickness has been investigated by time domain modeling. The performance characteristics include breakdown field and breakdown voltage, multiplication gain, excess noise factor, and frequency response and bandwidth. The simulations are performed for various multiplication layer thicknesses with fixed values of the areal charge sheet density, while the values of the other structural and material parameters are kept unchanged. The frequency response is obtained from the impulse response by fast Fourier transformation. The modeling results are presented and discussed, and design considerations, especially for high-speed operation at 10 Gbit/s, are further analyzed.
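
    As a hedged sketch of the frequency-response step described above (the time step and the single-exponential impulse response are hypothetical stand-ins for a simulated APD response), the -3 dB bandwidth can be read off the FFT of the impulse response.

      # Minimal sketch (assumed impulse response): frequency response and
      # -3 dB bandwidth from an impulse response via the FFT.
      import numpy as np

      dt = 1e-12                                 # 1 ps time step
      t = np.arange(0, 2e-9, dt)                 # 2 ns record
      impulse = np.exp(-t / 25e-12)              # hypothetical 25 ps decay

      H = np.abs(np.fft.rfft(impulse))
      H /= H[0]                                  # normalize to the DC response
      f = np.fft.rfftfreq(len(impulse), dt)

      bw = f[np.argmax(H < 1 / np.sqrt(2))]      # first point below -3 dB
      print(f"-3 dB bandwidth ~ {bw / 1e9:.1f} GHz")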

  5. Computer simulation of a multiple-aperture coherent laser radar

    NASA Astrophysics Data System (ADS)

    Gamble, Kevin J.; Weeks, Arthur R.

    1996-06-01

    This paper presents the construction of a 2D multiple-aperture coherent laser radar simulation that is capable of including the effects of the time evolution of speckle on the laser radar output. Every portion of a laser radar system is modeled in software, including quarter and half wave plates, beamsplitters (polarizing and non-polarizing), the detector, the laser source, and all necessary lenses. Free-space propagation is implemented using the Rayleigh-Sommerfeld integral for both orthogonal polarizations. Atmospheric turbulence is also included in the simulation and is modeled using time-correlated Kolmogorov phase screens. The simulation itself can be configured to simulate both monostatic and bistatic systems. The simulation allows the user to specify component-level parameters such as extinction ratios for polarizing beam splitters, detector sizes and shapes, orientation of the slow axis for quarter/half wave plates, and other properties of components used in the system. This makes the simulation a useful tool in the design of a multiple-aperture laser radar system.

  6. Propensity score analysis with partially observed covariates: How should multiple imputation be used?

    PubMed

    Leyrat, Clémence; Seaman, Shaun R; White, Ian R; Douglas, Ian; Smeeth, Liam; Kim, Joseph; Resche-Rigon, Matthieu; Carpenter, James R; Williamson, Elizabeth J

    2017-01-01

    Inverse probability of treatment weighting is a popular propensity score-based approach to estimate marginal treatment effects in observational studies at risk of confounding bias. A major issue when estimating the propensity score is the presence of partially observed covariates. Multiple imputation is a natural approach to handle missing data on covariates: covariates are imputed and a propensity score analysis is performed in each imputed dataset to estimate the treatment effect. The treatment effect estimates from each imputed dataset are then combined to obtain an overall estimate. We call this method MIte. However, an alternative approach has been proposed, in which the propensity scores are combined across the imputed datasets (MIps). Therefore, there are remaining uncertainties about how to implement multiple imputation for propensity score analysis: (a) should we apply Rubin's rules to the inverse probability of treatment weighting treatment effect estimates or to the propensity score estimates themselves? (b) does the outcome have to be included in the imputation model? (c) how should we estimate the variance of the inverse probability of treatment weighting estimator after multiple imputation? We studied the consistency and balancing properties of the MIte and MIps estimators and performed a simulation study to empirically assess their performance for the analysis of a binary outcome. We also compared the performance of these methods to complete case analysis and the missingness pattern approach, which uses a different propensity score model for each pattern of missingness, and a third multiple imputation approach in which the propensity score parameters are combined rather than the propensity scores themselves (MIpar). Under a missing at random mechanism, complete case and missingness pattern analyses were biased in most cases for estimating the marginal treatment effect, whereas multiple imputation approaches were approximately unbiased as long as the outcome was included in the imputation model. Only MIte was unbiased in all the studied scenarios and Rubin's rules provided good variance estimates for MIte. The propensity score estimated in the MIte approach showed good balancing properties. In conclusion, when using multiple imputation in the inverse probability of treatment weighting context, MIte with the outcome included in the imputation model is the preferred approach.
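
    As a hedged sketch of the MIte strategy described above (the column names, the imputation engine, and the two-covariate propensity model are hypothetical; Rubin's-rules variance pooling is omitted), covariates are imputed M times with the outcome in the imputation model, an IPTW estimate is computed in each completed data set, and the estimates are pooled.

      # Minimal sketch (assumed data layout): MIte = impute, then IPTW, then pool.
      import numpy as np
      import pandas as pd
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer
      from sklearn.linear_model import LogisticRegression

      def iptw_estimate(df):
          X = df[["x1", "x2"]]
          ps = LogisticRegression(max_iter=1000).fit(X, df["treat"]).predict_proba(X)[:, 1]
          w = np.where(df["treat"] == 1, 1 / ps, 1 / (1 - ps))      # IPT weights
          t = df["treat"] == 1
          return (np.average(df.loc[t, "outcome"], weights=w[t])
                  - np.average(df.loc[~t, "outcome"], weights=w[~t]))

      def mite(df, m=20, seed=0):
          """'treat' and 'outcome' fully observed; covariates x1, x2 may be missing."""
          estimates = []
          for i in range(m):
              imputer = IterativeImputer(random_state=seed + i, sample_posterior=True)
              # The outcome is deliberately included in the imputation model.
              completed = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)
              estimates.append(iptw_estimate(completed))
          return float(np.mean(estimates))    # pooled point estimate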

  7. The MAX Statistic is Less Powerful for Genome Wide Association Studies Under Most Alternative Hypotheses.

    PubMed

    Shifflett, Benjamin; Huang, Rong; Edland, Steven D

    2017-01-01

    Genotypic association studies are prone to inflated type I error rates if multiple hypothesis testing is performed, e.g., sequentially testing for recessive, multiplicative, and dominant risk. Alternatives to multiple hypothesis testing include the model-independent genotypic χ2 test, the efficiency-robust MAX statistic, which corrects for multiple comparisons but with some loss of power, or a single Armitage test for multiplicative trend, which has optimal power when the multiplicative model holds but loses some power when dominant or recessive models underlie the genetic association. We used Monte Carlo simulations to describe the relative performance of these three approaches under a range of scenarios. All three approaches maintained their nominal type I error rates. The genotypic χ2 and MAX statistics were more powerful when testing a strictly recessive genetic effect or when testing a dominant effect when the allele frequency was high. The Armitage test for multiplicative trend was most powerful for the broad range of scenarios where heterozygote risk is intermediate between recessive and dominant risk. Moreover, all tests had limited power to detect recessive genetic risk unless the sample size was large, and conversely all tests were relatively well powered to detect dominant risk. Taken together, these results suggest the general utility of the multiplicative trend test when the underlying genetic model is unknown.
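
    For concreteness, the sketch below implements the single Armitage test for multiplicative trend discussed above, using allele-dosage scores (0, 1, 2); the genotype counts are hypothetical.

      # Minimal sketch (assumed counts): Cochran-Armitage test for trend.
      import numpy as np
      from scipy.stats import norm

      def armitage_trend(cases, controls, scores=(0, 1, 2)):
          r, s, w = (np.asarray(v, float) for v in (cases, controls, scores))
          n = r + s
          N, R = n.sum(), r.sum()
          p = R / N                                     # pooled case proportion
          U = np.sum(w * (r - n * p))                   # score statistic
          var = p * (1 - p) * (np.sum(n * w**2) - np.sum(n * w) ** 2 / N)
          z = U / np.sqrt(var)
          return z, 2 * norm.sf(abs(z))                 # two-sided p-value

      z, pval = armitage_trend(cases=[40, 90, 70], controls=[80, 100, 50])
      print(f"z = {z:.2f}, p = {pval:.3g}")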

  8. Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges

    NASA Technical Reports Server (NTRS)

    Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam

    2014-01-01

    As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. This approach is to incorporate multiple flight dynamics modeling methods for aerodynamics, structures, and propulsion, including experimental, computational, and analytical. Also to be included are techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model comprised of multiple sources, as well as an uncertainty bounds database for each data source such that a full vehicle uncertainty analysis is possible even when approaching or beyond Loss of Control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss of control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing. The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements into a real-time simulation capability, generating techniques for uncertainty modeling that draw data from multiple modeling sources, and providing a unified database model that includes nominal plus increments for each flight condition. This paper presents status of testing in the BR&T water tunnel and analysis of the resulting data and efforts to characterize these data using alternative modeling methods. Program challenges and issues are also presented.

  9. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial-condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed-physics ensembles of CESM, employing results from multiple climate models, and combining the results from single impact models with statistical representations of uncertainty across multiple models. A key consideration is the relationship between the question being addressed and the uncertainty approach.

  10. Automation of Ocean Product Metrics

    DTIC Science & Technology

    2008-09-30

    Presented in: Ocean Sciences 2008 Conf., 5 Mar 2008. Shriver, J., J. D. Dykes, and J. Fabre: Automation of Operational Ocean Product Metrics. Presented in 2008 EGU General Assembly, 14 April 2008. ...processing (multiple data cuts per day) and multiple-nested models. Routines for generating automated evaluations of model forecast statistics will be developed, and pre-existing tools will be collected to create a generalized tool set, which will include user-interface tools to the metrics data

  11. Simulation of Attitude and Trajectory Dynamics and Control of Multiple Spacecraft

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric T.

    2009-01-01

    Agora software is a simulation of spacecraft attitude and orbit dynamics. It supports spacecraft models composed of multiple rigid bodies or flexible structural models. Agora simulates multiple spacecraft simultaneously, supporting rendezvous, proximity operations, and precision formation flying studies. The Agora environment includes ephemerides for all planets and major moons in the solar system, supporting design studies for deep space as well as geocentric missions. The environment also contains standard models for gravity, atmospheric density, and magnetic fields. Disturbance force and torque models include aerodynamic, gravity-gradient, solar radiation pressure, and third-body gravitation. In addition to the dynamic and environmental models, Agora supports geometrical visualization through an OpenGL interface. Prototype models are provided for common sensors, actuators, and control laws. A clean interface accommodates linking in actual flight code in place of the prototype control laws. The same simulation may be used for rapid feasibility studies, and then used for flight software validation as the design matures. Agora is open-source and portable across computing platforms, making it customizable and extensible. It is written to support the entire GNC (guidance, navigation, and control) design cycle, from rapid prototyping and design analysis, to high-fidelity flight code verification. As a top-down design, Agora is intended to accommodate a large range of missions, anywhere in the solar system. Both two-body and three-body flight regimes are supported, as well as seamless transition between them. Multiple spacecraft may be simultaneously simulated, enabling simulation of rendezvous scenarios, as well as formation flying. Built-in reference frames and orbit perturbation dynamics provide accurate modeling of precision formation control.

  12. PFLOTRAN: Reactive Flow & Transport Code for Use on Laptops to Leadership-Class Supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Glenn E.; Lichtner, Peter C.; Lu, Chuan

    PFLOTRAN, a next-generation reactive flow and transport code for modeling subsurface processes, has been designed from the ground up to run efficiently on machines ranging from leadership-class supercomputers to laptops. Based on an object-oriented design, the code is easily extensible to incorporate additional processes. It can interface seamlessly with Fortran 9X, C and C++ codes. Domain decomposition parallelism is employed, with the PETSc parallel framework used to manage parallel solvers, data structures and communication. Features of the code include a modular input file, implementation of high-performance I/O using parallel HDF5, ability to perform multiple realization simulations with multiple processors per realization in a seamless manner, and multiple modes for multiphase flow and multicomponent geochemical transport. Chemical reactions currently implemented in the code include homogeneous aqueous complexing reactions and heterogeneous mineral precipitation/dissolution, ion exchange, surface complexation and a multirate kinetic sorption model. PFLOTRAN has demonstrated petascale performance using 2^17 processor cores with over 2 billion degrees of freedom. Accomplishments achieved to date include applications to the Hanford 300 Area and modeling CO2 sequestration in deep geologic formations.

  13. Induced subgraph searching for geometric model fitting

    NASA Astrophysics Data System (ADS)

    Xiao, Fan; Xiao, Guobao; Yan, Yan; Wang, Xing; Wang, Hanzi

    2017-11-01

    In this paper, we propose a novel model fitting method based on graphs to fit and segment multiple-structure data. In the graph constructed on the data, each model instance is represented as an induced subgraph. Following the idea of pursuing the maximum consensus, the multiple geometric model fitting problem is formulated as searching for a set of induced subgraphs including the maximum union set of vertices. After the generation and refinement of the induced subgraphs that represent the model hypotheses, the searching process is conducted on the "qualified" subgraphs. Multiple model instances can be simultaneously estimated by solving a converted problem. Then, we introduce an energy evaluation function to determine the number of model instances in the data. The proposed method is able to effectively estimate the number and the parameters of model instances in data severely corrupted by outliers and noise. Experimental results on synthetic data and real images validate the favorable performance of the proposed method compared with several state-of-the-art fitting methods.

  14. Hidden Markov models of biological primary sequence information.

    PubMed Central

    Baldi, P; Chauvin, Y; Hunkapiller, T; McClure, M A

    1994-01-01

    Hidden Markov model (HMM) techniques are used to model families of biological sequences. A smooth and convergent algorithm is introduced to iteratively adapt the transition and emission parameters of the models from the examples in a given family. The HMM approach is applied to three protein families: globins, immunoglobulins, and kinases. In all cases, the models derived capture the important statistical characteristics of the family and can be used for a number of tasks, including multiple alignments, motif detection, and classification. For K sequences of average length N, this approach yields an effective multiple-alignment algorithm which requires O(KN²) operations, linear in the number of sequences. PMID:8302831
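    The likelihood computation behind such models is the forward algorithm, which is linear in sequence length for a fixed number of states. The sketch below is a generic scaled forward pass in Python with a toy two-state, four-symbol model; it is not the profile-HMM architecture or training procedure used by the authors, and all parameter values are illustrative.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of an observation sequence under an HMM (forward algorithm).

    obs: sequence of symbol indices; pi: initial state probabilities (S,);
    A: state transition matrix (S, S); B: emission matrix (S, V).
    Uses per-step scaling to avoid underflow on long sequences.
    """
    alpha = pi * B[:, obs[0]]
    log_like = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        log_like += np.log(c)
        alpha /= c
    return log_like

# Toy 2-state, 4-symbol model (e.g. a DNA-like alphabet); numbers are illustrative.
pi = np.array([0.6, 0.4])
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.4, 0.3, 0.2, 0.1],
              [0.1, 0.2, 0.3, 0.4]])
print(forward_log_likelihood([0, 2, 3, 1, 0], pi, A, B))
```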

  15. ACIRF user's guide: Theory and examples

    NASA Astrophysics Data System (ADS)

    Dana, Roger A.

    1989-12-01

    Design and evaluation of radio frequency systems that must operate through ionospheric disturbances resulting from high altitude nuclear detonations require an accurate channel model. This model must include the effects of high gain antennas that may be used to receive the signals. Such a model can then be used to construct realizations of the received signal for use in digital simulations of trans-ionospheric links or for use in hardware channel simulators. The FORTRAN channel model ACIRF (Antenna Channel Impulse Response Function) generates random realizations of the impulse response function at the outputs of multiple antennas. This user's guide describes the FORTRAN program ACIRF (version 2.0) that generates realizations of channel impulse response functions at the outputs of multiple antennas with arbitrary beamwidths, pointing angles, and relative positions. This channel model is valid under strong scattering conditions when Rayleigh fading statistics apply. Both frozen-in and turbulent models for the temporal fluctuations are included in this version of ACIRF. The theory of the channel model is described and several examples are given.

  16. Flexible language constructs for large parallel programs

    NASA Technical Reports Server (NTRS)

    Rosing, Matthew; Schnabel, Robert

    1993-01-01

    The goal of the research described is to develop flexible language constructs for writing large data parallel numerical programs for distributed memory (MIMD) multiprocessors. Previously, several models have been developed to support synchronization and communication. Models for global synchronization include SIMD (Single Instruction Multiple Data), SPMD (Single Program Multiple Data), and sequential programs annotated with data distribution statements. The two primary models for communication include implicit communication based on shared memory and explicit communication based on messages. None of these models by themselves seem sufficient to permit the natural and efficient expression of the variety of algorithms that occur in large scientific computations. An overview of a new language that combines many of these programming models in a clean manner is given. This is done in a modular fashion such that different models can be combined to support large programs. Within a module, the selection of a model depends on the algorithm and its efficiency requirements. An overview of the language and discussion of some of the critical implementation details is given.

  17. Exploring Contextual Models in Chemical Patent Search

    NASA Astrophysics Data System (ADS)

    Urbain, Jay; Frieder, Ophir

    We explore the development of probabilistic retrieval models for integrating term statistics with entity search using multiple levels of document context to improve the performance of chemical patent search. A distributed indexing model was developed to enable efficient named entity search and aggregation of term statistics at multiple levels of patent structure including individual words, sentences, claims, descriptions, abstracts, and titles. The system can be scaled to an arbitrary number of compute instances in a cloud computing environment to support concurrent indexing and query processing operations on large patent collections.

  18. A Taxonomy of Latent Structure Assumptions for Probability Matrix Decomposition Models.

    ERIC Educational Resources Information Center

    Meulders, Michel; De Boeck, Paul; Van Mechelen, Iven

    2003-01-01

    Proposed a taxonomy of latent structure assumptions for probability matrix decomposition (PMD) that includes the original PMD model and a three-way extension of the multiple classification latent class model. Simulation study results show the usefulness of the taxonomy. (SLD)

  19. Advanced Multiple Processor Configuration Study. Final Report.

    ERIC Educational Resources Information Center

    Clymer, S. J.

    This summary of a study on multiple processor configurations includes the objectives, background, approach, and results of research undertaken to provide the Air Force with a generalized model of computer processor combinations for use in the evaluation of proposed flight training simulator computational designs. An analysis of a real-time flight…

  20. Monitoring and Modeling Performance of Communications in Computational Grids

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael A.; Le, Thuy T.

    2003-01-01

    Computational grids may include many machines located in a number of sites. For efficient use of the grid we need the ability to estimate the time it takes to communicate data between the machines. For dynamic distributed grids it is unrealistic to know the exact parameters of the communication hardware and the current communication traffic, so we should rely on a model of the network performance to estimate the message delivery time. Our approach to the construction of such a model is based on observing message delivery times over various message sizes and time scales. We record these observations in a database and use them to build a model of the message delivery time. Our experiments show the presence of multiple bands in the logarithm of the message delivery times. These bands represent the multiple paths messages travel between the grid machines and are incorporated in our multiband model.
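    One simple way to recover such bands from observations is to fit a mixture model to the logarithm of the delivery times and pick the number of components by an information criterion, as in the hedged sketch below. The synthetic two-path data and the use of scikit-learn's GaussianMixture are assumptions made for the illustration; the paper does not prescribe this particular estimator.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic log delivery times with two "bands" (e.g. two network paths); illustrative only.
fast = rng.normal(loc=np.log(0.05), scale=0.1, size=400)   # ~50 ms path
slow = rng.normal(loc=np.log(0.20), scale=0.1, size=200)   # ~200 ms path
log_times = np.concatenate([fast, slow]).reshape(-1, 1)

# Choose the number of bands by BIC and report each band's mean delivery time.
best = min((GaussianMixture(n_components=k, random_state=0).fit(log_times)
            for k in range(1, 5)), key=lambda m: m.bic(log_times))
print("bands:", best.n_components)
print("band means (s):", np.exp(best.means_.ravel()))
```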

  1. Wafer hotspot prevention using etch aware OPC correction

    NASA Astrophysics Data System (ADS)

    Hamouda, Ayman; Power, Dave; Salama, Mohamed; Chen, Ao

    2016-03-01

    As technology development advances into deep-sub-wavelength nodes, multiple patterning is becoming more essential to achieve the technology shrink requirements. Recently, Optical Proximity Correction (OPC) technology has proposed simultaneous correction of multiple mask patterns to enable multiple-patterning awareness during OPC correction. This is essential to prevent inter-layer hot-spots during the final pattern transfer. In the state-of-the-art literature, multi-layer awareness is achieved using simultaneous resist-contour simulations to predict and correct for hot-spots during mask generation. However, this approach assumes a uniform etch shrink response for all patterns independent of their proximity, which is not sufficient for full prevention of inter-exposure hot-spots, for example different-color space violations or via coverage/enclosure violations post etch. In this paper, we explain the need to include the etch component during multiple-patterning OPC. We also introduce a novel approach for etch-aware simultaneous multiple-patterning OPC, where we calibrate and verify a lumped model that includes the combined resist and etch responses. Adding this extra simulation condition during OPC is suitable for full-chip processing from a computation intensity point of view. Also, using this model during OPC to predict and correct inter-exposure hot-spots is similar to previously proposed multiple-patterning OPC, yet our proposed approach more accurately corrects post-etch defects as well.

  2. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
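    The overall recipe (simulate data from an assumed mediation model, test the indirect effect with a percentile bootstrap, and count the proportion of significant replications) can be sketched in a few lines. The Python code below is a minimal illustration with a zero direct effect and arbitrary path coefficients and sample sizes; it does not reproduce the interface or features of the bmem package.

```python
import numpy as np

def indirect_effect(x, m, y):
    """OLS estimate of a*b: a from m ~ x, b from y ~ m + x."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), m, x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a * coef[1]

def mediation_power(a=0.3, b=0.3, n=100, n_sims=300, n_boot=500,
                    alpha=0.05, seed=0):
    """Monte Carlo power of the percentile-bootstrap test of the indirect effect."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)          # no direct x -> y path, for brevity
        idx = rng.integers(0, n, size=(n_boot, n))
        boot = np.array([indirect_effect(x[i], m[i], y[i]) for i in idx])
        lo, hi = np.percentile(boot, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        hits += (lo > 0) or (hi < 0)
    return hits / n_sims

print(mediation_power())
```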

  3. XML Encoding of Features Describing Rule-Based Modeling of Reaction Networks with Multi-Component Molecular Complexes

    PubMed Central

    Blinov, Michael L.; Moraru, Ion I.

    2011-01-01

    Multi-state molecules and multi-component complexes are commonly involved in cellular signaling. Accounting for molecules that have multiple potential states, such as a protein that may be phosphorylated on multiple residues, and molecules that combine to form heterogeneous complexes located among multiple compartments, generates an effect of combinatorial complexity. Models involving relatively few signaling molecules can include thousands of distinct chemical species. Several software tools (StochSim, BioNetGen) are already available to deal with combinatorial complexity. Such tools need information standards if models are to be shared, jointly evaluated and developed. Here we discuss XML conventions that can be adopted for modeling biochemical reaction networks described by user-specified reaction rules. These could form a basis for possible future extensions of the Systems Biology Markup Language (SBML). PMID:21464833

  4. Quality of life among people with multiple sclerosis: Replication of a three-factor prediction model.

    PubMed

    Bishop, Malachy; Rumrill, Phillip D; Roessler, Richard T

    2015-01-01

    This article presents a replication of Rumrill, Roessler, and Fitzgerald's 2004 analysis of a three-factor model of the impact of multiple sclerosis (MS) on quality of life (QOL). The three factors in the original model included illness-related, employment-related, and psychosocial adjustment factors. To test hypothesized relationships between QOL and illness-related, employment-related, and psychosocial variables using data from a survey of the employment concerns of Americans with MS (N = 1,839). An ex post facto, multiple correlational design was employed incorporating correlational and multiple regression analyses. QOL was positively related to educational level, employment status, job satisfaction, and job-match, and negatively related to number of symptoms, severity of symptoms, and perceived stress level. The three-factor model explained approximately 37 percent of the variance in QOL scores. The results of this replication confirm the continuing value of the three-factor model for predicting the QOL of adults with MS, and demonstrate the importance of medical, mental health, and vocational rehabilitation interventions and services in promoting QOL.

  5. In vivo diagnosis of skin cancer using polarized and multiple scattered light spectroscopy

    NASA Astrophysics Data System (ADS)

    Bartlett, Matthew Allen

    This thesis research presents the development of a non-invasive diagnostic technique for distinguishing between skin cancer, moles, and normal skin using polarized and multiple scattered light spectroscopy. Polarized light incident on the skin is single scattered by the epidermal layer and multiple scattered by the dermal layer. The epidermal light maintains its initial polarization while the light from the dermal layer becomes randomized and multiple scattered. Mie theory was used to model the epidermal light as the scattering from the intercellular organelles. The dermal signal was modeled as the diffusion of light through a localized semi-homogeneous volume. These models were confirmed using skin phantom experiments, studied with in vitro cell cultures, and applied to human skin for in vivo testing. A CCD-based spectroscopy system was developed to perform all these experiments. The probe and the theory were tested on skin phantoms of latex spheres on top of a solid phantom. We next extended our phantom study to include in vitro cells on top of the solid phantom. Optical fluorescent microscope images revealed at least four distinct scatterers including mitochondria, nucleoli, nuclei, and cell membranes. Single scattering measurements on the mammalian cells consistently produced PSD's in the size range of the mitochondria. The clinical portion of the study consisted of in vivo measurements on cancer, mole, and normal skin spots. The clinical study combined the single scattering model from the phantom and in vitro cell studies with the diffusion model for multiple scattered light. When parameters from both layers were combined, we found that a sensitivity of 100% and 77% can be obtained for detecting cancers and moles, respectively, given the number of lesions examined.

  6. Near real-time traffic routing

    NASA Technical Reports Server (NTRS)

    Yang, Chaowei (Inventor); Xie, Jibo (Inventor); Zhou, Bin (Inventor); Cao, Ying (Inventor)

    2012-01-01

    A near real-time physical transportation network routing system comprising: a traffic simulation computing grid and a dynamic traffic routing service computing grid. The traffic simulator produces traffic network travel time predictions for a physical transportation network using a traffic simulation model and common input data. The physical transportation network is divided into multiple sections. Each section has a primary zone and a buffer zone. The traffic simulation computing grid includes multiple traffic simulation computing nodes. The common input data includes static network characteristics, an origin-destination data table, dynamic traffic information data and historical traffic data. The dynamic traffic routing service computing grid includes multiple dynamic traffic routing computing nodes and generates traffic route(s) using the traffic network travel time predictions.

  7. CHALLENGES OF PROCESSING BIOLOGICAL DATA FOR INCORPORATION INTO A LAKE EUTROPHICATION MODEL

    EPA Science Inventory

    A eutrophication model is in development as part of the Lake Michigan Mass Balance Project (LMMBP). Successful development and calibration of this model required the processing and incorporation of extensive biological data. Data were drawn from multiple sources, including nutrie...

  8. Project management tool

    NASA Technical Reports Server (NTRS)

    Maluf, David A. (Inventor); Bell, David G. (Inventor); Gurram, Mohana M. (Inventor); Gawdiak, Yuri O. (Inventor)

    2009-01-01

    A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as a monthly report, a task plan report, a budget report and a risk management report, are generated and made available for display or further analysis. An extensible database allows searching for information based upon context and upon content.

  9. Reducing the net torque and flow ripple effects of multiple hydraulic piston motor drives

    NASA Technical Reports Server (NTRS)

    Bartos, R. D.

    1992-01-01

    The torque and flow ripple effects which result when multiple hydraulic motors are used to drive a single motion of a mechanical device can significantly affect the way in which the device performs. This article presents a mathematical model describing the torque and flow ripple effects of a bent-axis hydraulic piston motor. The model is used to show how the ripple magnitude can be reduced when multiple motors are used to drive a motion. A discussion of the hydraulic servo system of the 70-m antennas located within the Deep Space Network is included to demonstrate the application of the concepts presented.

  10. Airport Noise Prediction Model -- MOD 7

    DOT National Transportation Integrated Search

    1978-07-01

    The MOD 7 Airport Noise Prediction Model is fully operational. The language used is Fortran, and it has been run on several different computer systems. Its capabilities include prediction of noise levels for single parameter changes, for multiple cha...

  11. Theoretical models for application in school health education research.

    PubMed

    Parcel, G S

    1984-01-01

    Theoretical models that may be useful to research studies in school health education are reviewed. Selected, well-defined theories include social learning theory, problem-behavior theory, the theory of reasoned action, communications theory, coping theory, social competence, and social and family theories. Also reviewed are multiple-theory models, including models of health-related behavior, the PRECEDE Framework, social-psychological approaches, and the Activated Health Education Model. Two major reviews of teaching models are also discussed. The paper concludes with a brief outline of the general applications of theory to the field of school health education, including applications to basic research, development and design of interventions, program evaluation, and program utilization.

  12. A sample theory-based logic model to improve program development, implementation, and sustainability of Farm to School programs.

    PubMed

    Ratcliffe, Michelle M

    2012-08-01

    Farm to School programs hold promise to address childhood obesity. These programs may increase students’ access to healthier foods, increase students’ knowledge of and desire to eat these foods, and increase their consumption of them. Implementing Farm to School programs requires the involvement of multiple people, including nutrition services, educators, and food producers. Because these groups have not traditionally worked together and each has different goals, it is important to demonstrate how Farm to School programs that are designed to decrease childhood obesity may also address others’ objectives, such as academic achievement and economic development. A logic model is an effective tool to help articulate a shared vision for how Farm to School programs may work to accomplish multiple goals. Furthermore, there is evidence that programs based on theory are more likely to be effective at changing individuals’ behaviors. Logic models based on theory may help to explain how a program works, aid in efficient and sustained implementation, and support the development of a coherent evaluation plan. This article presents a sample theory-based logic model for Farm to School programs. The presented logic model is informed by the polytheoretical model for food and garden-based education in school settings (PMFGBE). The logic model has been applied to multiple settings, including Farm to School program development and evaluation in urban and rural school districts. This article also includes a brief discussion on the development of the PMFGBE, a detailed explanation of how Farm to School programs may enhance the curricular, physical, and social learning environments of schools, and suggestions for the applicability of the logic model for practitioners, researchers, and policy makers.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, Bipasha; Davies, C. T. H.; de Oliveira, P. G.

    We determine the contribution to the anomalous magnetic moment of the muon from the $\alpha^2_{\mathrm{QED}}$ hadronic vacuum polarization diagram using full lattice QCD and including $u/d$ quarks with physical masses for the first time. We use gluon field configurations that include $u$, $d$, $s$ and $c$ quarks in the sea at multiple values of the lattice spacing, multiple $u/d$ masses and multiple volumes that allow us to include an analysis of finite-volume effects. We obtain a result for $a_{\mu}^{\mathrm{HVP,LO}}$ of $667(6)(12)$, where the first error is from the lattice calculation and the second includes systematic errors from missing QED and isospin-breaking effects and from quark-line disconnected diagrams. Our result implies a discrepancy between the experimental determination of $a_{\mu}$ and the Standard Model of 3$\sigma$.

  14. Variety Preserved Instance Weighting and Prototype Selection for Probabilistic Multiple Scope Simulations

    DTIC Science & Technology

    2017-05-30

    ...including analysis, control, and management of the systems across their multiple scopes. These difficulties will become more significant in the near future. ...behaviors of the systems, it tends to cover their many scopes. Accordingly, we may obtain better models for the simulations in a data-driven manner ...to capture the variety of the instance distribution in a given data set for covering multiple scopes of our objective system in a seamless manner.

  15. Approximation of reliabilities for multiple-trait model with maternal effects.

    PubMed

    Strabel, T; Misztal, I; Bertrand, J K

    2001-04-01

    Reliabilities for a multiple-trait maternal model were obtained by combining reliabilities obtained from single-trait models. Single-trait reliabilities were obtained using an approximation that supported models with additive and permanent environmental effects. For the direct effect, the maternal and permanent environmental variances were assigned to the residual. For the maternal effect, variance of the direct effect was assigned to the residual. Data included 10,550 birth weight, 11,819 weaning weight, and 3,617 postweaning gain records of Senepol cattle. Reliabilities were obtained by generalized inversion and by using single-trait and multiple-trait approximation methods. Some reliabilities obtained by inversion were negative because inbreeding was ignored in calculating the inverse of the relationship matrix. The multiple-trait approximation method reduced the bias of approximation when compared with the single-trait method. The correlations between reliabilities obtained by inversion and by multiple-trait procedures for the direct effect were 0.85 for birth weight, 0.94 for weaning weight, and 0.96 for postweaning gain. Correlations for maternal effects for birth weight and weaning weight were 0.96 to 0.98 for both approximations. Further improvements can be achieved by refining the single-trait procedures.

  16. An Effect Size for Regression Predictors in Meta-Analysis

    ERIC Educational Resources Information Center

    Aloe, Ariel M.; Becker, Betsy Jane

    2012-01-01

    A new effect size representing the predictive power of an independent variable from a multiple regression model is presented. The index, denoted r_sp, is the semipartial correlation of the predictor with the outcome of interest. This effect size can be computed when multiple predictor variables are included in the regression model…
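    As a concrete illustration of the index, the sketch below computes the semipartial correlation of one predictor in two equivalent ways: from the predictor's t statistic and the model R² (the quantities typically reported in primary studies), and directly by correlating the outcome with the predictor residualized on the other predictors. The data are synthetic and the two-predictor setup is an assumption made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)          # correlated predictors
y = 0.4 * x1 + 0.3 * x2 + rng.normal(size=n)

# Full model fit (intercept + x1 + x2) by least squares.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
df_res = n - X.shape[1]                     # n - k - 1 with k = 2 predictors
sigma2 = resid @ resid / df_res
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X).diagonal())
t1 = beta[1] / se[1]                        # t statistic for x1
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Semipartial correlation of x1, two equivalent routes:
# (1) from the reported t statistic and model R^2 (what a meta-analyst usually has),
r_sp_from_t = t1 * np.sqrt(1 - r2) / np.sqrt(df_res)
# (2) by correlating y with x1 residualized on the other predictor(s).
Z = np.column_stack([np.ones(n), x2])
e1 = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
r_sp_direct = np.corrcoef(y, e1)[0, 1]
print(r_sp_from_t, r_sp_direct)
```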

  17. Neuropsychological Predictors of Math Calculation and Reasoning in School-Aged Children

    ERIC Educational Resources Information Center

    Schneider, Dana Lynn

    2012-01-01

    After multiple reviews of the literature, which documented that multiple cognitive processes may be involved in mathematics ability and disability, Geary (1993) proposed a model that included three subtypes of math disability: Semantic, Procedural, and Visuospatial. A review of the extant literature produced three studies that examined Geary's…

  18. Women Confronting the Reality of Multiple Sclerosis: A Qualitative Model of Self-Healing

    ERIC Educational Resources Information Center

    Romagosa, Carol J.

    2010-01-01

    Multiple sclerosis (MS) is a chronic debilitating disease that has an uncertain course. Although uncertainty is a universal experience in chronic illness, uncertainty in MS is especially threatening to psychological well-being. Chronic illness, including conditions of disability, is one of our greatest health care problems as society ages. Never…

  19. Trust Discovery in Online Communities

    ERIC Educational Resources Information Center

    Piorkowski, John

    2014-01-01

    This research aims to discover interpersonal trust in online communities. Two novel trust models are built to explain interpersonal trust in online communities drawing theories and models from multiple relevant areas, including organizational trust models, trust in virtual settings, speech act theory, identity theory, and common bond theory. In…

  20. Bayesian meta-analytical methods to incorporate multiple surrogate endpoints in drug development process.

    PubMed

    Bujkiewicz, Sylwia; Thompson, John R; Riley, Richard D; Abrams, Keith R

    2016-03-30

    A number of meta-analytical methods have been proposed that aim to evaluate surrogate endpoints. Bivariate meta-analytical methods can be used to predict the treatment effect for the final outcome from the treatment effect estimate measured on the surrogate endpoint while taking into account the uncertainty around the effect estimate for the surrogate endpoint. In this paper, extensions to multivariate models are developed aiming to include multiple surrogate endpoints with the potential benefit of reducing the uncertainty when making predictions. In this Bayesian multivariate meta-analytic framework, the between-study variability is modelled in a formulation of a product of normal univariate distributions. This formulation is particularly convenient for including multiple surrogate endpoints and flexible for modelling the outcomes which can be surrogate endpoints to the final outcome and potentially to one another. Two models are proposed, first, using an unstructured between-study covariance matrix by assuming the treatment effects on all outcomes are correlated and second, using a structured between-study covariance matrix by assuming treatment effects on some of the outcomes are conditionally independent. While the two models are developed for the summary data on a study level, the individual-level association is taken into account by the use of the Prentice's criteria (obtained from individual patient data) to inform the within study correlations in the models. The modelling techniques are investigated using an example in relapsing remitting multiple sclerosis where the disability worsening is the final outcome, while relapse rate and MRI lesions are potential surrogates to the disability progression. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  1. Stochastic nature of Landsat MSS data

    NASA Technical Reports Server (NTRS)

    Labovitz, M. L.; Masuoka, E. J.

    1987-01-01

    A multiple series generalization of the ARIMA models is used to model Landsat MSS scan lines as sequences of vectors, each vector having four elements (bands). The purpose of this work is to investigate whether Landsat scan lines can be described by a general multiple series linear stochastic model and whether the coefficients of such a model vary as a function of satellite system and target attributes. To accomplish this objective, an exploratory experimental design was set up incorporating six factors, four representing target attributes - location, cloud cover, row (within location), and column (within location) - and two factors representing system attributes - satellite number and detector bank. Each factor was included in the design at two levels and, with two replicates per treatment, 128 scan lines were analyzed. The results of the analysis suggest that a multiple AR(4) model is an adequate representation across all scan lines. Furthermore, the coefficients of the AR(4) model vary with location, particularly changes in physiography (slope regimes), and with percent cloud cover, but are insensitive to changes in system attributes.
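    A multiple (vector) AR(4) model of this kind can be fit by ordinary least squares once the lagged band vectors are stacked into a design matrix. The Python sketch below does this for synthetic four-band data standing in for a scan line; the data generation and lag handling are illustrative assumptions, not the authors' estimation procedure.

```python
import numpy as np

def fit_var(y, p=4):
    """Least-squares fit of a vector AR(p) model y_t = c + sum_k A_k y_{t-k} + e_t.

    y: (T, d) array of d-band pixel vectors along a scan line.
    Returns the intercept c (d,) and coefficient matrices A (p, d, d).
    """
    T, d = y.shape
    X = np.column_stack([np.ones(T - p)] +
                        [y[p - k - 1:T - k - 1] for k in range(p)])  # lags 1..p
    B, *_ = np.linalg.lstsq(X, y[p:], rcond=None)   # shape (1 + p*d, d)
    c = B[0]
    A = B[1:].reshape(p, d, d).transpose(0, 2, 1)   # A[k][i, j]: effect of band j at lag k+1 on band i
    return c, A

# Synthetic 4-band "scan line" standing in for Landsat MSS data (illustrative).
rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=(3000, 4)), axis=0) * 0.01 + rng.normal(size=(3000, 4))
c, A = fit_var(y, p=4)
print(c.shape, A.shape)
```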

  2. Modeling Smoke Plume-Rise and Dispersion from Southern United States Prescribed Burns with Daysmoke

    Treesearch

    G L Achtemeier; S L Goodrick; Y Liu; F Garcia-Menendez; Y Hu; M. Odman

    2011-01-01

    We present Daysmoke, an empirical-statistical plume rise and dispersion model for simulating smoke from prescribed burns. Prescribed fires are characterized by complex plume structure including multiple-core updrafts which makes modeling with simple plume models difficult. Daysmoke accounts for plume structure in a three-dimensional veering/sheering atmospheric...

  3. A comparison study of one-and two-dimensional hydraulic models for river environments : technical summary.

    DOT National Transportation Integrated Search

    2017-05-01

    Findings in this report include: differences in the flow divisions for multiple opening bridges for all three models, less subjectivity in the construction of the 2D models than for the 1D, differences in the sensitivity of each 2D model to the M...

  4. A new adaptive multiple modelling approach for non-linear and non-stationary systems

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Gong, Yu; Hong, Xia

    2016-07-01

    This paper proposes a novel adaptive multiple modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models which are all linear. With data available in an online fashion, the performance of all candidate sub-models is monitored based on the most recent data window, and the M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-models are adapted via the recursive least squares (RLS) algorithm, while the coefficients of the remaining sub-models are unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error based on a recent data window, and apply the sum-to-one constraint to the combination parameters, leading to a closed-form solution, so that maximal computational efficiency can be achieved. In addition, at each time step, the model prediction is chosen from either the resultant multiple model or the best sub-model, whichever is better. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and time consumption.
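    The main ingredients (a bank of linear sub-models adapted by RLS, selection of the M best over a recent window, and a sum-to-one minimum-error combination with a closed-form solution) can be sketched as follows. The choice of four sub-models distinguished only by their forgetting factors, the window length, and the toy switching system are assumptions for the illustration, not the configuration studied in the paper.

```python
import numpy as np

class RLS:
    """Recursive least squares with forgetting factor lam."""
    def __init__(self, dim, lam=0.98, delta=100.0):
        self.w = np.zeros(dim)
        self.P = delta * np.eye(dim)
        self.lam = lam

    def predict(self, x):
        return self.w @ x

    def update(self, x, y):
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)
        self.w = self.w + k * (y - self.w @ x)
        self.P = (self.P - np.outer(k, Px)) / self.lam

def combine_weights(err_window):
    """Sum-to-one combination weights minimising recent squared error (closed form)."""
    R = err_window.T @ err_window + 1e-8 * np.eye(err_window.shape[1])
    u = np.linalg.solve(R, np.ones(err_window.shape[1]))
    return u / u.sum()

rng = np.random.default_rng(0)
T, dim, M, win = 600, 3, 2, 30
models = [RLS(dim, lam=l) for l in (0.90, 0.95, 0.98, 0.999)]   # K = 4 candidates
errors = np.zeros((T, len(models)))
for t in range(T):
    x = rng.normal(size=dim)
    w_true = np.array([1.0, -2.0, 0.5]) if t < T // 2 else np.array([-1.0, 0.5, 2.0])
    y = w_true @ x + 0.1 * rng.normal()                 # abrupt change: non-stationary system
    preds = np.array([m.predict(x) for m in models])
    errors[t] = y - preds
    selected = range(len(models))
    if t >= win:
        recent = errors[t - win:t]
        selected = np.argsort((recent ** 2).mean(axis=0))[:M]   # M best sub-models
        y_hat = combine_weights(recent[:, selected]) @ preds[selected]  # multi-model output
    for i in selected:                                  # adapt only the selected sub-models
        models[i].update(x, y)
```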

  5. Multiple lobes in the far-field distribution of terahertz quantum-cascade lasers due to self-interference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Röben, B., E-mail: roeben@pdi-berlin.de; Wienold, M.; Schrottke, L.

    2016-06-15

    The far-field distribution of the emission intensity of terahertz (THz) quantum-cascade lasers (QCLs) frequently exhibits multiple lobes instead of a single-lobed Gaussian distribution. We show that such multiple lobes can result from self-interference related to the typically large beam divergence of THz QCLs and the presence of an inevitable cryogenic operation environment including optical windows. We develop a quantitative model to reproduce the multiple lobes. We also demonstrate how a single-lobed far-field distribution can be achieved.

  6. Network meta-analysis of multiple outcome measures accounting for borrowing of information across outcomes.

    PubMed

    Achana, Felix A; Cooper, Nicola J; Bujkiewicz, Sylwia; Hubbard, Stephanie J; Kendrick, Denise; Jones, David R; Sutton, Alex J

    2014-07-21

    Network meta-analysis (NMA) enables simultaneous comparison of multiple treatments while preserving randomisation. When summarising evidence to inform an economic evaluation, it is important that the analysis accurately reflects the dependency structure within the data, as correlations between outcomes may have implications for estimating the net benefit associated with treatment. A multivariate NMA offers a framework for evaluating multiple treatments across multiple outcome measures while accounting for the correlation structure between outcomes. The standard NMA model is extended to multiple outcome settings in two stages. In the first stage, information is borrowed across outcomes as well as across studies through modelling the within-study and between-study correlation structure. In the second stage, we make use of the additional assumption that intervention effects are exchangeable between outcomes to predict effect estimates for all outcomes, including effect estimates on outcomes where evidence is either sparse or the treatment had not been considered by any one of the studies included in the analysis. We apply the methods to binary outcome data from a systematic review evaluating the effectiveness of nine home safety interventions on uptake of three poisoning prevention practices (safe storage of medicines, safe storage of other household products, and possession of a poison control centre telephone number) in households with children. Analyses are conducted in WinBUGS using Markov chain Monte Carlo (MCMC) simulations. Univariate and the first-stage multivariate models produced broadly similar point estimates of intervention effects, but the uncertainty around the multivariate estimates varied depending on the prior distribution specified for the between-study covariance structure. The second-stage multivariate analyses produced more precise effect estimates while enabling intervention effects to be predicted for all outcomes, including intervention effects on outcomes not directly considered by the studies included in the analysis. Accounting for the dependency between outcomes in a multivariate meta-analysis may or may not improve the precision of effect estimates from a network meta-analysis compared to analysing each outcome separately.

  7. The stochastic control of the F-8C aircraft using the Multiple Model Adaptive Control (MMAC) method

    NASA Technical Reports Server (NTRS)

    Athans, M.; Dunn, K. P.; Greene, E. S.; Lee, W. H.; Sandel, N. R., Jr.

    1975-01-01

    The purpose of this paper is to summarize results obtained for the adaptive control of the F-8C aircraft using the so-called Multiple Model Adaptive Control method. The discussion includes the selection of the performance criteria for both the lateral and the longitudinal dynamics, the design of the Kalman filters for different flight conditions, the 'identification' aspects of the design using hypothesis testing ideas, and the performance of the closed loop adaptive system.

  8. Rope Hadronization and Strange Particle Production

    NASA Astrophysics Data System (ADS)

    Bierlich, Christian

    2018-02-01

    Rope Hadronization is a model extending the Lund string hadronization model to describe environments with many overlapping strings, such as high-multiplicity pp collisions or AA collisions. Including the effects of Rope Hadronization drastically improves the description of strange/non-strange hadron ratios as a function of event multiplicity in all systems from e+e- to AA. The implementation of Rope Hadronization in the MC event generators Dipsy and PYTHIA8 is discussed, as well as future prospects for jet studies and studies of small systems.

  9. Genomic-based multiple-trait evaluation in Eucalyptus grandis using dominant DArT markers.

    PubMed

    Cappa, Eduardo P; El-Kassaby, Yousry A; Muñoz, Facundo; Garcia, Martín N; Villalba, Pamela V; Klápště, Jaroslav; Marcucci Poltri, Susana N

    2018-06-01

    We investigated the impact of combining the pedigree- and genomic-based relationship matrices in a multiple-trait individual-tree mixed model (a.k.a., multiple-trait combined approach) on the estimates of heritability and on the genomic correlations between growth and stem straightness in an open-pollinated Eucalyptus grandis population. Additionally, the added advantage of incorporating genomic information on the theoretical accuracies of parents and offspring breeding values was evaluated. Our results suggested that the use of the combined approach for estimating heritabilities and additive genetic correlations in multiple-trait evaluations is advantageous and including genomic information increases the expected accuracy of breeding values. Furthermore, the multiple-trait combined approach was proven to be superior to the single-trait combined approach in predicting breeding values, in particular for low-heritability traits. Finally, our results advocate the use of the combined approach in forest tree progeny testing trials, specifically when a multiple-trait individual-tree mixed model is considered. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Step wise, multiple objective calibration of a hydrologic model for a snowmelt dominated basin

    USGS Publications Warehouse

    Hay, L.E.; Leavesley, G.H.; Clark, M.P.; Markstrom, S.L.; Viger, R.J.; Umemoto, M.

    2006-01-01

    The ability to apply a hydrologic model to large numbers of basins for forecasting purposes requires a quick and effective calibration strategy. This paper presents a step-wise, multiple-objective, automated procedure for hydrologic model calibration. This procedure includes the sequential calibration of a model's simulation of solar radiation (SR), potential evapotranspiration (PET), water balance, and daily runoff. The procedure uses the Shuffled Complex Evolution global search algorithm to calibrate the U.S. Geological Survey's Precipitation Runoff Modeling System in the Yampa River basin of Colorado. This process assures that the intermediate states of the model (SR and PET on a monthly mean basis), as well as the water balance and components of the daily hydrograph, are simulated consistently with measured values.
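    The step-wise idea (calibrate one group of parameters against its own intermediate target, freeze it, then move to the next objective) can be illustrated with a toy two-step calibration. The sketch below uses scipy's differential_evolution as a stand-in global optimizer because the Shuffled Complex Evolution algorithm is not part of scipy; the toy model, synthetic observations, and parameter bounds are assumptions for the example, not the PRMS setup used in the study.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

# Toy "hydrologic" model: PET depends on (a, b); runoff depends on PET and (c,).
forcing = rng.uniform(0.0, 1.0, size=365)
true = dict(a=0.7, b=1.5, c=0.4)
obs_pet = true["a"] * forcing ** true["b"] + 0.01 * rng.normal(size=365)
obs_q = true["c"] * obs_pet + 0.01 * rng.normal(size=365)

def rmse(sim, obs):
    return np.sqrt(np.mean((sim - obs) ** 2))

# Step 1: calibrate the PET parameters against the PET "observations" only.
step1 = differential_evolution(
    lambda p: rmse(p[0] * forcing ** p[1], obs_pet), bounds=[(0.1, 2.0), (0.5, 3.0)])
a, b = step1.x

# Step 2: hold (a, b) fixed and calibrate the runoff parameter against observed runoff.
step2 = differential_evolution(
    lambda p: rmse(p[0] * (a * forcing ** b), obs_q), bounds=[(0.1, 2.0)])
print(step1.x, step2.x)
```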

  11. Cognitive Phenotype of Velocardiofacial Syndrome: A Review

    ERIC Educational Resources Information Center

    Furniss, Frederick; Biswas, Asit B.; Gumber, Rohit; Singh, Niraj

    2011-01-01

    The behavioural phenotype of velocardiofacial syndrome (VCFS), one of the most common human multiple anomaly syndromes, includes developmental disabilities, frequently including intellectual disability (ID) and high risk of diagnosis of psychotic disorders including schizophrenia. VCFS may offer a model of the relationship between ID and risk of…

  12. Cost-Effectiveness of POC Coagulation Testing Using Multiple Electrode Aggregometry.

    PubMed

    Straub, Niels; Bauer, Ekaterina; Agarwal, Seema; Meybohm, Patrick; Zacharowski, Kai; Hanke, Alexander A; Weber, Christian F

    2016-01-01

    The economic effects of Point-of-Care (POC) coagulation testing including Multiple Electrode Aggregometry (MEA) with the Multiplate device have not been examined. A health economic model with associated clinical endpoints was developed to calculate the effectiveness and estimated costs of coagulation analyses based on standard laboratory testing (SLT) or POC testing offering the possibility to assess platelet dysfunction using aggregometric measures. Cost estimates included pre- and perioperative costs of hemotherapy, intra- and post-operative coagulation testing costs, and hospitalization costs, including the costs of transfusion-related complications. Our model calculation using a simulated true-to-life cohort of 10,000 cardiac surgery patients assigned to each testing alternative demonstrated that there were 950 fewer patients in the POC branch who required any transfusion of red blood cells. The subsequent numbers of massive transfusions and patients with transfusion-related complications were reduced with the POC testing by 284 and 126, respectively. The average expected total cost in the POC branch was 288 Euro lower for every treated patient than that in the SLT branch. Incorporating aggregometric analyses using MEA into hemotherapy algorithms improved medical outcomes in cardiac surgery patients in the presented health economic model. There was an overall better economic outcome associated with POC testing compared with SLT testing despite the higher costs of testing.

  13. Venus Global Reference Atmospheric Model

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.

    2017-01-01

    Venus Global Reference Atmospheric Model (Venus-GRAM) is an engineering-level atmospheric model developed by MSFC that is widely used for diverse mission applications, including systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. It is not a forecast model. Outputs include density, temperature, pressure, wind components, and chemical composition, and the model provides dispersions of thermodynamic parameters, winds, and density. Optional trajectory and auxiliary profile input files are supported. Venus-GRAM has been used in multiple studies and proposals, including NASA Engineering and Safety Center (NESC) Autonomous Aerobraking and various Discovery proposals. Released in 2005, it is available at: https://software.nasa.gov/software/MFS-32314-1.

  14. A constitutive model for the forces of a magnetic bearing including eddy currents

    NASA Technical Reports Server (NTRS)

    Taylor, D. L.; Hebbale, K. V.

    1993-01-01

    A multiple magnet bearing can be developed from N individual electromagnets. The constitutive relationships for a single magnet in such a bearing are presented. Analytical expressions are developed for a magnet with poles arranged circumferentially. Maxwell's field equations are used, so the model easily includes the effects of induced eddy currents due to the rotation of the journal. Eddy currents must be included in any dynamic model because they are the only speed-dependent parameter and may lead to a critical speed for the bearing. The model is applicable to bearings using attraction or repulsion.

  15. AgMIP: Next Generation Models and Assessments

    NASA Astrophysics Data System (ADS)

    Rosenzweig, C.

    2014-12-01

    Next steps in developing next-generation crop models fall into several categories: significant improvements in simulation of important crop processes and responses to stress; extension from simplified crop models to complex cropping systems models; and scaling up from site-based models to landscape, national, continental, and global scales. Crop processes that require major leaps in understanding and simulation in order to narrow uncertainties around how crops will respond to changing atmospheric conditions include genetics; carbon, temperature, water, and nitrogen; ozone; and nutrition. The field of crop modeling has been built on a single crop-by-crop approach. It is now time to create a new paradigm, moving from 'crop' to 'cropping system.' A first step is to set up the simulation technology so that modelers can rapidly incorporate multiple crops within fields, and multiple crops over time. Then the response of these more complex cropping systems can be tested under different sustainable intensification management strategies utilizing the updated simulation environments. Model improvements for diseases, pests, and weeds include developing process-based models for important diseases, frameworks for coupling air-borne diseases to crop models, gathering significantly more data on crop impacts, and enabling the evaluation of pest management strategies. Most smallholder farming in the world involves integrated crop-livestock systems that cannot be represented by crop modeling alone. Thus, next-generation cropping system models need to include key linkages to livestock. Livestock linkages to be incorporated include growth and productivity models for grasslands and rangelands as well as the usual annual crops. There are several approaches for scaling up, including use of gridded models and development of simpler quasi-empirical models for landscape-scale analysis. On the assessment side, AgMIP is leading a community process for coordinated contributions to IPCC AR6 that involves the key modeling groups from around the world including North America, Europe, South America, Sub-Saharan Africa, South Asia, East Asia, and Australia and Oceania. This community process will lead to mutually agreed protocols for coordinated global and regional assessments.

  16. Online Testing: The Dog Sat on My Keyboard.

    ERIC Educational Resources Information Center

    White, Jacci

    This paper will highlight some advantages and disadvantages of several online models for student assessment. These models will include: live exams, multiple choice tests, essay exams, and student projects. In addition, real student responses and "problems" will be used as prompts to improve models of authentic online assessment in mathematics.…

  17. Multiplicity Control in Structural Equation Modeling: Incorporating Parameter Dependencies

    ERIC Educational Resources Information Center

    Smith, Carrie E.; Cribbie, Robert A.

    2013-01-01

    When structural equation modeling (SEM) analyses are conducted, significance tests for all important model relationships (parameters including factor loadings, covariances, etc.) are typically conducted at a specified nominal Type I error rate ([alpha]). Despite the fact that many significance tests are often conducted in SEM, rarely is…

  18. The Performance of IRT Model Selection Methods with Mixed-Format Tests

    ERIC Educational Resources Information Center

    Whittaker, Tiffany A.; Chang, Wanchen; Dodd, Barbara G.

    2012-01-01

    When tests consist of multiple-choice and constructed-response items, researchers are confronted with the question of which item response theory (IRT) model combination will appropriately represent the data collected from these mixed-format tests. This simulation study examined the performance of six model selection criteria, including the…

  19. Multiple memory systems as substrates for multiple decision systems

    PubMed Central

    Doll, Bradley B.; Shohamy, Daphna; Daw, Nathaniel D.

    2014-01-01

    It has recently become widely appreciated that value-based decision making is supported by multiple computational strategies. In particular, animal and human behavior in learning tasks appears to include habitual responses described by prominent model-free reinforcement learning (RL) theories, but also more deliberative or goal-directed actions that can be characterized by a different class of theories, model-based RL. The latter theories evaluate actions by using a representation of the contingencies of the task (as with a learned map of a spatial maze), called an “internal model.” Given the evidence of behavioral and neural dissociations between these approaches, they are often characterized as dissociable learning systems, though they likely interact and share common mechanisms. In many respects, this division parallels a longstanding dissociation in cognitive neuroscience between multiple memory systems, describing, at the broadest level, separate systems for declarative and procedural learning. Procedural learning has notable parallels with model-free RL: both involve learning of habits and both are known to depend on parts of the striatum. Declarative memory, by contrast, supports memory for single events or episodes and depends on the hippocampus. The hippocampus is thought to support declarative memory by encoding temporal and spatial relations among stimuli and thus is often referred to as a relational memory system. Such relational encoding is likely to play an important role in learning an internal model, the representation that is central to model-based RL. Thus, insofar as the memory systems represent more general-purpose cognitive mechanisms that might subserve performance on many sorts of tasks including decision making, these parallels raise the question whether the multiple decision systems are served by multiple memory systems, such that one dissociation is grounded in the other. Here we investigated the relationship between model-based RL and relational memory by comparing individual differences across behavioral tasks designed to measure either capacity. Human subjects performed two tasks, a learning and generalization task (acquired equivalence) which involves relational encoding and depends on the hippocampus; and a sequential RL task that could be solved by either a model-based or model-free strategy. We assessed the correlation between subjects’ use of flexible, relational memory, as measured by generalization in the acquired equivalence task, and their differential reliance on either RL strategy in the decision task. We observed a significant positive relationship between generalization and model-based, but not model-free, choice strategies. These results are consistent with the hypothesis that model-based RL, like acquired equivalence, relies on a more general-purpose relational memory system. PMID:24846190
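    To make the contrast between the two strategies concrete, the sketch below runs both on a toy two-step task: the model-free learner caches action values directly from reward prediction errors, while the model-based learner estimates a transition model (an "internal model") and reward estimates and combines them at decision time. The task structure, learning rates, and uniform exploration are illustrative assumptions, not the experimental tasks used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_actions, n_states = 2, 2
# Toy two-step task: the first-step action leads probabilistically to one of two
# second-step states, each with its own reward probability (illustrative values).
P = np.array([[0.7, 0.3],      # P(state | action 0)
              [0.3, 0.7]])     # P(state | action 1)
reward_prob = np.array([0.8, 0.2])

alpha, n_trials = 0.1, 2000
q_mf = np.zeros(n_actions)                          # model-free cached action values
T_hat = np.ones((n_actions, n_states)) / n_states   # model-based transition estimates
r_hat = np.zeros(n_states)                          # model-based reward estimates

for _ in range(n_trials):
    a = rng.integers(n_actions)                     # explore uniformly for simplicity
    s = rng.choice(n_states, p=P[a])
    r = float(rng.random() < reward_prob[s])
    q_mf[a] += alpha * (r - q_mf[a])                # model-free: update the cached value directly
    T_hat[a] += alpha * ((np.arange(n_states) == s) - T_hat[a])  # learn the "internal model"
    r_hat[s] += alpha * (r - r_hat[s])

q_mb = T_hat @ r_hat                                # model-based: plan over the learned model
print("model-free Q:", q_mf.round(2), "model-based Q:", q_mb.round(2))
```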

  20. More Than Just Accuracy: A Novel Method to Incorporate Multiple Test Attributes in Evaluating Diagnostic Tests Including Point of Care Tests.

    PubMed

    Thompson, Matthew; Weigl, Bernhard; Fitzpatrick, Annette; Ide, Nicole

    2016-01-01

    Current frameworks for evaluating diagnostic tests are constrained by a focus on diagnostic accuracy, and assume that all aspects of the testing process and test attributes are discrete and equally important. Determining the balance between the benefits and harms associated with new or existing tests has been overlooked. Yet, this is critically important information for stakeholders involved in developing, testing, and implementing tests. This is particularly important for point of care tests (POCTs) where tradeoffs exist between numerous aspects of the testing process and test attributes. We developed a new model that multiple stakeholders (e.g., clinicians, patients, researchers, test developers, industry, regulators, and health care funders) can use to visualize the multiple attributes of tests, the interactions that occur between these attributes, and their impacts on health outcomes. We use multiple examples to illustrate interactions between test attributes (test availability, test experience, and test results) and outcomes, including several POCTs. The model could be used to prioritize research and development efforts, and inform regulatory submissions for new diagnostics. It could potentially provide a way to incorporate the relative weights that various subgroups or clinical settings might place on different test attributes. Our model provides a novel way that multiple stakeholders can use to visualize test attributes, their interactions, and impacts on individual and population outcomes. We anticipate that this will facilitate more informed decision making around diagnostic tests.

  1. Constraint Based Modeling Going Multicellular.

    PubMed

    Martins Conde, Patricia do Rosario; Sauter, Thomas; Pfau, Thomas

    2016-01-01

    Constraint based modeling has seen applications in many microorganisms. For example, there are now established methods to determine potential genetic modifications and external interventions to increase the efficiency of microbial strains in chemical production pipelines. In addition, multiple models of multicellular organisms have been created including plants and humans. While initially the focus here was on modeling individual cell types of the multicellular organism, this focus recently started to switch. Models of microbial communities, as well as multi-tissue models of higher organisms have been constructed. These models thereby can include different parts of a plant, like root, stem, or different tissue types in the same organ. Such models can elucidate details of the interplay between symbiotic organisms, as well as the concerted efforts of multiple tissues and can be applied to analyse the effects of drugs or mutations on a more systemic level. In this review we give an overview of the recent development of multi-tissue models using constraint based techniques and the methods employed when investigating these models. We further highlight advances in combining constraint based models with dynamic and regulatory information and give an overview of these types of hybrid or multi-level approaches.
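
    At its core, the constraint-based (flux balance) analysis surveyed here reduces to a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. The three-reaction toy network below is hypothetical and only illustrates the formulation; multi-tissue models add compartment-specific reactions and exchange fluxes between tissues.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, conversion A -> B, "biomass" drain of B.
# Rows: metabolites A, B; columns: reactions v1 (uptake), v2 (A->B), v3 (biomass).
S = np.array([[1, -1,  0],
              [0,  1, -1]])
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 units

# linprog minimizes, so negate the biomass flux to maximize it.
c = np.array([0, 0, -1])
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)            # expected: [10, 10, 10]
```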

  2. BOREAS RSS-8 BIOME-BGC Model Simulations at Tower Flux Sites in 1994

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Kimball, John

    2000-01-01

    BIOME-BGC is a general ecosystem process model designed to simulate biogeochemical and hydrologic processes across multiple scales (Running and Hunt, 1993). In this investigation, BIOME-BGC was used to estimate daily water and carbon budgets for the BOREAS tower flux sites for 1994. Carbon variables estimated by the model include gross primary production (i.e., net photosynthesis), maintenance and heterotrophic respiration, net primary production, and net ecosystem carbon exchange. Hydrologic variables estimated by the model include snowcover, evaporation, transpiration, evapotranspiration, soil moisture, and outflow. The information provided by the investigation includes input initialization and model output files for various sites in tabular ASCII format.

  3. Combining lightning leader and relativistic feedback discharge models of terrestrial gamma-ray flashes

    NASA Astrophysics Data System (ADS)

    Dwyer, J. R.

    2016-12-01

    Lightning leader models of terrestrial gamma-ray flashes (TGFs) are based on the observations that leaders emit bursts of hard x-rays. These x-rays are thought to be generated by runaway electrons created in the high-field regions associated with the leader tips and/or streamer heads. Inside a thunderstorm, it has been proposed that these runaway electrons may experience additional relativistic runaway electron avalanche (RREA) multiplication, increasing the number and the average energy of the electrons, and possibly resulting in a TGF. When modeling TGFs it is important to include the discharge currents resulting from the ionization produced by the runaway electrons, since these currents may alter the electric fields and affect the TGF. In addition, relativistic feedback effects, caused by backward-propagating positrons and backscattered x-rays, need to be included, since relativistic feedback limits the size of the electric field and the amount of RREA multiplication that may occur. In this presentation, a lightning leader model of terrestrial gamma-ray flashes that includes the effects of the discharge currents and relativistic feedback will be described and compared with observations.
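
    A schematic, back-of-envelope sketch of the two ingredients named above: exponential RREA multiplication of seed runaway electrons over an avalanche length, and feedback generations that multiply the output further and become self-sustaining when the feedback factor reaches unity. The path length, avalanche length, seed number, and feedback factor below are illustrative numbers, not values from the model described in this record.

```python
import numpy as np

def rrea_multiplication(path_length_m, avalanche_length_m):
    """Exponential avalanche growth of one seed runaway electron."""
    return np.exp(path_length_m / avalanche_length_m)

def total_runaways(seeds, multiplication, feedback_factor, generations=50):
    """Sum feedback generations; diverges (self-sustaining discharge) if gamma >= 1."""
    if feedback_factor >= 1.0:
        return np.inf
    gens = feedback_factor ** np.arange(generations)
    return seeds * multiplication * gens.sum()   # ~ seeds*M/(1-gamma) for many generations

M = rrea_multiplication(path_length_m=300.0, avalanche_length_m=50.0)   # e^6, about 400x
print(total_runaways(seeds=1e10, multiplication=M, feedback_factor=0.5))
```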

  4. An analysis of stakeholder views on children's mental health services.

    PubMed

    Rodríguez, Adriana; Southam-Gerow, Michael A; O'Connor, Mary Katherine; Allin, Robert B

    2014-01-01

    The purpose was to examine the perspectives of mental health stakeholders as a means to guide the adaptation of evidence-based treatments. The Mental Health System Ecological (MHSE) model was used to organize therapist, administrator, and parent perspectives gathered using qualitative methods. The MHSE model posits the influences of client-level, provider-level, intervention-specific, service delivery, organizational, and service system characteristics on implementation. Focus groups and interviews were conducted with community mental health stakeholders and included parents, therapists, and administrators (N = 21). Participants included 11 primarily Caucasian (90.48%) female participants ranging in age from 31 to 57 years. Data were analyzed according to the MHSE model. Frequency counts were tabulated for each theme and stakeholder group differences were determined using the Mann-Whitney test. Stakeholder groups mentioned needs at each level of the MHSE model. Stakeholder group differences most notably emerged with child and family themes, which included complexity of mental health issues, parenting differences, and family stressors. Stakeholders identified challenges for optimal mental health services for children across multiple levels of an ecological model. Implications of the findings are discussed, including the continued relevance of adapting mental health interventions by increasing their flexible application across multiple target problems and the promise of partnership approaches.

  5. Trade-Space Analysis Tool for Constellations (TAT-C)

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a-priori Science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost/risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost, and risk goals and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms vs. one at a time. This paper describes the overall architecture of TAT-C including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator first with inputs from the Knowledge Base, then, in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the use of the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. TAT-C's current version includes uniform Walker constellations as well as Ad-Hoc constellations, and its cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The Knowledge Base supports both analysis and exploration, and the current GUI prototype automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost.

  6. Trade-space Analysis for Constellations

    NASA Astrophysics Data System (ADS)

    Le Moigne, J.; Dabney, P.; de Weck, O. L.; Foreman, V.; Grogan, P.; Holland, M. P.; Hughes, S. P.; Nag, S.

    2016-12-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a-priori Science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: "How many spacecraft should be included in the constellation? Which design has the best cost/risk value?" The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost, and risk goals and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms vs. one at a time. This paper describes the overall architecture of TAT-C including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator first with inputs from the Knowledge Base, then, in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the use of the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. TAT-C's current version includes uniform Walker constellations as well as Ad-Hoc constellations, and its cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The Knowledge Base supports both analysis and exploration, and the current GUI prototype automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost.
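
    The trade-space iteration described in the two TAT-C records above can be illustrated with a deliberately simplified enumeration: sweep a few Walker-constellation design variables, score each architecture with placeholder coverage and cost functions, and keep the non-dominated designs. All variable ranges, metric functions, and names below are hypothetical; TAT-C itself delegates coverage computation to GMAT and cost to aggregated CERs.

```python
from itertools import product

# Hypothetical design variables for a Walker constellation.
planes = [1, 2, 3, 4]
sats_per_plane = [1, 2, 4, 8]
altitudes_km = [500, 700]

def revisit_proxy(n_planes, n_sats, alt):
    # Placeholder: revisit improves with more satellites and higher altitude.
    return 1.0 / (n_planes * n_sats * (alt / 500.0))

def cost_proxy(n_planes, n_sats, alt):
    # Placeholder aggregate cost: per-satellite cost plus per-plane launch cost.
    return 2.0 * n_planes * n_sats + 5.0 * n_planes

designs = [(p, s, a, revisit_proxy(p, s, a), cost_proxy(p, s, a))
           for p, s, a in product(planes, sats_per_plane, altitudes_km)]

# Keep non-dominated designs (lower revisit proxy and lower cost are both better).
pareto = [d for d in designs
          if not any(o[3] <= d[3] and o[4] <= d[4] and o != d for o in designs)]
for p, s, a, rev, cost in sorted(pareto, key=lambda d: d[4]):
    print(f"{p} planes x {s} sats @ {a} km: revisit proxy {rev:.3f}, cost {cost:.0f}")
```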

  7. Multiple switching modes and multiple level states in memristive devices

    NASA Astrophysics Data System (ADS)

    Miao, Feng; Yang, J. Joshua; Borghetti, Julien; Strachan, John Paul; Zhang, M.-X.; Goldfarb, Ilan; Medeiros-Ribeiro, Gilberto; Williams, R. Stanley

    2011-03-01

    As one of the most promising technologies for next generation non-volatile memory, metal oxide based memristive devices have demonstrated great advantages on scalability, operating speed and power consumption. Here we report the observation of multiple switching modes and multiple level states in different memristive systems. The multiple switching modes can be obtained by limiting the current during electroforming, and related transport behaviors, including ionic and electronic motions, are characterized. Such observation can be rationalized by a model of two effective switching layers adjacent to the bottom and top electrodes. Multiple level states, corresponding to different composition of the conducting channel, will also be discussed in the context of multiple-level storage for high density, non-volatile memory applications.

  8. Mesoscopic Modeling of Blood Clotting: Coagulation Cascade and Platelets Adhesion

    NASA Astrophysics Data System (ADS)

    Yazdani, Alireza; Li, Zhen; Karniadakis, George

    2015-11-01

    The process of clot formation and growth at a site on a blood vessel wall involves a number of simultaneous multi-scale processes, including multiple chemical reactions in the coagulation cascade, species transport, and flow. To model these processes we have incorporated advection-diffusion-reaction (ADR) of multiple species into an extended version of the Dissipative Particle Dynamics (DPD) method, which can be considered a coarse-grained Molecular Dynamics method. At the continuum level this is equivalent to the Navier-Stokes equation plus one advection-diffusion equation for each species. The chemistry of clot formation is now understood to be determined by mechanisms involving reactions among many species in dilute solution, where reaction rate constants and species diffusion coefficients in plasma are known. The role of blood particulates, i.e. red cells and platelets, in the clotting process is studied by including them separately and together in the simulations. An agonist-induced platelet activation mechanism is presented, while platelet adhesive dynamics based on a stochastic bond formation/dissociation process are included in the model.
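
    At the continuum level, the advection-diffusion-reaction transport mentioned above can be sketched with a one-dimensional explicit update for a single species. The velocity, diffusivity, and first-order consumption rate below are arbitrary illustrative values, not the coagulation-cascade kinetics used in the DPD model.

```python
import numpy as np

nx, dx, dt = 200, 0.01, 1e-4
u, D, k = 0.5, 1e-3, 2.0            # advection velocity, diffusivity, consumption rate
c = np.zeros(nx)
c[:10] = 1.0                        # bolus of species near the inlet

for _ in range(2000):
    dcdx = (c - np.roll(c, 1)) / dx                       # upwind gradient (periodic for brevity)
    d2cdx2 = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    c = c + dt * (-u * dcdx + D * d2cdx2 - k * c)         # advection + diffusion + reaction

print("remaining mass:", c.sum() * dx)
```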

  9. Helping agencies improve their planning analysis techniques.

    DOT National Transportation Integrated Search

    2011-11-18

    This report summarizes the results of a peer review of the AZTDM. The peer review was : supported by the Travel Model Improvement Program (TMIP), which is sponsored by FHWA. : The peer review of a travel model can serve multiple purposes, including i...

  10. Modeling and Simulation of Lab-on-a-Chip Systems

    DTIC Science & Technology

    2005-08-12

    complex chip geometries (including multiple turns). Variations of sample concentration profiles in laminar diffusion-based micromixers are also derived... [Table-of-contents excerpt: Chapter 6, Modeling of Laminar Diffusion-Based Complex Electrokinetic Passive Micromixers; Section 6.4.4, Multi-Stream (Inter-Digital) Micromixers]

  11. Using agent-based modeling to study multiple risk factors and multiple health outcomes at multiple levels.

    PubMed

    Yang, Yong

    2017-11-01

    Most health studies focus on one health outcome and examine the influence of one or multiple risk factors. However, in reality, various pathways, interactions, and associations exist not only between risk factors and health outcomes but also among the risk factors and among health outcomes. The advance of system science methods, Big Data, and accumulated knowledge allows us to examine how multiple risk factors influence multiple health outcomes at multiple levels (termed a 3M study). Using the study of neighborhood environment and health as an example, I elaborate on the significance of 3M studies. 3M studies may lead to a significantly deeper understanding of the dynamic interactions among risk factors and outcomes and could help us design better interventions that may be of particular relevance for upstream interventions. Agent-based modeling (ABM) is a promising method in the 3M study, although its potential is far from fully explored. Future challenges include gaps in epidemiologic knowledge and evidence, the lack of empirical data sources, and the technical challenges of ABM. © 2017 New York Academy of Sciences.

  12. Impact of airway gas exchange on the multiple inert gas elimination technique: theory.

    PubMed

    Anderson, Joseph C; Hlastala, Michael P

    2010-03-01

    The multiple inert gas elimination technique (MIGET) provides a method for estimating alveolar gas exchange efficiency. Six soluble inert gases are infused into a peripheral vein. Measurements of these gases in breath, arterial blood, and venous blood are interpreted using a mathematical model of alveolar gas exchange (MIGET model) that neglects airway gas exchange. A mathematical model describing airway and alveolar gas exchange predicts that two of these gases, ether and acetone, exchange primarily within the airways. To determine the effect of airway gas exchange on the MIGET, we selected two additional gases, toluene and m-dichlorobenzene, that have the same blood solubility as ether and acetone and minimize airway gas exchange via their low water solubility. The airway-alveolar gas exchange model simulated the exchange of toluene, m-dichlorobenzene, and the six MIGET gases under multiple conditions of alveolar ventilation-to-perfusion, VA/Q, heterogeneity. We increased the importance of airway gas exchange by changing bronchial blood flow, Qbr. From these simulations, we calculated the excretion and retention of the eight inert gases and divided the results into two groups: (1) the standard MIGET gases which included acetone and ether and (2) the modified MIGET gases which included toluene and m-dichlorobenzene. The MIGET mathematical model predicted distributions of ventilation and perfusion for each grouping of gases and multiple perturbations of VA/Q and Qbr. Using the modified MIGET gases, MIGET predicted a smaller dead space fraction, greater mean VA, greater log(SDVA), and more closely matched the imposed VA distribution than that using the standard MIGET gases. Perfusion distributions were relatively unaffected.
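
    The retention calculation at the heart of MIGET can be sketched for a lung of parallel compartments: for an inert gas with blood-gas partition coefficient lambda, each compartment retains the fraction lambda/(lambda + VA/Q), and arterial retention is the perfusion-weighted sum. The compartment values and approximate partition coefficients below are illustrative only, and the sketch deliberately ignores airway (bronchial) gas exchange, which is exactly the effect examined in the record above.

```python
import numpy as np

def retention(lam, va_q_ratios, perfusion_fractions):
    """Perfusion-weighted retention of an inert gas in a multi-compartment lung."""
    va_q = np.asarray(va_q_ratios, dtype=float)
    q_frac = np.asarray(perfusion_fractions, dtype=float)
    q_frac = q_frac / q_frac.sum()
    return float(np.sum(q_frac * lam / (lam + va_q)))

# Illustrative 3-compartment lung: low, normal and high VA/Q regions.
va_q = [0.1, 1.0, 10.0]
q_frac = [0.2, 0.7, 0.1]
for gas, lam in [("SF6-like (very low solubility)", 0.005),
                 ("halothane-like (moderate solubility)", 2.3),
                 ("acetone-like (very high solubility)", 300.0)]:
    print(gas, "retention =", round(retention(lam, va_q, q_frac), 3))
```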

  13. Multiscale modeling of mucosal immune responses

    PubMed Central

    2015-01-01

    Computational modeling techniques are playing increasingly important roles in advancing a systems-level mechanistic understanding of biological processes. Computer simulations guide and underpin experimental and clinical efforts. This study presents ENteric Immune Simulator (ENISI), a multiscale modeling tool for modeling the mucosal immune responses. ENISI's modeling environment can simulate in silico experiments from molecular signaling pathways to tissue level events such as tissue lesion formation. ENISI's architecture integrates multiple modeling technologies including ABM (agent-based modeling), ODE (ordinary differential equations), SDE (stochastic modeling equations), and PDE (partial differential equations). This paper focuses on the implementation and developmental challenges of ENISI. A multiscale model of mucosal immune responses during colonic inflammation, including CD4+ T cell differentiation and tissue level cell-cell interactions was developed to illustrate the capabilities, power and scope of ENISI MSM. Background: Computational techniques are becoming increasingly powerful, and there is a growing need for modeling tools for biological systems. Biological systems are inherently multiscale, from molecules to tissues and from nano-seconds to a lifespan of several years or decades. ENISI MSM integrates multiple modeling technologies to understand immunological processes from signaling pathways within cells to lesion formation at the tissue level. This paper examines and summarizes the technical details of ENISI, from its initial version to its latest cutting-edge implementation. Implementation: An object-oriented programming approach is adopted to develop a suite of tools based on ENISI. Multiple modeling technologies are integrated to visualize tissues, cells as well as proteins; furthermore, performance matching between the scales is addressed. Conclusion: We used ENISI MSM for developing predictive multiscale models of the mucosal immune system during gut inflammation. Our modeling predictions dissect the mechanisms by which effector CD4+ T cell responses contribute to tissue damage in the gut mucosa following immune dysregulation. PMID:26329787

  14. Multiscale modeling of mucosal immune responses.

    PubMed

    Mei, Yongguo; Abedi, Vida; Carbo, Adria; Zhang, Xiaoying; Lu, Pinyi; Philipson, Casandra; Hontecillas, Raquel; Hoops, Stefan; Liles, Nathan; Bassaganya-Riera, Josep

    2015-01-01

    Computational techniques are becoming increasingly powerful, and there is a growing need for modeling tools for biological systems. Biological systems are inherently multiscale, from molecules to tissues and from nano-seconds to a lifespan of several years or decades. ENISI MSM integrates multiple modeling technologies to understand immunological processes from signaling pathways within cells to lesion formation at the tissue level. This paper examines and summarizes the technical details of ENISI, from its initial version to its latest cutting-edge implementation. An object-oriented programming approach is adopted to develop a suite of tools based on ENISI. Multiple modeling technologies are integrated to visualize tissues, cells as well as proteins; furthermore, performance matching between the scales is addressed. We used ENISI MSM for developing predictive multiscale models of the mucosal immune system during gut inflammation. Our modeling predictions dissect the mechanisms by which effector CD4+ T cell responses contribute to tissue damage in the gut mucosa following immune dysregulation. Computational modeling techniques are playing increasingly important roles in advancing a systems-level mechanistic understanding of biological processes. Computer simulations guide and underpin experimental and clinical efforts. This study presents ENteric Immune Simulator (ENISI), a multiscale modeling tool for modeling the mucosal immune responses. ENISI's modeling environment can simulate in silico experiments from molecular signaling pathways to tissue level events such as tissue lesion formation. ENISI's architecture integrates multiple modeling technologies including ABM (agent-based modeling), ODE (ordinary differential equations), SDE (stochastic modeling equations), and PDE (partial differential equations). This paper focuses on the implementation and developmental challenges of ENISI. A multiscale model of mucosal immune responses during colonic inflammation, including CD4+ T cell differentiation and tissue level cell-cell interactions was developed to illustrate the capabilities, power and scope of ENISI MSM.

  15. A distributed data base management facility for the CAD/CAM environment

    NASA Technical Reports Server (NTRS)

    Balza, R. M.; Beaudet, R. W.; Johnson, H. R.

    1984-01-01

    Current/PAD research in the area of distributed data base management considers facilities for supporting CAD/CAM data management in a heterogeneous network of computers encompassing multiple data base managers supporting a variety of data models. These facilities include coordinated execution of multiple DBMSs to provide for administration of and access to data distributed across them.

  16. Development of the Bonding Representations Inventory to Identify Student Misconceptions about Covalent and Ionic Bonding Representations

    ERIC Educational Resources Information Center

    Luxford, Cynthia J.; Bretz, Stacey Lowery

    2014-01-01

    Teachers use multiple representations to communicate the concepts of bonding, including Lewis structures, formulas, space-filling models, and 3D manipulatives. As students learn to interpret these multiple representations, they may develop misconceptions that can create problems in further learning of chemistry. Interviews were conducted with 28…

  17. Technical and Practical Issues in the Structure and Clinical Invariance of the Wechsler Scales: A Rejoinder to Commentaries

    ERIC Educational Resources Information Center

    Weiss, Lawrence G.; Keith, Timothy Z.; Zhu, Jianjun; Chen, Hsinyi

    2013-01-01

    This discussion article addresses issues related to expansion of the Wechsler model from four to five factors; multiple broad CHC abilities measured by the Arithmetic subtest; advantages and disadvantages of including complex tasks requiring integration of multiple broad abilities when measuring intelligence; limitations of factor analysis, which…

  18. Child Sexual Abuse and Adult Romantic Adjustment: Comparison of Single- and Multiple-Indicator Measures

    ERIC Educational Resources Information Center

    Godbout, Natacha; Sabourin, Stephane; Lussier, Yvan

    2009-01-01

    This study compared the usefulness of single- and multiple-indicator strategies in a model examining the role of child sexual abuse (CSA) to predict later marital satisfaction through attachment and psychological distress. The sample included 1,092 women and men from a nonclinical population in cohabiting or marital relationships. The single-item…

  19. Investigating the Resilience Levels of Parents with Children with Multiple Disabilities Based on Different Variables

    ERIC Educational Resources Information Center

    Kadi, Sinem; Eldeniz Cetin, Muzeyyen

    2018-01-01

    This study investigated the resilience levels of parents with children with multiple disabilities by utilizing different variables. The study, conducted with survey model--a qualitative method--included a sample composed of a total of 222 voluntary parents (183 females, 39 males) residing in Bolu, Duzce and Zonguldak in Turkey. Parental…

  20. Hadronic vacuum polarization contribution to aμ from full lattice QCD

    NASA Astrophysics Data System (ADS)

    Chakraborty, Bipasha; Davies, C. T. H.; de Oliveira, P. G.; Koponen, J.; Lepage, G. P.; van de Water, R. S.; Hpqcd Collaboration

    2017-08-01

    We determine the contribution to the anomalous magnetic moment of the muon from the O(α_QED²) hadronic vacuum polarization diagram using full lattice QCD and including u/d quarks with physical masses for the first time. We use gluon field configurations that include u, d, s and c quarks in the sea at multiple values of the lattice spacing, multiple u/d masses and multiple volumes that allow us to include an analysis of finite-volume effects. We obtain a result for a_μ^{HVP,LO} of 667(6)(12) × 10^-10, where the first error is from the lattice calculation and the second includes systematic errors from missing QED and isospin-breaking effects and from quark-line disconnected diagrams. Our result implies a discrepancy between the experimental determination of a_μ and the Standard Model of 3σ.

  1. Protein structure modeling for CASP10 by multiple layers of global optimization.

    PubMed

    Joo, Keehyoung; Lee, Juyong; Sim, Sangjin; Lee, Sun Young; Lee, Kiho; Heo, Seungryong; Lee, In-Ho; Lee, Sung Jong; Lee, Jooyoung

    2014-02-01

    In the template-based modeling (TBM) category of CASP10 experiment, we introduced a new protocol called protein modeling system (PMS) to generate accurate protein structures in terms of side-chains as well as backbone trace. In the new protocol, a global optimization algorithm, called conformational space annealing (CSA), is applied to the three layers of TBM procedure: multiple sequence-structure alignment, 3D chain building, and side-chain re-modeling. For 3D chain building, we developed a new energy function which includes new distance restraint terms of Lorentzian type (derived from multiple templates), and new energy terms that combine (physical) energy terms such as dynamic fragment assembly (DFA) energy, DFIRE statistical potential energy, hydrogen bonding term, etc. These physical energy terms are expected to guide the structure modeling especially for loop regions where no template structures are available. In addition, we developed a new quality assessment method based on random forest machine learning algorithm to screen templates, multiple alignments, and final models. For TBM targets of CASP10, we find that, due to the combination of three stages of CSA global optimizations and quality assessment, the modeling accuracy of PMS improves at each additional stage of the protocol. It is especially noteworthy that the side-chains of the final PMS models are far more accurate than the models in the intermediate steps. Copyright © 2013 Wiley Periodicals, Inc.
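
    The appeal of Lorentzian-type distance restraints over harmonic ones, for template-derived distances that may contain outliers, can be illustrated with a hypothetical functional form: the penalty is roughly quadratic near the target distance but saturates for large violations instead of growing without bound. The exact restraint energy used in PMS is not reproduced here; the function below is only an assumed example of the Lorentzian shape.

```python
import numpy as np

def harmonic_restraint(d, d0, k=1.0):
    # Grows quadratically; a single bad template distance dominates the energy.
    return k * (d - d0) ** 2

def lorentzian_restraint(d, d0, width=1.0, depth=1.0):
    # Bounded, well-shaped penalty: ~quadratic near d0, saturates at `depth` far away.
    return depth * (1.0 - 1.0 / (1.0 + ((d - d0) / width) ** 2))

d = np.linspace(0.0, 20.0, 5)
print(harmonic_restraint(d, d0=8.0))      # unbounded growth for outliers
print(lorentzian_restraint(d, d0=8.0))    # approaches 1 for large violations
```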

  2. DHS Summary Report -- Robert Weldon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weldon, Robert A.

    This summer I worked on benchmarking the Lawrence Livermore National Laboratory fission multiplicity capability used in the Monte Carlo particle transport code MCNPX. This work involved running simulations and then comparing the simulation results with experimental results. Outlined in this paper is a brief description of the work completed this summer, skills and knowledge gained, and how the internship has impacted my planning for the future. Neutron multiplicity counting is a neutron detection technique that leverages the multiplicity emissions of neutrons from fission to identify various actinides in a lump of material. The identification of individual actinides in lumps of material crossing our borders, especially U-235 and Pu-239, is a key component for maintaining the safety of the country from nuclear threats. Several multiplicity emission options from spontaneous and induced fission already existed in MCNPX 2.4.0. These options can be accessed through use of the 6th entry on the PHYS:N card. Lawrence Livermore National Laboratory (LLNL) developed a physics model for the simulation of neutron and gamma ray emission from fission and photofission that was included in MCNPX 2.7.B as an undocumented feature and then was documented in MCNPX 2.7.C. The LLNL multiplicity capability provided a different means for MCNPX to simulate neutron and gamma-ray distributions for neutron-induced, spontaneous and photonuclear fission reactions. The original testing on the model for implementation into MCNPX was conducted by Gregg McKinney and John Hendricks. The model is an encapsulation of measured data of neutron multiplicity distributions from Gwin, Spencer, and Ingle, along with the data from Zucker and Holden. One of the founding principles of MCNPX was that it would have several redundant capabilities, providing the means of testing and including various physics packages. Though several multiplicity sampling methodologies already existed within MCNPX, the LLNL fission multiplicity was included to provide a separate capability for computing multiplicity as well as including several new features not already included in MCNPX. These new features include: (1) prompt gamma emission/multiplicity from neutron-induced fission; (2) neutron multiplicity and gamma emission/multiplicity from photofission; and (3) an option to enforce energy correlation for gamma neutron multiplicity emission. These new capabilities allow correlated signal detection for identifying the presence of special nuclear material (SNM). Therefore, these new capabilities help meet the missions of the Domestic Nuclear Detection Office (DNDO), which is tasked with developing nuclear detection strategies for identifying potential radiological and nuclear threats, by providing new simulation capability for detection strategies that leverage the new available physics in the LLNL multiplicity capability. Two types of tests were accomplished this summer to test the default LLNL neutron multiplicity capability: neutron-induced fission tests and spontaneous fission tests. Both cases set the 6th entry on the PHYS:N card to 5 (i.e. use LLNL multiplicity). The neutron-induced fission tests utilized a simple 0.001 cm radius sphere where 0.0253 eV neutrons were released at the sphere center. Neutrons were forced to immediately collide in the sphere and release all progeny from the sphere, without further collision, using the LCA card, LCA 7j -2 (therefore density and size of the sphere were irrelevant).
Enough particles were run to ensure that the average error of any specific multiplicity did not exceed 0.36%. Neutron-induced fission multiplicities were computed for U-233, U-235, Pu-239, and Pu-241. The spontaneous fission tests also used the same spherical geometry, except: (1) the LCA card was removed; (2) the density of the sphere was set to 0.001 g/cm3; and (3) instead of emitting a thermal neutron, the PAR keyword was set to PAR=SF. The purpose of the small density was to ensure that the spontaneous fission neutrons would not further interact and induce fissions (i.e. the mean free path greatly exceeded the size of the sphere). Enough particles were run to ensure that the average error of any specific spontaneous multiplicity did not exceed 0.23%. Spontaneous fission multiplicities were computed for U-238, Pu-238, Pu-240, Pu-242, Cm-242, and Cm-244. All of the computed results were compared against experimental results compiled by Holden at Brookhaven National Laboratory.

  3. Modeling How, When, and What Is Learned in a Simple Fault-Finding Task

    ERIC Educational Resources Information Center

    Ritter, Frank E.; Bibby, Peter A.

    2008-01-01

    We have developed a process model that learns in multiple ways while finding faults in a simple control panel device. The model predicts human participants' learning through its own learning. The model's performance was systematically compared to human learning data, including the time course and specific sequence of learned behaviors. These…

  4. FIRST RESULTS FROM OPERATIONAL TESTING OF THE U.S. EPA MODELS-3 COMMUNITY MULTISCALE MODEL FOR AIR QUALITY (CMAQ)

    EPA Science Inventory

    The Models 3 / Community Multiscale Model for Air Quality (CMAQ) has been designed for one-atmosphere assessments for multiple pollutants including ozone (O3), particulate matter (PM10, PM2.5), and acid / nutrient deposition. In this paper we report initial results of our evalu...

  5. The k-d Tree: A Hierarchical Model for Human Cognition.

    ERIC Educational Resources Information Center

    Vandendorpe, Mary M.

    This paper discusses a model of information storage and retrieval, the k-d tree (Bentley, 1975), a binary, hierarchical tree with multiple associate terms, which has been explored in computer research, and it is suggested that this model could be useful for describing human cognition. Included are two models of human long-term memory--networks and…
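
    Since this record leans on the k-d tree (Bentley, 1975) as its organizing data structure, a compact sketch of the tree itself may help readers unfamiliar with it: points are split on alternating coordinates, and nearest-neighbour search prunes branches whose splitting plane is farther away than the best distance found so far. This is a generic textbook implementation, not a model of the memory theory discussed in the paper.

```python
from collections import namedtuple

Node = namedtuple("Node", "point left right axis")

def build(points, depth=0):
    """Recursively split points on alternating coordinates."""
    if not points:
        return None
    axis = depth % len(points[0])
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return Node(points[mid],
                build(points[:mid], depth + 1),
                build(points[mid + 1:], depth + 1),
                axis)

def nearest(node, target, best=None):
    """Return (point, squared distance) of the nearest stored point to target."""
    if node is None:
        return best
    dist = sum((a - b) ** 2 for a, b in zip(node.point, target))
    if best is None or dist < best[1]:
        best = (node.point, dist)
    diff = target[node.axis] - node.point[node.axis]
    near, far = (node.left, node.right) if diff < 0 else (node.right, node.left)
    best = nearest(near, target, best)
    if diff ** 2 < best[1]:          # the splitting plane may hide a closer point
        best = nearest(far, target, best)
    return best

tree = build([(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)])
print(nearest(tree, (9, 2)))         # -> ((8, 1), 2)
```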

  6. Induced polarization and its interaction with electromagnetic coupling in low-frequency geophysical exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruszka, T.P.

    1987-01-01

    Starting from the dynamic equations of electromagnetics we derive mutual impedance formulas that include the effects of induced polarization (IP) and electromagnetic (EM) coupling. The mutual impedance formulas are given for four geometries: a fullspace, a cylinder in a fullspace, a halfspace, and a layer over a halfspace. IP effects are characterized by a Cole-Cole model, the properties of which are fully investigated. From the general mutual impedance formulas specific limiting forms are defined to characterize the IP and EM effects. Using these limiting forms a framework is developed to justify the addition or multiplication of the two effects. The additive and multiplicative models are compared in the cylinder and layer geometries with the conclusion that the additive model proves to be more accurate over a wider range of frequencies than the multiplicative model. The nature of the IP and EM effects is illustrated in all four geometries showing the effects of relevant parameters. In all cases it is shown that the real part of the mutual impedance contains important IP information that is less influenced by EM effects. Finally the effects of boundaries are illustrated by the cylinder and layer geometries and a theory is developed to incorporate EM effects and IP effects from multiple regions which utilizes frequency dependent real dilution factors. The author also included a brief review of some EM removal schemes and dilution theory approximations.
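
    The Cole-Cole description of the IP effect used in this work is commonly written in the Pelton form, where the complex impedance depends on chargeability m, time constant tau, and frequency exponent c. The sketch below evaluates that standard form; the parameter values are illustrative, and the EM-coupling terms derived in the dissertation are not included.

```python
import numpy as np

def cole_cole_impedance(freq_hz, r0=100.0, m=0.3, tau=0.1, c=0.5):
    """Pelton-style Cole-Cole model: Z(w) = R0 * (1 - m*(1 - 1/(1 + (i*w*tau)**c)))."""
    w = 2 * np.pi * np.asarray(freq_hz, dtype=float)
    return r0 * (1 - m * (1 - 1.0 / (1.0 + (1j * w * tau) ** c)))

freqs = np.logspace(-2, 4, 7)
for f, z in zip(freqs, cole_cole_impedance(freqs)):
    print(f"{f:10.2f} Hz: |Z| = {abs(z):7.2f} ohm, "
          f"phase = {np.degrees(np.angle(z)):7.3f} deg")
# Limits: Z -> R0 at low frequency, Z -> R0*(1 - m) at high frequency.
```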

  7. Genetic parameters for first lactation test-day milk flow in Holstein cows.

    PubMed

    Laureano, M M M; Bignardi, A B; El Faro, L; Cardoso, V L; Albuquerque, L G

    2012-01-01

    Genetic parameters for test-day milk flow (TDMF) of 2175 first lactations of Holstein cows were estimated using multiple-trait and repeatability models. The models included the direct additive genetic effect as a random effect and contemporary group (defined as the year and month of test) and age of cow at calving (linear and quadratic effect) as fixed effects. For the repeatability model, in addition to the effects cited, the permanent environmental effect of the animal was also included as a random effect. Variance components were estimated using the restricted maximum likelihood method in single- and multiple-trait and repeatability analyses. The heritability estimates for TDMF ranged from 0.23 (TDMF 6) to 0.32 (TDMF 2 and TDMF 4) in single-trait analysis and from 0.28 (TDMF 7 and TDMF 10) to 0.37 (TDMF 4) in multiple-trait analysis. In general, higher heritabilities were observed at the beginning of lactation until the fourth month. Heritability estimated with the repeatability model was 0.27 and the coefficient of repeatability for first lactation TDMF was 0.66. The genetic correlations were positive and ranged from 0.72 (TDMF 1 and 10) to 0.97 (TDMF 4 and 5). The results indicate that milk flow should respond satisfactorily to selection, promoting rapid genetic gains because the estimated heritabilities were moderate to high. Higher genetic gains might be obtained if selection was performed in the TDMF 4. Both the repeatability model and the multiple-trait model are adequate for the genetic evaluation of animals in terms of milk flow, but the latter provides more accurate estimates of breeding values.
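
    For readers unfamiliar with the two model classes compared here, the repeatability animal model can be written compactly as below. The notation is generic rather than copied from the paper; the multiple-trait model replaces the single additive effect with correlated, test-day-specific effects.

```latex
% Repeatability model for repeated test-day records (generic notation):
% b = fixed effects (contemporary group, age at calving),
% a ~ N(0, A sigma_a^2) = additive genetic effects,
% p ~ N(0, I sigma_pe^2) = permanent environmental effects,
% e ~ N(0, I sigma_e^2) = residuals.
\[
  \mathbf{y} = \mathbf{X}\mathbf{b} + \mathbf{Z}\mathbf{a} + \mathbf{W}\mathbf{p} + \mathbf{e},
  \qquad
  h^2 = \frac{\sigma_a^2}{\sigma_a^2 + \sigma_{pe}^2 + \sigma_e^2},
  \qquad
  r = \frac{\sigma_a^2 + \sigma_{pe}^2}{\sigma_a^2 + \sigma_{pe}^2 + \sigma_e^2}.
\]
```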

  8. Nonlinear flow model of multiple fractured horizontal wells with stimulated reservoir volume including the quadratic gradient term

    NASA Astrophysics Data System (ADS)

    Ren, Junjie; Guo, Ping

    2017-11-01

    The real fluid flow in porous media is consistent with the mass conservation which can be described by the nonlinear governing equation including the quadratic gradient term (QGT). However, most of the flow models have been established by ignoring the QGT and little work has been conducted to incorporate the QGT into the flow model of the multiple fractured horizontal (MFH) well with stimulated reservoir volume (SRV). This paper first establishes a semi-analytical model of an MFH well with SRV including the QGT. Introducing the transformed pressure and flow-rate function, the nonlinear model of a point source in a composite system including the QGT is linearized. Then the Laplace transform, principle of superposition, numerical discrete method, Gaussian elimination method and Stehfest numerical inversion are employed to establish and solve the seepage model of the MFH well with SRV. Type curves are plotted and the effects of relevant parameters are analyzed. It is found that the nonlinear effect caused by the QGT can increase the flow capacity of fluid flow and influence the transient pressure positively. The relevant parameters not only have an effect on the type curve but also affect the error in the pressure calculated by the conventional linear model. The proposed model, which is consistent with the mass conservation, reflects the nonlinear process of the real fluid flow, and thus it can be used to obtain more accurate transient pressure of an MFH well with SRV.
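
    The role of the quadratic gradient term and of the transformed-pressure step can be made explicit with the standard radial form of the problem. The sketch below is generic, not the composite-system point-source formulation of the paper, and uses a nonlinearity coefficient alpha purely for illustration.

```latex
% Radial flow including the quadratic gradient term (QGT), generic form:
\[
  \frac{\partial^2 p}{\partial r^2}
  + \frac{1}{r}\frac{\partial p}{\partial r}
  + \alpha \left(\frac{\partial p}{\partial r}\right)^{2}
  = \frac{1}{\eta}\,\frac{\partial p}{\partial t}.
\]
% The exponential (Cole-Hopf type) transformation u = exp(alpha p) removes the
% nonlinear term, since u_r = alpha u p_r and u_rr = alpha u (p_rr + alpha p_r^2):
\[
  u = e^{\alpha p}
  \;\Longrightarrow\;
  \frac{\partial^2 u}{\partial r^2}
  + \frac{1}{r}\frac{\partial u}{\partial r}
  = \frac{1}{\eta}\,\frac{\partial u}{\partial t}.
\]
```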

  9. Effort to Accelerate MBSE Adoption and Usage at JSC

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Izygon, Michel; Okron, Shira; Garner, Larry; Wagner, Howard

    2016-01-01

    This paper describes the authors' experience in adopting Model Based System Engineering (MBSE) at the NASA/Johnson Space Center (JSC). Since 2009, NASA/JSC has been applying MBSE using the Systems Modeling Language (SysML) to a number of advanced projects. Models integrate views of the system from multiple perspectives, capturing the system design information for multiple stakeholders. This method has allowed engineers to better control changes, improve traceability from requirements to design and manage the numerous interactions between components. As the project progresses, the models become the official source of information and used by multiple stakeholders. Three major types of challenges that hamper the adoption of the MBSE technology are described. These challenges are addressed by a multipronged approach that includes educating the main stakeholders, implementing an organizational infrastructure that supports the adoption effort, defining a set of modeling guidelines to help engineers in their modeling effort, providing a toolset that support the generation of valuable products, and providing a library of reusable models. JSC project case studies are presented to illustrate how the proposed approach has been successfully applied.

  10. Progression of regional grey matter atrophy in multiple sclerosis

    PubMed Central

    Marinescu, Razvan V; Young, Alexandra L; Firth, Nicholas C; Jorge Cardoso, M; Tur, Carmen; De Angelis, Floriana; Cawley, Niamh; Brownlee, Wallace J; De Stefano, Nicola; Laura Stromillo, M; Battaglini, Marco; Ruggieri, Serena; Gasperini, Claudio; Filippi, Massimo; Rocca, Maria A; Rovira, Alex; Sastre-Garriga, Jaume; Geurts, Jeroen J G; Vrenken, Hugo; Wottschel, Viktor; Leurs, Cyra E; Uitdehaag, Bernard; Pirpamer, Lukas; Enzinger, Christian; Ourselin, Sebastien; Gandini Wheeler-Kingshott, Claudia A; Chard, Declan; Thompson, Alan J; Barkhof, Frederik; Alexander, Daniel C; Ciccarelli, Olga

    2018-01-01

    Abstract See Stankoff and Louapre (doi:10.1093/brain/awy114) for a scientific commentary on this article. Grey matter atrophy is present from the earliest stages of multiple sclerosis, but its temporal ordering is poorly understood. We aimed to determine the sequence in which grey matter regions become atrophic in multiple sclerosis and its association with disability accumulation. In this longitudinal study, we included 1417 subjects: 253 with clinically isolated syndrome, 708 with relapsing-remitting multiple sclerosis, 128 with secondary-progressive multiple sclerosis, 125 with primary-progressive multiple sclerosis, and 203 healthy control subjects from seven European centres. Subjects underwent repeated MRI (total number of scans 3604); the mean follow-up for patients was 2.41 years (standard deviation = 1.97). Disability was scored using the Expanded Disability Status Scale. We calculated the volume of brain grey matter regions and brainstem using an unbiased within-subject template and used an established data-driven event-based model to determine the sequence of occurrence of atrophy and its uncertainty. We assigned each subject to a specific event-based model stage, based on the number of their atrophic regions. Linear mixed-effects models were used to explore associations between the rate of increase in event-based model stages, and T2 lesion load, disease-modifying treatments, comorbidity, disease duration and disability accumulation. The first regions to become atrophic in patients with clinically isolated syndrome and relapse-onset multiple sclerosis were the posterior cingulate cortex and precuneus, followed by the middle cingulate cortex, brainstem and thalamus. A similar sequence of atrophy was detected in primary-progressive multiple sclerosis with the involvement of the thalamus, cuneus, precuneus, and pallidum, followed by the brainstem and posterior cingulate cortex. The cerebellum, caudate and putamen showed early atrophy in relapse-onset multiple sclerosis and late atrophy in primary-progressive multiple sclerosis. Patients with secondary-progressive multiple sclerosis showed the highest event-based model stage (the highest number of atrophic regions, P < 0.001) at the study entry. All multiple sclerosis phenotypes, but clinically isolated syndrome, showed a faster rate of increase in the event-based model stage than healthy controls. T2 lesion load and disease duration in all patients were associated with increased event-based model stage, but no effects of disease-modifying treatments and comorbidity on event-based model stage were observed. The annualized rate of event-based model stage was associated with the disability accumulation in relapsing-remitting multiple sclerosis, independent of disease duration (P < 0.0001). The data-driven staging of atrophy progression in a large multiple sclerosis sample demonstrates that grey matter atrophy spreads to involve more regions over time. The sequence in which regions become atrophic is reasonably consistent across multiple sclerosis phenotypes. The spread of atrophy was associated with disease duration and with disability accumulation over time in relapsing-remitting multiple sclerosis. PMID:29741648

  11. Progression of regional grey matter atrophy in multiple sclerosis.

    PubMed

    Eshaghi, Arman; Marinescu, Razvan V; Young, Alexandra L; Firth, Nicholas C; Prados, Ferran; Jorge Cardoso, M; Tur, Carmen; De Angelis, Floriana; Cawley, Niamh; Brownlee, Wallace J; De Stefano, Nicola; Laura Stromillo, M; Battaglini, Marco; Ruggieri, Serena; Gasperini, Claudio; Filippi, Massimo; Rocca, Maria A; Rovira, Alex; Sastre-Garriga, Jaume; Geurts, Jeroen J G; Vrenken, Hugo; Wottschel, Viktor; Leurs, Cyra E; Uitdehaag, Bernard; Pirpamer, Lukas; Enzinger, Christian; Ourselin, Sebastien; Gandini Wheeler-Kingshott, Claudia A; Chard, Declan; Thompson, Alan J; Barkhof, Frederik; Alexander, Daniel C; Ciccarelli, Olga

    2018-06-01

    See Stankoff and Louapre (doi:10.1093/brain/awy114) for a scientific commentary on this article. Grey matter atrophy is present from the earliest stages of multiple sclerosis, but its temporal ordering is poorly understood. We aimed to determine the sequence in which grey matter regions become atrophic in multiple sclerosis and its association with disability accumulation. In this longitudinal study, we included 1417 subjects: 253 with clinically isolated syndrome, 708 with relapsing-remitting multiple sclerosis, 128 with secondary-progressive multiple sclerosis, 125 with primary-progressive multiple sclerosis, and 203 healthy control subjects from seven European centres. Subjects underwent repeated MRI (total number of scans 3604); the mean follow-up for patients was 2.41 years (standard deviation = 1.97). Disability was scored using the Expanded Disability Status Scale. We calculated the volume of brain grey matter regions and brainstem using an unbiased within-subject template and used an established data-driven event-based model to determine the sequence of occurrence of atrophy and its uncertainty. We assigned each subject to a specific event-based model stage, based on the number of their atrophic regions. Linear mixed-effects models were used to explore associations between the rate of increase in event-based model stages, and T2 lesion load, disease-modifying treatments, comorbidity, disease duration and disability accumulation. The first regions to become atrophic in patients with clinically isolated syndrome and relapse-onset multiple sclerosis were the posterior cingulate cortex and precuneus, followed by the middle cingulate cortex, brainstem and thalamus. A similar sequence of atrophy was detected in primary-progressive multiple sclerosis with the involvement of the thalamus, cuneus, precuneus, and pallidum, followed by the brainstem and posterior cingulate cortex. The cerebellum, caudate and putamen showed early atrophy in relapse-onset multiple sclerosis and late atrophy in primary-progressive multiple sclerosis. Patients with secondary-progressive multiple sclerosis showed the highest event-based model stage (the highest number of atrophic regions, P < 0.001) at the study entry. All multiple sclerosis phenotypes, but clinically isolated syndrome, showed a faster rate of increase in the event-based model stage than healthy controls. T2 lesion load and disease duration in all patients were associated with increased event-based model stage, but no effects of disease-modifying treatments and comorbidity on event-based model stage were observed. The annualized rate of event-based model stage was associated with the disability accumulation in relapsing-remitting multiple sclerosis, independent of disease duration (P < 0.0001). The data-driven staging of atrophy progression in a large multiple sclerosis sample demonstrates that grey matter atrophy spreads to involve more regions over time. The sequence in which regions become atrophic is reasonably consistent across multiple sclerosis phenotypes. The spread of atrophy was associated with disease duration and with disability accumulation over time in relapsing-remitting multiple sclerosis.
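
    The event-based model stage assignment used in both versions of this record can be sketched as follows: given a fixed ordering of regions and, for each region, the likelihood of its measurement under "already atrophic" and "not yet atrophic" distributions, a subject's stage is the cut point that maximizes the product of the two groups of likelihoods. The region names, Gaussian parameters, and ordering below are entirely illustrative.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical fixed ordering of regions (earliest-atrophying first) and
# illustrative distributions of z-scored volumes when normal vs atrophic.
regions = ["posterior cingulate", "precuneus", "thalamus", "cerebellum"]
normal = norm(loc=0.0, scale=1.0)
atrophic = norm(loc=-2.0, scale=1.0)

def stage_likelihoods(x):
    """Likelihood of measurements x (ordered as `regions`) for each stage 0..N."""
    n = len(x)
    like = np.empty(n + 1)
    for k in range(n + 1):          # stage k: the first k regions are atrophic
        like[k] = np.prod(atrophic.pdf(x[:k])) * np.prod(normal.pdf(x[k:]))
    return like

subject = np.array([-2.1, -1.8, -0.2, 0.3])     # early regions look atrophic
like = stage_likelihoods(subject)
print("most likely stage:", int(np.argmax(like)), "of", len(regions))
```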

  12. Multiple sparse volumetric priors for distributed EEG source reconstruction.

    PubMed

    Strobbe, Gregor; van Mierlo, Pieter; De Vos, Maarten; Mijović, Bogdan; Hallez, Hans; Van Huffel, Sabine; López, José David; Vandenberghe, Stefaan

    2014-10-15

    We revisit the multiple sparse priors (MSP) algorithm implemented in the statistical parametric mapping software (SPM) for distributed EEG source reconstruction (Friston et al., 2008). In the present implementation, multiple cortical patches are introduced as source priors based on a dipole source space restricted to a cortical surface mesh. In this note, we present a technique to construct volumetric cortical regions to introduce as source priors by restricting the dipole source space to a segmented gray matter layer and using a region growing approach. This extension allows brain structures beyond the cortical surface to be reconstructed and facilitates the use of more realistic volumetric head models that include more layers, such as cerebrospinal fluid (CSF), compared to the standard 3-layered scalp-skull-brain head models. We illustrated the technique with ERP data and anatomical MR images in 12 subjects. Based on the segmented gray matter for each of the subjects, cortical regions were created and introduced as source priors for MSP-inversion assuming two types of head models: the standard 3-layered scalp-skull-brain head models and extended 4-layered head models including CSF. We compared these models with the current implementation by assessing the free energy corresponding with each of the reconstructions using Bayesian model selection for group studies. Strong evidence was found in favor of the volumetric MSP approach compared to the MSP approach based on cortical patches for both types of head models. Overall, the strongest evidence was found in favor of the volumetric MSP reconstructions based on the extended head models including CSF. These results were verified by comparing the reconstructed activity. The use of volumetric cortical regions as source priors is a useful complement to the present implementation as it allows more complex head models and volumetric source priors to be introduced in future studies. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. Modelling and simulation of biased agonism dynamics at a G protein-coupled receptor.

    PubMed

    Bridge, L J; Mead, J; Frattini, E; Winfield, I; Ladds, G

    2018-04-07

    Theoretical models of G protein-coupled receptor (GPCR) concentration-response relationships often assume an agonist producing a single functional response via a single active state of the receptor. These models have largely been analysed assuming steady-state conditions. There is now much experimental evidence to suggest that many GPCRs can exist in multiple receptor conformations and elicit numerous functional responses, with ligands having the potential to activate different signalling pathways to varying extents, a concept referred to as biased agonism, functional selectivity or pluri-dimensional efficacy. Moreover, recent experimental results indicate a clear possibility for time-dependent bias, whereby an agonist's bias with respect to different pathways may vary dynamically. Efforts towards understanding the implications of temporal bias by characterising and quantifying ligand effects on multiple pathways will clearly be aided by extending current equilibrium binding and biased activation models to include G protein activation dynamics. Here, we present a new model of time-dependent biased agonism, based on ordinary differential equations for multiple cubic ternary complex activation models with G protein cycle dynamics. This model allows simulation and analysis of multi-pathway activation bias dynamics at a single receptor for the first time, at the level of active G protein (α_GTP), towards the analysis of dynamic functional responses. The model is generally applicable to systems with N_G G proteins and N* active receptor states. Numerical simulations for N_G = N* = 2 reveal new insights into the effects of system parameters (including cooperativities, and ligand and receptor concentrations) on bias dynamics, highlighting new phenomena including the dynamic inter-conversion of bias direction. Further, we fit this model to 'wet' experimental data for two competing G proteins (G_i and G_s) that become activated upon stimulation of the adenosine A_1 receptor with adenosine derivative compounds. Finally, we show that our model can qualitatively describe the temporal dynamics of this competing G protein activation. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
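
    A heavily reduced illustration of time-dependent bias: two competing G protein pools are activated by the same occupied receptor with different rate constants and deactivate at different rates, so the instantaneous bias (log ratio of the active pools) changes over the time course even though nothing about the ligand changes. This two-ODE caricature is an assumption for illustration only; it is not the cubic ternary complex / G protein cycle model developed in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Occupied-receptor drive (held constant here), with activation and deactivation
# rates for two hypothetical pathways (e.g. a Gi-like and a Gs-like pool).
R_occ = 1.0
k_act = np.array([1.0, 0.2])     # pathway 1 activates faster...
k_deact = np.array([0.5, 0.02])  # ...but pathway 2 deactivates much more slowly

def rhs(t, g_active):
    return k_act * R_occ * (1.0 - g_active) - k_deact * g_active

sol = solve_ivp(rhs, (0.0, 100.0), y0=[0.0, 0.0], t_eval=np.linspace(0, 100, 6))
for t, g1, g2 in zip(sol.t, sol.y[0], sol.y[1]):
    bias = np.log10((g1 + 1e-9) / (g2 + 1e-9))
    print(f"t={t:5.1f}  pathway1={g1:.3f}  pathway2={g2:.3f}  log-bias={bias:+.2f}")
# The sign of the log-bias flips as the slowly deactivating pathway accumulates,
# a toy version of the "dynamic inter-conversion of bias direction" noted above.
```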

  14. Bi-level Multi-Source Learning for Heterogeneous Block-wise Missing Data

    PubMed Central

    Xiang, Shuo; Yuan, Lei; Fan, Wei; Wang, Yalin; Thompson, Paul M.; Ye, Jieping

    2013-01-01

    Bio-imaging technologies allow scientists to collect large amounts of high-dimensional data from multiple heterogeneous sources for many biomedical applications. In the study of Alzheimer's Disease (AD), neuroimaging data, gene/protein expression data, etc., are often analyzed together to improve predictive power. Joint learning from multiple complementary data sources is advantageous, but feature-pruning and data source selection are critical to learn interpretable models from high-dimensional data. Often, the data collected has block-wise missing entries. In the Alzheimer’s Disease Neuroimaging Initiative (ADNI), most subjects have MRI and genetic information, but only half have cerebrospinal fluid (CSF) measures, a different half has FDG-PET; only some have proteomic data. Here we propose how to effectively integrate information from multiple heterogeneous data sources when data is block-wise missing. We present a unified “bi-level” learning model for complete multi-source data, and extend it to incomplete data. Our major contributions are: (1) our proposed models unify feature-level and source-level analysis, including several existing feature learning approaches as special cases; (2) the model for incomplete data avoids imputing missing data and offers superior performance; it generalizes to other applications with block-wise missing data sources; (3) we present efficient optimization algorithms for modeling complete and incomplete data. We comprehensively evaluate the proposed models including all ADNI subjects with at least one of four data types at baseline: MRI, FDG-PET, CSF and proteomics. Our proposed models compare favorably with existing approaches. PMID:23988272

  15. Bi-level multi-source learning for heterogeneous block-wise missing data.

    PubMed

    Xiang, Shuo; Yuan, Lei; Fan, Wei; Wang, Yalin; Thompson, Paul M; Ye, Jieping

    2014-11-15

    Bio-imaging technologies allow scientists to collect large amounts of high-dimensional data from multiple heterogeneous sources for many biomedical applications. In the study of Alzheimer's Disease (AD), neuroimaging data, gene/protein expression data, etc., are often analyzed together to improve predictive power. Joint learning from multiple complementary data sources is advantageous, but feature-pruning and data source selection are critical to learn interpretable models from high-dimensional data. Often, the data collected has block-wise missing entries. In the Alzheimer's Disease Neuroimaging Initiative (ADNI), most subjects have MRI and genetic information, but only half have cerebrospinal fluid (CSF) measures, a different half has FDG-PET; only some have proteomic data. Here we propose how to effectively integrate information from multiple heterogeneous data sources when data is block-wise missing. We present a unified "bi-level" learning model for complete multi-source data, and extend it to incomplete data. Our major contributions are: (1) our proposed models unify feature-level and source-level analysis, including several existing feature learning approaches as special cases; (2) the model for incomplete data avoids imputing missing data and offers superior performance; it generalizes to other applications with block-wise missing data sources; (3) we present efficient optimization algorithms for modeling complete and incomplete data. We comprehensively evaluate the proposed models including all ADNI subjects with at least one of four data types at baseline: MRI, FDG-PET, CSF and proteomics. Our proposed models compare favorably with existing approaches. © 2013 Elsevier Inc. All rights reserved.

  16. A multiple-time-scale turbulence model based on variable partitioning of turbulent kinetic energy spectrum

    NASA Technical Reports Server (NTRS)

    Kim, S.-W.; Chen, C.-P.

    1987-01-01

    A multiple-time-scale turbulence model of a single point closure and a simplified split-spectrum method is presented. In the model, the effect of the ratio of the production rate to the dissipation rate on eddy viscosity is modeled by use of the multiple-time-scales and a variable partitioning of the turbulent kinetic energy spectrum. The concept of a variable partitioning of the turbulent kinetic energy spectrum and the rest of the model details are based on the previously reported algebraic stress turbulence model. Example problems considered include: a fully developed channel flow, a plane jet exhausting into a moving stream, a wall jet flow, and a weakly coupled wake-boundary layer interaction flow. The computational results compared favorably with those obtained by using the algebraic stress turbulence model as well as experimental data. The present turbulence model, as well as the algebraic stress turbulence model, yielded significantly improved computational results for the complex turbulent boundary layer flows, such as the wall jet flow and the wake boundary layer interaction flow, compared with available computational results obtained by using the standard kappa-epsilon turbulence model.

  17. A multiple-time-scale turbulence model based on variable partitioning of the turbulent kinetic energy spectrum

    NASA Technical Reports Server (NTRS)

    Kim, S.-W.; Chen, C.-P.

    1989-01-01

    A multiple-time-scale turbulence model of a single point closure and a simplified split-spectrum method is presented. In the model, the effect of the ratio of the production rate to the dissipation rate on eddy viscosity is modeled by use of the multiple-time-scales and a variable partitioning of the turbulent kinetic energy spectrum. The concept of a variable partitioning of the turbulent kinetic energy spectrum and the rest of the model details are based on the previously reported algebraic stress turbulence model. Example problems considered include: a fully developed channel flow, a plane jet exhausting into a moving stream, a wall jet flow, and a weakly coupled wake-boundary layer interaction flow. The computational results compared favorably with those obtained by using the algebraic stress turbulence model as well as experimental data. The present turbulence model, as well as the algebraic stress turbulence model, yielded significantly improved computational results for the complex turbulent boundary layer flows, such as the wall jet flow and the wake boundary layer interaction flow, compared with available computational results obtained by using the standard kappa-epsilon turbulence model.

  18. Multiple-hypothesis multiple-model line tracking

    NASA Astrophysics Data System (ADS)

    Pace, Donald W.; Owen, Mark W.; Cox, Henry

    2000-07-01

    Passive sonar signal processing generally includes tracking of narrowband and/or broadband signature components observed on a Lofargram or on a Bearing-Time-Record (BTR) display. Fielded line tracking approaches to date have been recursive and single-hypothesis-oriented Kalman- or alpha-beta filters, with no mechanism for considering tracking alternatives beyond the most recent scan of measurements. While adaptivity is often built into the filter to handle changing track dynamics, these approaches are still extensions of single target tracking solutions to a multiple target tracking environment. This paper describes an application of multiple-hypothesis, multiple target tracking technology to the sonar line tracking problem. A Multiple Hypothesis Line Tracker (MHLT) is developed which retains the recursive minimum-mean-square-error tracking behavior of a Kalman Filter in a maximum-a-posteriori delayed-decision multiple hypothesis context. Multiple line track filter states are developed and maintained using the interacting multiple model (IMM) state representation. Further, the data association and assignment problem is enhanced by considering line attribute information (line bandwidth and SNR) in addition to beam/bearing and frequency fit. MHLT results on real sonar data are presented to demonstrate the benefits of the multiple hypothesis approach. The utility of the system in cluttered environments and particularly in crossing line situations is shown.
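
    The sketch below shows only the recursive single-hypothesis core that such a tracker builds on: a constant-drift Kalman filter following a narrowband line's frequency across scans. The multiple-hypothesis and IMM layers described in the abstract would sit on top of this update; all noise levels and measurements here are assumptions.

```python
# Constant-drift Kalman filter for one narrowband line on successive scans.
import numpy as np

dt = 1.0                                   # scan interval (s), assumed
F = np.array([[1.0, dt], [0.0, 1.0]])      # state: [frequency, frequency drift]
H = np.array([[1.0, 0.0]])                 # we measure frequency only
Q = np.diag([1e-4, 1e-5])                  # process noise (assumed)
R = np.array([[0.05]])                     # measurement noise (assumed)

x = np.array([100.0, 0.0])                 # initial estimate: 100 Hz, no drift
P = np.diag([1.0, 0.1])

for z in [100.1, 100.3, 100.2, 100.5, 100.6]:   # simulated scan measurements (Hz)
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the new measurement
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    print(f"scan z={z:.1f} Hz -> estimate {x[0]:.2f} Hz, drift {x[1]:.3f} Hz/scan")
```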

  19. PREDICTING ER BINDING AFFINITY FOR EDC RANKING AND PRIORITIZATION: MODEL I

    EPA Science Inventory

    A Common Reactivity Pattern (COREPA) model, based on consideration of multiple energetically reasonable conformations of flexible chemicals was developed using a training set of 232 rat estrogen receptor (rER) relative binding affinity (RBA) measurements. The training set include...

  20. A Model for Semantic Equivalence Discovery for Harmonizing Master Data

    NASA Astrophysics Data System (ADS)

    Piprani, Baba

    IT projects often face the challenge of harmonizing metadata and data so as to have a "single" version of the truth. Determining equivalency of multiple data instances against the given type, or set of types, is mandatory in establishing master data legitimacy in a data set that contains multiple incarnations of instances belonging to the same semantic data record. The results of a real-life application define how measuring criteria and equivalence path determination were established via a set of "probes" in conjunction with a score-card approach. There is a need for a suite of supporting models to help determine master data equivalency towards entity resolution—including mapping models, transform models, selection models, match models, an audit and control model, a scorecard model, a rating model. An ORM schema defines the set of supporting models along with their incarnation into an attribute based model as implemented in an RDBMS.

  1. On the effect of model parameters on forecast objects

    NASA Astrophysics Data System (ADS)

    Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott

    2018-04-01

    Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
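
    A toy sketch of the methodology's main ingredients on synthetic data: Latin hypercube sampling of parameters, clustering to identify objects in a forecast field, and multivariate regression of object features on the parameters. The "forecast" generator is invented for illustration and stands in for a real NWP run.

```python
# (1) Latin hypercube sample of model parameters, (2) a stand-in "forecast" of
# spatial points whose object structure depends on the parameters, (3) DBSCAN to
# identify objects, (4) multivariate regression of object features on parameters.
import numpy as np
from scipy.stats import qmc
from sklearn.cluster import DBSCAN
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
sampler = qmc.LatinHypercube(d=3, seed=1)
params = qmc.scale(sampler.random(n=30), [0.1, 0.1, 0.1], [1.0, 1.0, 1.0])

features = []
for p in params:
    # stand-in forecast: points scattered around a centre whose location and
    # spread depend on the parameters
    pts = rng.normal(loc=[10 * p[0], 10 * p[1]], scale=0.5 + p[2], size=(200, 2))
    labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(pts)
    object_ids = [k for k in set(labels) if k != -1]
    mean_size = np.mean([np.sum(labels == k) for k in object_ids]) if object_ids else 0.0
    features.append([mean_size, len(object_ids)])          # object size and count

reg = LinearRegression().fit(params, np.array(features))   # multivariate multiple regression
print("sensitivity coefficients (features x parameters):\n", reg.coef_)
```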

  2. Model misspecification detection by means of multiple generator errors, using the observed potential map.

    PubMed

    Zhang, Z; Jewett, D L

    1994-01-01

    Due to model misspecification, currently-used Dipole Source Localization (DSL) methods may contain Multiple-Generator Errors (MulGenErrs) when fitting simultaneously-active dipoles. The size of the MulGenErr is a function of both the model used, and the dipole parameters, including the dipoles' waveforms (time-varying magnitudes). For a given fitting model, by examining the variation of the MulGenErrs (or the fit parameters) under different waveforms for the same generating-dipoles, the accuracy of the fitting model for this set of dipoles can be determined. This method of testing model misspecification can be applied to evoked potential maps even when the parameters of the generating-dipoles are unknown. The dipole parameters fitted in a model should only be accepted if the model can be shown to be sufficiently accurate.

  3. A Model for Evaluating Programs for the Gifted under Non-Experimental Conditions.

    ERIC Educational Resources Information Center

    Carter, Kyle R.

    1992-01-01

    The article presents and illustrates use of an evaluation model for assessing programs for the gifted where tight experimental control is not possible. The model consists of four components: ex post facto designs including intact groups; comparative evaluation; strength of treatment; and multiple outcome assessment from flexible data sources. (DB)

  4. Assessing High School Chemistry Students' Modeling Sub-Skills in a Computerized Molecular Modeling Learning Environment

    ERIC Educational Resources Information Center

    Dori, Yehudit Judy; Kaberman, Zvia

    2012-01-01

    Much knowledge in chemistry exists at a molecular level, inaccessible to direct perception. Chemistry instruction should therefore include multiple visual representations, such as molecular models and symbols. This study describes the implementation and assessment of a learning unit designed for 12th grade chemistry honors students. The organic…

  5. The Impact of Sample Size and Other Factors When Estimating Multilevel Logistic Models

    ERIC Educational Resources Information Center

    Schoeneberger, Jason A.

    2016-01-01

    The design of research studies utilizing binary multilevel models must necessarily incorporate knowledge of multiple factors, including estimation method, variance component size, or number of predictors, in addition to sample sizes. This Monte Carlo study examined the performance of random effect binary outcome multilevel models under varying…

  6. A Two-Parameter Latent Trait Model. Methodology Project.

    ERIC Educational Resources Information Center

    Choppin, Bruce

    On well-constructed multiple-choice tests, the most serious threat to measurement is not variation in item discrimination, but the guessing behavior that may be adopted by some students. Ways of ameliorating the effects of guessing are discussed, especially for problems in latent trait models. A new item response model, including an item parameter…

  7. The extraction of simple relationships in growth factor-specific multiple-input and multiple-output systems in cell-fate decisions by backward elimination PLS regression.

    PubMed

    Akimoto, Yuki; Yugi, Katsuyuki; Uda, Shinsuke; Kudo, Takamasa; Komori, Yasunori; Kubota, Hiroyuki; Kuroda, Shinya

    2013-01-01

    Cells use common signaling molecules for the selective control of downstream gene expression and cell-fate decisions. The relationship between signaling molecules and downstream gene expression and cellular phenotypes is a multiple-input and multiple-output (MIMO) system and is difficult to understand due to its complexity. For example, it has been reported that, in PC12 cells, different types of growth factors activate MAP kinases (MAPKs) including ERK, JNK, and p38, and CREB, for selective protein expression of immediate early genes (IEGs) such as c-FOS, c-JUN, EGR1, JUNB, and FOSB, leading to cell differentiation, proliferation and cell death; however, how multiple-inputs such as MAPKs and CREB regulate multiple-outputs such as expression of the IEGs and cellular phenotypes remains unclear. To address this issue, we employed a statistical method called partial least squares (PLS) regression, which involves a reduction of the dimensionality of the inputs and outputs into latent variables and a linear regression between these latent variables. We measured 1,200 data points for MAPKs and CREB as the inputs and 1,900 data points for IEGs and cellular phenotypes as the outputs, and we constructed the PLS model from these data. The PLS model highlighted the complexity of the MIMO system and growth factor-specific input-output relationships of cell-fate decisions in PC12 cells. Furthermore, to reduce the complexity, we applied a backward elimination method to the PLS regression, in which 60 input variables were reduced to 5 variables, including the phosphorylation of ERK at 10 min, CREB at 5 min and 60 min, AKT at 5 min and JNK at 30 min. The simple PLS model with only 5 input variables demonstrated a predictive ability comparable to that of the full PLS model. The 5 input variables effectively extracted the growth factor-specific simple relationships within the MIMO system in cell-fate decisions in PC12 cells.
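
    In the spirit of the approach described (though not the authors' exact procedure), the sketch below fits a PLS regression from multiple inputs to multiple outputs and applies a crude backward-elimination wrapper on synthetic data; the elimination criterion and data are assumptions.

```python
# Synthetic stand-in for the MIMO data: 12 "signalling" inputs, 2 outputs.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 12))                 # e.g. kinase phosphorylation at several time points
Y = X[:, [0, 3]] @ np.array([[1.0, 0.2], [0.3, 1.0]]) + 0.3 * rng.normal(size=(120, 2))

keep = list(range(X.shape[1]))
while len(keep) > 2:
    base = cross_val_score(PLSRegression(n_components=2), X[:, keep], Y, cv=5).mean()
    # score every candidate removal and drop the least harmful variable
    scores = [(cross_val_score(PLSRegression(n_components=2),
                               X[:, [k for k in keep if k != j]], Y, cv=5).mean(), j)
              for j in keep]
    best_score, j = max(scores)
    if best_score < base - 0.01:               # stop when removal costs too much CV R^2
        break
    keep.remove(j)

print("retained input variables:", keep)
```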

  8. Site-wide seismic risk model for Savannah River Site nuclear facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eide, S.A.; Shay, R.S.; Durant, W.S.

    1993-09-01

    The 200,000 acre Savannah River Site (SRS) has nearly 30 nuclear facilities spread throughout the site. The safety of each facility has been established in facility-specific safety analysis reports (SARs). Each SAR contains an analysis of risk from seismic events to both on-site workers and the off-site population. Both radiological and chemical releases are considered, and air and water pathways are modeled. Risks to the general public are generally characterized by evaluating exposure to the maximally exposed individual located at the SRS boundary and to the off-site population located within 50 miles. Although the SARs are appropriate methods for studying individual facility risks, there is a class of accident initiators that can simultaneously affect several or all of the facilities. Examples include seismic events, strong winds or tornados, floods, and loss of off-site electrical power. Overall risk to the off-site population from such initiators is not covered by the individual SARs. In such cases multiple facility radionuclide or chemical releases could occur, and off-site exposure would be greater than that indicated in a single facility SAR. As a step towards an overall site-wide risk model that adequately addresses multiple facility releases, a site-wide seismic model for determining off-site risk has been developed for nuclear facilities at the SRS. Risk from seismic events up to the design basis earthquake (DBE) of 0.2 g (frequency of 2.0E-4/yr) is covered by the model. Present plans include expanding the scope of the model to include other types of initiators that can simultaneously affect multiple facilities.

  9. Climate Science's Globally Distributed Infrastructure

    NASA Astrophysics Data System (ADS)

    Williams, D. N.

    2016-12-01

    The Earth System Grid Federation (ESGF) is primarily funded by the Department of Energy's (DOE's) Office of Science (the Office of Biological and Environmental Research [BER] Climate Data Informatics Program and the Office of Advanced Scientific Computing Research Next Generation Network for Science Program), the National Oceanic and Atmospheric Administration (NOAA), the National Aeronautics and Space Administration (NASA), the National Science Foundation (NSF), the European Infrastructure for the European Network for Earth System Modeling (IS-ENES), and the Australian National University (ANU). Support also comes from other U.S. federal and international agencies. The federation works across multiple worldwide data centers and spans seven international network organizations to provide users with the ability to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. Its architecture employs a series of geographically distributed peer nodes that are independently administered and united by common federation protocols and application programming interfaces (APIs). The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP; output used by the Intergovernmental Panel on Climate Change assessment reports), multiple model intercomparison projects (MIPs; endorsed by the World Climate Research Programme [WCRP]), and the Accelerated Climate Modeling for Energy (ACME; ESGF is included in the overarching ACME workflow process to store model output). ESGF is a successful example of integration of disparate open-source technologies into a cohesive functional system that serves the needs of the global climate science community. Data served by ESGF includes not only model output but also observational data from satellites and instruments, reanalysis, and generated images.

  10. Advanced statistics: linear regression, part II: multiple linear regression.

    PubMed

    Marill, Keith A

    2004-01-01

    The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
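
    A brief illustration of the concepts discussed (multiple predictors, an interaction term, exact confidence intervals for coefficients) using statsmodels on invented data; the variables are hypothetical and not drawn from the article.

```python
# Multiple linear regression with an interaction term on synthetic "clinical" data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 200
df = pd.DataFrame({"age": rng.uniform(20, 80, n),
                   "weight": rng.normal(75, 12, n),
                   "treated": rng.integers(0, 2, n)})
df["outcome"] = (0.3 * df["age"] + 0.2 * df["weight"]
                 - 5.0 * df["treated"] + rng.normal(0, 5, n))

# model the outcome with multiple predictors plus an age-by-treatment interaction
model = smf.ols("outcome ~ age + weight + treated + age:treated", data=df).fit()
print(model.params)          # regression coefficients
print(model.conf_int())      # exact confidence intervals for each coefficient
```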

  11. Evaluating Anthropogenic Carbon Emissions in the Urban Salt Lake Valley through Inverse Modeling: Combining Long-term CO2 Observations and an Emission Inventory using a Multiple-box Atmospheric Model

    NASA Astrophysics Data System (ADS)

    Catharine, D.; Strong, C.; Lin, J. C.; Cherkaev, E.; Mitchell, L.; Stephens, B. B.; Ehleringer, J. R.

    2016-12-01

    The rising level of atmospheric carbon dioxide (CO2), driven by anthropogenic emissions, is the leading cause of enhanced radiative forcing. Increasing societal interest in reducing anthropogenic greenhouse gas emissions calls for a computationally efficient method of evaluating anthropogenic CO2 source emissions, particularly if future mitigation actions are to be developed. A multiple-box atmospheric transport model was constructed in conjunction with a pre-existing fossil fuel CO2 emission inventory to estimate near-surface CO2 mole fractions and the associated anthropogenic CO2 emissions in the Salt Lake Valley (SLV) of northern Utah, a metropolitan area with a population of 1 million. A 15-year multi-site dataset of observed CO2 mole fractions is used in conjunction with the multiple-box model to develop an efficient method to constrain anthropogenic emissions through inverse modeling. Preliminary results of the multiple-box model CO2 inversion indicate that the pre-existing anthropogenic emission inventory may over-estimate CO2 emissions in the SLV. In addition, inversion results displaying a complex spatial and temporal distribution of urban emissions, including the effects of residential development and vehicular traffic, will be discussed.
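
    A toy sketch of the inversion idea: a transport/dilution matrix maps sector emissions to observed CO2 enhancements, and least squares recovers the emissions from the observations. The matrix and numbers are assumptions, not the Salt Lake Valley model.

```python
# Least-squares inversion of sector emissions from simulated CO2 observations.
import numpy as np

# rows = observations (site-hours), columns = emission sectors (e.g. traffic, residential)
A = np.array([[0.8, 0.3],
              [0.5, 0.6],
              [0.2, 0.9],
              [0.7, 0.4]])            # ppm enhancement per unit emission (assumed)
true_E = np.array([10.0, 6.0])        # "true" emissions used to simulate observations
background = 400.0                    # background CO2 (ppm)
obs = background + A @ true_E + np.random.default_rng(4).normal(0, 0.5, 4)

E_hat, *_ = np.linalg.lstsq(A, obs - background, rcond=None)
print("posterior emission estimate:", E_hat)   # compare against the inventory prior
```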

  12. Why do people appear not to extrapolate trajectories during multiple object tracking? A computational investigation

    PubMed Central

    Zhong, Sheng-hua; Ma, Zheng; Wilson, Colin; Liu, Yan; Flombaum, Jonathan I

    2014-01-01

    Intuitively, extrapolating object trajectories should make visual tracking more accurate. This has proven to be true in many contexts that involve tracking a single item. But surprisingly, when tracking multiple identical items in what is known as “multiple object tracking,” observers often appear to ignore direction of motion, relying instead on basic spatial memory. We investigated potential reasons for this behavior through probabilistic models that were endowed with perceptual limitations in the range of typical human observers, including noisy spatial perception. When we compared a model that weights its extrapolations relative to other sources of information about object position, and one that does not extrapolate at all, we found no reliable difference in performance, belying the intuition that extrapolation always benefits tracking. In follow-up experiments we found this to be true for a variety of models that weight observations and predictions in different ways; in some cases we even observed worse performance for models that use extrapolations compared to a model that does not extrapolate at all. Ultimately, the best performing models either did not extrapolate, or extrapolated very conservatively, relying heavily on observations. These results illustrate the difficulty and attendant hazards of using noisy inputs to extrapolate the trajectories of multiple objects simultaneously in situations with targets and featurally confusable nontargets. PMID:25311300

  13. An integrated data model to estimate spatiotemporal occupancy, abundance, and colonization dynamics

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.; Womble, Jamie N.; Esslinger, George G.; Bower, Michael R.; Hefley, Trevor J.

    2017-01-01

    Ecological invasions and colonizations occur dynamically through space and time. Estimating the distribution and abundance of colonizing species is critical for efficient management or conservation. We describe a statistical framework for simultaneously estimating spatiotemporal occupancy and abundance dynamics of a colonizing species. Our method accounts for several issues that are common when modeling spatiotemporal ecological data including multiple levels of detection probability, multiple data sources, and computational limitations that occur when making fine-scale inference over a large spatiotemporal domain. We apply the model to estimate the colonization dynamics of sea otters (Enhydra lutris) in Glacier Bay, in southeastern Alaska.

  14. WIFIRE Data Model and Catalog for Wildfire Data and Tools

    NASA Astrophysics Data System (ADS)

    Altintas, I.; Crawl, D.; Cowart, C.; Gupta, A.; Block, J.; de Callafon, R.

    2014-12-01

    The WIFIRE project (wifire.ucsd.edu) is building an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. WIFIRE may be used by wildfire management authorities in the future to predict wildfire rate of spread and direction, and assess the effectiveness of high-density sensor networks in improving fire and weather predictions. WIFIRE has created a data model for wildfire resources including sensed and archived data, sensors, satellites, cameras, modeling tools, workflows and social information including Twitter feeds. This data model and associated wildfire resource catalog includes a detailed description of the HPWREN sensor network, SDG&E's Mesonet, and NASA MODIS. In addition, the WIFIRE data-model describes how to integrate the data from multiple heterogeneous sources to provide detailed fire-related information. The data catalog describes 'Observables' captured by each instrument using multiple ontologies including OGC SensorML and NASA SWEET. Observables include measurements such as wind speed, air temperature, and relative humidity, as well as their accuracy and resolution. We have implemented a REST service for publishing to and querying from the catalog using Web Application Description Language (WADL). We are creating web-based user interfaces and mobile device Apps that use the REST interface for dissemination to wildfire modeling community and project partners covering academic, private, and government laboratories while generating value to emergency officials and the general public. Additionally, the Kepler scientific workflow system is instrumented to interact with this data catalog to access real-time streaming and archived wildfire data and stream it into dynamic data-driven wildfire models at scale.

  15. Conditional probability of rainfall extremes across multiple durations

    NASA Astrophysics Data System (ADS)

    Le, Phuong Dong; Leonard, Michael; Westra, Seth

    2017-04-01

    The conditional probability that extreme rainfall will occur at one location given that it is occurring at another location is critical in engineering design and management circumstances, including planning of evacuation routes and the siting of emergency infrastructure. A challenge with this conditional simulation is that in many situations the interest is not so much the conditional distributions of rainfall of the same duration at two locations, but rather the conditional distribution of flooding in two neighbouring catchments, which may be influenced by rainfall of different critical durations. To deal with this challenge, a model that can consider both spatial and duration dependence of extremes is required. The aim of this research is to develop a model that incorporates both spatial dependence and duration dependence into the dependence structure of extreme rainfall. To achieve this aim, this study is a first attempt at combining extreme rainfall for multiple durations within a spatial extreme model framework based on max-stable process theory. Max-stable processes provide a general framework for modelling multivariate extremes with spatial dependence for a single duration of extreme rainfall. To achieve dependence across multiple timescales, this study proposes a new approach that adds elements representing duration dependence of extremes to the covariance matrix of the max-stable model. To improve the efficiency of calculation, a re-parameterization proposed by Koutsoyiannis et al. (1998) is used to reduce the number of parameters to be estimated. This re-parameterization enables the GEV parameters to be represented as a function of timescale. A stepwise framework has been adopted to achieve the overall aims of this research. Firstly, the re-parameterization is used to define a new set of common parameters for the marginal distribution across multiple durations. Secondly, spatial interpolation of the new parameter set is used to estimate marginal parameters across the full spatial domain. Finally, the spatial interpolation result is used as the initial condition to estimate dependence parameters via a likelihood function of the max-stable model for multiple durations. The Hawkesbury-Nepean catchment near Sydney, Australia, was selected as the case study for this research. This catchment has 25 sub-daily rain gauges with a minimum record length of 24 years over a 300 km × 300 km region. The re-parameterization was applied to each station for durations from 1 hour to 24 hours and then evaluated by comparison with the at-site fitted GEV. The evaluation showed that the average R2 across all stations is around 0.80, with a range from 0.26 to 1.0. The output of the re-parameterization was then used to construct the spatial surface based on covariates including longitude, latitude, and elevation. The dependence model showed good agreement between the empirical extremal coefficient and the theoretical extremal coefficient for multiple durations. For the overall model, a leave-one-out cross-validation for all stations showed that it works well for 20 out of 25 stations. The potential application of this model framework was illustrated through a conditional map of return period and return level across multiple durations, both of which are important for engineering design and management.
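
    The sketch below illustrates the idea of expressing GEV parameters as a function of duration: margins are fitted per duration and a simple power law in duration is then fitted to the location and scale parameters. This is a simplification of the Koutsoyiannis et al. (1998) re-parameterization, and the annual-maximum data are synthetic.

```python
# Fit GEV margins per duration, then a power-law relation in duration for the
# location and scale parameters (a simplified duration re-parameterization).
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)
durations = np.array([1, 2, 6, 12, 24])            # hours
years = 30
params = []
for d in durations:
    # synthetic annual-maximum intensities that decay with duration
    x = genextreme.rvs(c=-0.1, loc=20 / d**0.7, scale=5 / d**0.7, size=years, random_state=rng)
    c, loc, scale = genextreme.fit(x)
    params.append((loc, scale))

loc, scale = np.array(params).T
power = lambda d, a, eta: a / d**eta
(la, leta), _ = curve_fit(power, durations, loc)
(sa, seta), _ = curve_fit(power, durations, scale)
print(f"location ~ {la:.1f}/d^{leta:.2f}, scale ~ {sa:.1f}/d^{seta:.2f}")
```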

  16. A hybrid approach to estimating national scale spatiotemporal variability of PM2.5 in the contiguous United States.

    PubMed

    Beckerman, Bernardo S; Jerrett, Michael; Serre, Marc; Martin, Randall V; Lee, Seung-Jae; van Donkelaar, Aaron; Ross, Zev; Su, Jason; Burnett, Richard T

    2013-07-02

    Airborne fine particulate matter exhibits spatiotemporal variability at multiple scales, which presents challenges to estimating exposures for health effects assessment. Here we created a model to predict ambient particulate matter less than 2.5 μm in aerodynamic diameter (PM2.5) across the contiguous United States to be applied to health effects modeling. We developed a hybrid approach combining a land use regression model (LUR) selected with a machine learning method, and Bayesian Maximum Entropy (BME) interpolation of the LUR space-time residuals. The PM2.5 data set included 104,172 monthly observations at 1464 monitoring locations with approximately 10% of locations reserved for cross-validation. LUR models were based on remote sensing estimates of PM2.5, land use and traffic indicators. Normalized cross-validated R² values for LUR were 0.63 and 0.11 with and without remote sensing, respectively, suggesting remote sensing is a strong predictor of ground-level concentrations. In the models including the BME interpolation of the residuals, cross-validated R² values were 0.79 for both configurations; the model without remotely sensed data described more fine-scale variation than the model including remote sensing. Our results suggest that our modeling framework can predict ground-level concentrations of PM2.5 at multiple scales over the contiguous U.S.
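
    A conceptual sketch of the hybrid structure: a land use regression fitted to monitor data, followed by spatial interpolation of its residuals (a Gaussian process is used here as a stand-in for the paper's BME step). All data are synthetic.

```python
# Land-use regression plus spatial interpolation of its residuals.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(6)
coords = rng.uniform(0, 100, size=(300, 2))              # monitor locations (km)
covars = np.column_stack([rng.normal(size=300),          # e.g. satellite PM2.5 estimate
                          rng.normal(size=300)])         # e.g. traffic indicator
pm25 = (8 + 2.0 * covars[:, 0] + 0.8 * covars[:, 1]
        + np.sin(coords[:, 0] / 15) + rng.normal(0, 0.3, 300))

lur = LinearRegression().fit(covars, pm25)               # LUR stage
resid = pm25 - lur.predict(covars)

gp = GaussianProcessRegressor(kernel=RBF(20.0) + WhiteKernel(0.1)).fit(coords, resid)
pred = lur.predict(covars) + gp.predict(coords)          # hybrid prediction
print("hybrid R^2:", 1 - np.var(pm25 - pred) / np.var(pm25))
```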

  17. Integrated defense system overlaps as a disease model: with examples for multiple chemical sensitivity.

    PubMed Central

    Rowat, S C

    1998-01-01

    The central nervous, immune, and endocrine systems communicate through multiple common messengers. Over evolutionary time, what may be termed integrated defense system(s) (IDS) have developed to coordinate these communications for specific contexts; these include the stress response, acute-phase response, nonspecific immune response, immune response to antigen, kindling, tolerance, time-dependent sensitization, neurogenic switching, and traumatic dissociation (TD). These IDSs are described and their overlap is examined. Three models of disease production are generated: damage, in which IDSs function incorrectly; inadequate/inappropriate, in which IDS response is outstripped by a changing context; and evolving/learning, in which the IDS learned response to a context is deemed pathologic. Mechanisms of multiple chemical sensitivity (MCS) are developed from several IDS disease models. Model 1A is pesticide damage to the central nervous system, overlapping with body chemical burdens, TD, and chronic zinc deficiency; model 1B is benzene disruption of interleukin-1, overlapping with childhood developmental windows and hapten-antigenic spreading; and model 1C is autoimmunity to immunoglobulin-G (IgG), overlapping with spreading to other IgG-inducers, sudden spreading of inciters, and food-contaminating chemicals. Model 2A is chemical and stress overload, including comparison with the susceptibility/sensitization/triggering/spreading model; model 2B is genetic mercury allergy, overlapping with: heavy metals/zinc displacement and childhood/gestational mercury exposures; and model 3 is MCS as evolution and learning. Remarks are offered on current MCS research. Problems with clinical measurement are suggested on the basis of IDS models. Large-sample patient self-report epidemiology is described as an alternative or addition to clinical biomarker and animal testing. PMID:9539008

  18. Coastal Modeling System: Mathematical Formulations and Numerical Methods

    DTIC Science & Technology

    2014-03-01

    sediment transport, and morphology change. The CMS was designed and developed for coastal inlets and navigation applications, including channel...numerical methods of hydrodynamic, salinity and sediment transport, and morphology change model CMS-Flow. The CMS-Flow uses the Finite Volume...and the influence of coastal structures. The implicit hydrodynamic model is coupled to a nonequilibrium transport model of multiple-sized total

  19. Instantiating the multiple levels of analysis perspective in a program of study on externalizing behavior

    PubMed Central

    Beauchaine, Theodore P.; Gatzke-Kopp, Lisa M.

    2014-01-01

    During the last quarter century, developmental psychopathology has become increasingly inclusive and now spans disciplines ranging from psychiatric genetics to primary prevention. As a result, developmental psychopathologists have extended traditional diathesis–stress and transactional models to include causal processes at and across all relevant levels of analysis. Such research is embodied in what is known as the multiple levels of analysis perspective. We describe how multiple levels of analysis research has informed our current thinking about antisocial and borderline personality development among trait impulsive and therefore vulnerable individuals. Our approach extends the multiple levels of analysis perspective beyond simple Biology × Environment interactions by evaluating impulsivity across physiological systems (genetic, autonomic, hormonal, neural), psychological constructs (social, affective, motivational), developmental epochs (preschool, middle childhood, adolescence, adulthood), sexes (male, female), and methods of inquiry (self-report, informant report, treatment outcome, cardiovascular, electrophysiological, neuroimaging). By conducting our research using any and all available methods across these levels of analysis, we have arrived at a developmental model of trait impulsivity that we believe confers a greater understanding of this highly heritable trait and captures at least some heterogeneity in key behavioral outcomes, including delinquency and suicide. PMID:22781868

  20. Evaluating AIDS Prevention: Contributions of Multiple Disciplines.

    ERIC Educational Resources Information Center

    Leviton, Laura C., Ed.; And Others

    1990-01-01

    Seven essays on efforts to evaluate prevention programs aimed at the acquired immune deficiency syndrome (AIDS) are presented. Topics include public health psychology, mathematical models of epidemiology, estimates of incubation periods, ethnographic evaluations of AIDS prevention programs, an AIDS education model, theory-based evaluation, and…

  1. EXPLAINING FOREST COMPOSITION AND BIOMASS ACROSS MULTIPLE BIOGEOGRAPHIC REGIONS

    EPA Science Inventory

    Current scientific concerns regarding the impacts of global change include the responses of forest composition and biomass to rapid changes in climate, and forest gap models have often been used to address this issue. These models reflect the concept that forest composition and...

  2. Listening to food workers: Factors that impact proper health and hygiene practice in food service.

    PubMed

    Clayton, Megan L; Clegg Smith, Katherine; Neff, Roni A; Pollack, Keshia M; Ensminger, Margaret

    2015-01-01

    Foodborne disease is a significant problem worldwide. Research exploring sources of outbreaks indicates a pronounced role for food workers' improper health and hygiene practice. The objective was to investigate food workers' perceptions of factors that impact proper food safety practice, through interviews with food service workers in Baltimore, MD, USA discussing food safety practices and factors that impact implementation in the workplace. A social ecological model organizes multiple levels of influence on health and hygiene behavior. Issues raised by interviewees include factors across the five levels of the social ecological model, and confirm findings from previous work. Interviews also reveal many factors not highlighted in prior work, including issues with food service policies and procedures, working conditions (e.g., pay and benefits), community resources, and state and federal policies. Food safety interventions should adopt an ecological orientation that accounts for factors at multiple levels, including workers' social and structural context, that impact food safety practice.

  3. A Global Repository for Planet-Sized Experiments and Observations

    NASA Technical Reports Server (NTRS)

    Williams, Dean; Balaji, V.; Cinquini, Luca; Denvil, Sebastien; Duffy, Daniel; Evans, Ben; Ferraro, Robert D.; Hansen, Rose; Lautenschlager, Michael; Trenham, Claire

    2016-01-01

    Working across U.S. federal agencies, international agencies, and multiple worldwide data centers, and spanning seven international network organizations, the Earth System Grid Federation (ESGF) allows users to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. Its architecture employs a system of geographically distributed peer nodes that are independently administered yet united by common federation protocols and application programming interfaces (APIs). The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP) output used by the Intergovernmental Panel on Climate Change assessment reports. Data served by ESGF not only include model output (i.e., CMIP simulation runs) but also include observational data from satellites and instruments, reanalyses, and generated images. Metadata summarize basic information about the data for fast and easy data discovery.

  4. Shape Transformations of Epithelial Shells

    PubMed Central

    Misra, Mahim; Audoly, Basile; Kevrekidis, Ioannis G.; Shvartsman, Stanislav Y.

    2016-01-01

    Regulated deformations of epithelial sheets are frequently foreshadowed by patterning of their mechanical properties. The connection between patterns of cell properties and the emerging tissue deformations is studied in multiple experimental systems, but the general principles remain poorly understood. For instance, it is in general unclear what determines the direction in which the patterned sheet is going to bend and whether the resulting shape transformation will be discontinuous or smooth. Here these questions are explored computationally, using vertex models of epithelial shells assembled from prismlike cells. In response to rings and patches of apical cell contractility, model epithelia smoothly deform into invaginated or evaginated shapes similar to those observed in embryos and tissue organoids. Most of the observed effects can be captured by a simpler model with polygonal cells, modified to include the effects of the apicobasal polarity and natural curvature of epithelia. Our models can be readily extended to include the effects of multiple constraints and used to describe a wide range of morphogenetic processes. PMID:27074691

  5. Modelling Concentrating Solar Power with Thermal Energy Storage for Integration Studies (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummon, M.; Jorgenson, J.; Denholm, P.

    2013-10-01

    Concentrating solar power with thermal energy storage (CSP-TES) can provide multiple benefits to the grid, including low marginal cost energy and the ability to levelize load, provide operating reserves, and provide firm capacity. It is challenging to properly value the integration of CSP because of the complicated nature of this technology. Unlike completely dispatchable fossil sources, CSP is a limited energy resource, depending on the hourly and daily supply of solar energy. To optimize the use of this limited energy, CSP-TES must be implemented in a production cost model with multiple decision variables for the operation of the CSP-TES plant. We develop and implement a CSP-TES plant in a production cost model that accurately characterizes the three main components of the plant: solar field, storage tank, and power block. We show the effect of various modelling simplifications on the value of CSP, including: scheduled versus optimized dispatch from the storage tank and energy-only operation versus co-optimization with ancillary services.
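
    A toy sketch of the dispatch-optimization idea: choose hourly discharge from thermal storage to the power block to maximize energy value against a price signal, subject to a storage balance. This is a stand-in for a full production cost model; the prices, solar profile, and plant sizes are assumptions.

```python
# Single-day linear program: decision variables are hourly discharge and storage state.
import numpy as np
from scipy.optimize import linprog

H = 24
solar = np.clip(np.sin(np.pi * (np.arange(H) - 6) / 12), 0, None) * 300    # MWth collected
price = 30 + 20 * np.exp(-((np.arange(H) - 18) ** 2) / 8)                  # $/MWh, evening peak
cap, smax, eff = 100.0, 1000.0, 0.4   # power block (MWth), storage (MWhth), thermal-to-electric

# variables: d[0..H-1] (discharge to power block), s[0..H-1] (storage state)
c = np.concatenate([-price * eff, np.zeros(H)])     # minimize negative revenue
A_eq = np.zeros((H, 2 * H))
b_eq = np.zeros(H)
for h in range(H):
    A_eq[h, h] = 1.0                  # + d[h]
    A_eq[h, H + h] = 1.0              # + s[h]
    if h > 0:
        A_eq[h, H + h - 1] = -1.0     # - s[h-1]
    b_eq[h] = solar[h]                # storage balance: s[h] = s[h-1] + solar[h] - d[h]
bounds = [(0, cap)] * H + [(0, smax)] * H

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("optimal daily energy revenue: $%.0f" % (-res.fun))
```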

  6. Modelling Concentrating Solar Power with Thermal Energy Storage for Integration Studies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummon, M.; Denholm, P.; Jorgenson, J.

    2013-10-01

    Concentrating solar power with thermal energy storage (CSP-TES) can provide multiple benefits to the grid, including low marginal cost energy and the ability to levelize load, provide operating reserves, and provide firm capacity. It is challenging to properly value the integration of CSP because of the complicated nature of this technology. Unlike completely dispatchable fossil sources, CSP is a limited energy resource, depending on the hourly and daily supply of solar energy. To optimize the use of this limited energy, CSP-TES must be implemented in a production cost model with multiple decision variables for the operation of the CSP-TES plant. We develop and implement a CSP-TES plant in a production cost model that accurately characterizes the three main components of the plant: solar field, storage tank, and power block. We show the effect of various modelling simplifications on the value of CSP, including: scheduled versus optimized dispatch from the storage tank and energy-only operation versus co-optimization with ancillary services.

  7. Impaired neurosteroid synthesis in multiple sclerosis

    PubMed Central

    Noorbakhsh, Farshid; Ellestad, Kristofor K.; Maingat, Ferdinand; Warren, Kenneth G.; Han, May H.; Steinman, Lawrence; Baker, Glen B.

    2011-01-01

    High-throughput technologies have led to advances in the recognition of disease pathways and their underlying mechanisms. To investigate the impact of micro-RNAs on the disease process in multiple sclerosis, a prototypic inflammatory neurological disorder, we examined cerebral white matter from patients with or without the disease by micro-RNA profiling, together with confirmatory reverse transcription–polymerase chain reaction analysis, immunoblotting and gas chromatography-mass spectrometry. These observations were verified using the in vivo multiple sclerosis model, experimental autoimmune encephalomyelitis. Brains of patients with or without multiple sclerosis demonstrated differential expression of multiple micro-RNAs, but expression of three neurosteroid synthesis enzyme-specific micro-RNAs (miR-338, miR-155 and miR-491) showed a bias towards induction in patients with multiple sclerosis (P < 0.05). Analysis of the neurosteroidogenic pathways targeted by micro-RNAs revealed suppression of enzyme transcript and protein levels in the white matter of patients with multiple sclerosis (P < 0.05). This was confirmed by firefly/Renilla luciferase micro-RNA target knockdown experiments (P < 0.05) and detection of specific micro-RNAs by in situ hybridization in the brains of patients with or without multiple sclerosis. Levels of important neurosteroids, including allopregnanolone, were suppressed in the white matter of patients with multiple sclerosis (P < 0.05). Induction of the murine micro-RNAs, miR-338 and miR-155, accompanied by diminished expression of neurosteroidogenic enzymes and allopregnanolone, was also observed in the brains of mice with experimental autoimmune encephalomyelitis (P < 0.05). Allopregnanolone treatment of the experimental autoimmune encephalomyelitis mouse model limited the associated neuropathology, including neuroinflammation, myelin and axonal injury and reduced neurobehavioral deficits (P < 0.05). These multi-platform studies point to impaired neurosteroidogenesis in both multiple sclerosis and experimental autoimmune encephalomyelitis. The findings also indicate that allopregnanolone and perhaps other neurosteroid-like compounds might represent potential biomarkers or therapies for multiple sclerosis. PMID:21908875

  8. Modeling non-linear growth responses to temperature and hydrology in wetland trees

    NASA Astrophysics Data System (ADS)

    Keim, R.; Allen, S. T.

    2016-12-01

    Growth responses of wetland trees to flooding and climate variations are difficult to model because they depend on multiple, apparently interacting factors, but are a critical link in hydrological control of wetland carbon budgets. To more generally understand tree growth responses to hydrological forcing, we modeled non-linear responses of tree ring growth to flooding and climate at sub-annual time steps, using Vaganov-Shashkin response functions. We calibrated the model to six baldcypress tree-ring chronologies from two hydrologically distinct sites in southern Louisiana, and tested several hypotheses of plasticity in wetland tree responses to interacting environmental variables. The model outperformed traditional multiple linear regression. More importantly, optimized response parameters were generally similar among sites with varying hydrological conditions, suggesting generality to the functions. Model forms that included interacting responses to multiple forcing factors were more effective than were single response functions, indicating that the principle of a single limiting factor is not correct in wetlands and that both climatic and hydrological variables must be considered in predicting responses to hydrological or climate change.
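
    The sketch below illustrates the response-function idea on invented forcing series: sub-annual growth computed either as a product of piecewise-linear responses to temperature and water level (interacting factors) or as their minimum (single limiting factor). The thresholds are assumptions, not calibrated Vaganov-Shashkin values.

```python
# Piecewise-linear response functions applied to weekly temperature and flood stage.
import numpy as np

def ramp(x, lo, opt_lo, opt_hi, hi):
    """0 below lo and above hi, 1 on [opt_lo, opt_hi], linear in between."""
    return np.clip(np.minimum((x - lo) / (opt_lo - lo), (hi - x) / (hi - opt_hi)), 0, 1)

weeks = np.arange(52)
temp = 15 + 10 * np.sin(2 * np.pi * (weeks - 13) / 52)     # deg C, assumed seasonal cycle
stage = 1.0 + 0.8 * np.sin(2 * np.pi * weeks / 52)         # flood stage (m), assumed

fT = ramp(temp, 5, 18, 28, 38)          # temperature response
fW = ramp(stage, 0.0, 0.3, 1.2, 2.5)    # water-level response

growth_interacting = np.sum(fT * fW)             # interacting-responses form
growth_limiting = np.sum(np.minimum(fT, fW))     # single-limiting-factor form
print(f"interacting: {growth_interacting:.1f}, limiting-factor: {growth_limiting:.1f}")
```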

  9. How did the platypus get its sex chromosome chain? A comparison of meiotic multiples and sex chromosomes in plants and animals.

    PubMed

    Gruetzner, Frank; Ashley, Terry; Rowell, David M; Marshall Graves, Jennifer A

    2006-04-01

    The duck-billed platypus is an extraordinary mammal. Its chromosome complement is no less extraordinary, for it includes a system in which ten sex chromosomes form an extensive meiotic chain in males. Such meiotic multiples are unprecedented in vertebrates but occur sporadically in plant and invertebrate species. In this paper, we review the evolution and formation of meiotic multiples in plants and invertebrates to try to gain insights into the origin of the platypus meiotic multiple. We describe the meiotic hurdles that translocated mammalian chromosomes face, which make longer chains disadvantageous in mammals, and we discuss how sex chromosomes and dosage compensation might have affected the evolution of sex-linked meiotic multiples. We conclude that the evolutionary conservation of the chain in monotremes, the structural properties of the translocated chromosomes and the highly accurate segregation at meiosis make the platypus system remarkably different from meiotic multiples in other species. We discuss alternative evolutionary models, which fall broadly into two categories: either the chain is the result of a sequence of translocation events from an ancestral pair of sex chromosomes (Model I) or the entire chain came into being at once by hybridization of two populations with different chromosomal rearrangements sharing monobrachial homology (Model II).

  10. Society of Thoracic Surgeons 2008 cardiac risk models predict in-hospital mortality of heart valve surgery in a Chinese population: a multicenter study.

    PubMed

    Wang, Lv; Lu, Fang-Lin; Wang, Chong; Tan, Meng-Wei; Xu, Zhi-yun

    2014-12-01

    The Society of Thoracic Surgeons 2008 cardiac surgery risk models have been developed for heart valve surgery with and without coronary artery bypass grafting. The aim of our study was to evaluate the performance of Society of Thoracic Surgeons 2008 cardiac risk models in Chinese patients undergoing single valve surgery and the predicted mortality rates of those undergoing multiple valve surgery derived from the Society of Thoracic Surgeons 2008 risk models. A total of 12,170 patients underwent heart valve surgery from January 2008 to December 2011. Combined congenital heart surgery and aortal surgery cases were excluded. A relatively small number of valve surgery combinations were excluded. The final research population included the following isolated heart valve surgery types: aortic valve replacement, mitral valve replacement, and mitral valve repair. The following combined valve surgery types were included: mitral valve replacement plus tricuspid valve repair, mitral valve replacement plus aortic valve replacement, and mitral valve replacement plus aortic valve replacement and tricuspid valve repair. Evaluation was performed by using the Hosmer-Lemeshow test and C-statistics. Data from 9846 patients were analyzed. The Society of Thoracic Surgeons 2008 cardiac risk models showed reasonable discrimination and poor calibration (C-statistic, 0.712; P = .00006 in Hosmer-Lemeshow test). Society of Thoracic Surgeons 2008 models had better discrimination (C-statistic, 0.734) and calibration (P = .5805) in patients undergoing isolated valve surgery than in patients undergoing multiple valve surgery (C-statistic, 0.694; P = .00002 in Hosmer-Lemeshow test). Estimates derived from the Society of Thoracic Surgeons 2008 models exceeded the mortality rates of multiple valve surgery (observed/expected ratios of 1.44 for multiple valve surgery and 1.17 for single valve surgery). The Society of Thoracic Surgeons 2008 cardiac surgery risk models performed well when predicting the mortality for Chinese patients undergoing valve surgery. The Society of Thoracic Surgeons 2008 models were suitable for single valve surgery in a Chinese population; estimates of mortality for multiple valve surgery derived from the Society of Thoracic Surgeons 2008 models were less accurate. Copyright © 2014 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
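
    A sketch of the two validation measures used, discrimination via the C-statistic (ROC AUC) and calibration via a Hosmer-Lemeshow-style decile test, applied to simulated predicted risks and outcomes rather than STS model output.

```python
# Simulated predicted risks and outcomes (deliberately mis-calibrated), then
# the C-statistic and a Hosmer-Lemeshow-style chi-square over risk deciles.
import numpy as np
from scipy.stats import chi2
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
pred_risk = rng.beta(2, 30, size=5000)            # predicted in-hospital mortality
died = rng.random(5000) < pred_risk * 1.2         # observed outcomes

c_stat = roc_auc_score(died, pred_risk)           # discrimination

edges = np.quantile(pred_risk, np.linspace(0, 1, 11))
group = np.clip(np.digitize(pred_risk, edges[1:-1]), 0, 9)
hl = 0.0
for g in range(10):                               # calibration over risk deciles
    idx = group == g
    n, obs, exp = idx.sum(), died[idx].sum(), pred_risk[idx].sum()
    hl += (obs - exp) ** 2 / (exp * (1 - exp / n))
print(f"C-statistic {c_stat:.3f}, Hosmer-Lemeshow chi2 {hl:.1f}, p = {chi2.sf(hl, df=8):.4f}")
```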

  11. Computational Models and Emergent Properties of Respiratory Neural Networks

    PubMed Central

    Lindsey, Bruce G.; Rybak, Ilya A.; Smith, Jeffrey C.

    2012-01-01

    Computational models of the neural control system for breathing in mammals provide a theoretical and computational framework bringing together experimental data obtained from different animal preparations under various experimental conditions. Many of these models were developed in parallel and iteratively with experimental studies and provided predictions guiding new experiments. This data-driven modeling approach has advanced our understanding of respiratory network architecture and neural mechanisms underlying generation of the respiratory rhythm and pattern, including their functional reorganization under different physiological conditions. Models reviewed here vary in neurobiological details and computational complexity and span multiple spatiotemporal scales of respiratory control mechanisms. Recent models describe interacting populations of respiratory neurons spatially distributed within the Bötzinger and pre-Bötzinger complexes and rostral ventrolateral medulla that contain core circuits of the respiratory central pattern generator (CPG). Network interactions within these circuits along with intrinsic rhythmogenic properties of neurons form a hierarchy of multiple rhythm generation mechanisms. The functional expression of these mechanisms is controlled by input drives from other brainstem components, including the retrotrapezoid nucleus and pons, which regulate the dynamic behavior of the core circuitry. The emerging view is that the brainstem respiratory network has rhythmogenic capabilities at multiple levels of circuit organization. This allows flexible, state-dependent expression of different neural pattern-generation mechanisms under various physiological conditions, enabling a wide repertoire of respiratory behaviors. Some models consider control of the respiratory CPG by pulmonary feedback and network reconfiguration during defensive behaviors such as cough. Future directions in modeling of the respiratory CPG are considered. PMID:23687564

  12. Efficient computation of the joint sample frequency spectra for multiple populations.

    PubMed

    Kamm, John A; Terhorst, Jonathan; Song, Yun S

    2017-01-01

    A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity.
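
    A minimal illustration of the summary statistic itself (not momi's algorithms): the joint sample frequency spectrum for two populations computed from simulated derived-allele counts.

```python
# Count derived-allele configurations across sites to build a joint SFS.
import numpy as np

rng = np.random.default_rng(8)
n1, n2, sites = 10, 8, 5000                 # haploid sample sizes, number of SNPs
freq = rng.beta(0.2, 0.8, size=sites)       # assumed per-site derived-allele frequencies
d1 = rng.binomial(n1, freq)                 # derived copies observed in population 1
d2 = rng.binomial(n2, freq)                 # derived copies observed in population 2

# joint SFS: entry (i, j) = number of sites with i derived copies in pop1 and j in pop2
joint_sfs = np.zeros((n1 + 1, n2 + 1), dtype=int)
np.add.at(joint_sfs, (d1, d2), 1)
joint_sfs[0, 0] = joint_sfs[n1, n2] = 0     # drop configurations monomorphic in the sample
print(joint_sfs.sum(), "polymorphic sites;", joint_sfs[1, 0], "singletons private to pop1")
```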

  13. Efficient computation of the joint sample frequency spectra for multiple populations

    PubMed Central

    Kamm, John A.; Terhorst, Jonathan; Song, Yun S.

    2016-01-01

    A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity. PMID:28239248

  14. Older age, higher perceived disability and depressive symptoms predict the amount and severity of work-related difficulties in persons with multiple sclerosis.

    PubMed

    Raggi, Alberto; Giovannetti, Ambra Mara; Schiavolin, Silvia; Brambilla, Laura; Brenna, Greta; Confalonieri, Paolo Agostino; Cortese, Francesca; Frangiamore, Rita; Leonardi, Matilde; Mantegazza, Renato Emilio; Moscatelli, Marco; Ponzio, Michela; Torri Clerici, Valentina; Zaratin, Paola; De Torres, Laura

    2018-04-16

    This cross-sectional study aims to identify the predictors of work-related difficulties in a sample of employed persons with multiple sclerosis as addressed with the Multiple Sclerosis Questionnaire for Job Difficulties. Hierarchical linear regression analysis was conducted to identify predictors of work difficulties: predictors included demographic variables (age, formal education), disease duration and severity, perceived disability and psychological variables (cognitive dysfunction, depression and anxiety). The targets were the questionnaire's overall score and its six subscales. A total of 177 participants (108 females, aged 21-63) were recruited. Age, perceived disability and depression were direct and significant predictors of the questionnaire total score, and the final model explained 43.7% of its variation. The models built on the questionnaire's subscales showed that perceived disability and depression were direct and significant predictors of most of its subscales. Our results show that, among patients with multiple sclerosis, those who were older and had higher perceived disability and more depressive symptoms reported more numerous and more severe work-related difficulties. The Multiple Sclerosis Questionnaire for Job Difficulties can be fruitfully exploited to plan tailored actions to limit the likelihood of near-future job loss in persons of working age with multiple sclerosis. Implications for rehabilitation: Difficulties with work are common among people with multiple sclerosis and are usually addressed in terms of unemployment or job loss. The Multiple Sclerosis Questionnaire for Job Difficulties is a disease-specific questionnaire developed to address the amount and severity of work-related difficulties. We found that work-related difficulties were associated with older age, higher perceived disability and depressive symptoms. Mental health issues and perceived disability should be consistently included in future research targeting work-related difficulties.

  15. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies.

    PubMed

    Barnes, Samuel R; Ng, Thomas S C; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V; Jacobs, Russell E

    2015-06-16

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software tool for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software to provide reliable fits using multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP for both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was implemented and tested using simulations. Its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP.
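    One widely used DCE-MRI kinetic model is the standard Tofts model, in which the tissue concentration is Ct(t) = Ktrans ∫₀ᵗ Cp(τ) exp(−kep (t − τ)) dτ. The Python sketch below fits Ktrans and kep to synthetic data with SciPy; it is a generic illustration, not ROCKETSHIP's MATLAB implementation, and the biexponential arterial input function is an assumption.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def aif(t):
        # Hypothetical biexponential arterial input function (illustrative assumption)
        return 5.0 * (np.exp(-0.5 * t) + 0.2 * np.exp(-0.05 * t)) * (t > 0)

    def tofts(t, ktrans, kep):
        # Ct(t) = Ktrans * integral_0^t Cp(tau) * exp(-kep * (t - tau)) dtau
        dt = t[1] - t[0]
        cp = aif(t)
        kernel = np.exp(-kep * t)
        return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

    t = np.linspace(0, 10, 200)                           # minutes
    truth = tofts(t, ktrans=0.25, kep=0.8)
    noisy = truth + np.random.normal(0, 0.01, t.size)
    (ktrans_fit, kep_fit), _ = curve_fit(tofts, t, noisy, p0=[0.1, 0.5])
    print(ktrans_fit, kep_fit)
    ```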

  16. A Knowledge-Modeling Approach to Integrate Multiple Clinical Practice Guidelines to Provide Evidence-Based Clinical Decision Support for Managing Comorbid Conditions.

    PubMed

    Abidi, Samina

    2017-10-26

    Clinical management of comorbidities is a challenge, especially in a clinical decision support setting, as it requires the safe and efficient reconciliation of multiple disease-specific clinical procedures to formulate a comorbid therapeutic plan that is both effective and safe for the patient. In this paper we pursue the integration of multiple disease-specific Clinical Practice Guidelines (CPG) in order to manage comorbidities within a computerized Clinical Decision Support System (CDSS). We present a CPG integration framework-termed COMET (Comorbidity Ontological Modeling & ExecuTion)-that manifests a knowledge management approach to model, computerize and integrate multiple CPG to yield a comorbid CPG knowledge model that upon execution can provide evidence-based recommendations for handling comorbid patients. COMET exploits semantic web technologies to achieve (a) CPG knowledge synthesis to translate a paper-based CPG to disease-specific clinical pathways (CP) that include specialized co-morbidity management procedures based on input from domain experts; (b) CPG knowledge modeling to computerize the disease-specific CP using a Comorbidity CPG ontology; (c) CPG knowledge integration by aligning multiple ontologically-modeled CP to develop a unified comorbid CPG knowledge model; and (d) CPG knowledge execution using reasoning engines to derive CPG-mediated recommendations for managing patients with comorbidities. We present a web-accessible COMET CDSS that provides family physicians with CPG-mediated comorbidity decision support to manage Atrial Fibrillation and Chronic Heart Failure. We present our qualitative and quantitative analysis of the knowledge content and usability of COMET CDSS.

  17. Modeling Complex Equilibria in ITC Experiments: Thermodynamic Parameters Estimation for a Three Binding Site Model

    PubMed Central

    Le, Vu H.; Buscaglia, Robert; Chaires, Jonathan B.; Lewis, Edwin A.

    2013-01-01

    Isothermal Titration Calorimetry, ITC, is a powerful technique that can be used to estimate a complete set of thermodynamic parameters (e.g. Keq (or ΔG), ΔH, ΔS, and n) for a ligand binding interaction described by a thermodynamic model. Thermodynamic models are constructed by combining equilibrium constant, mass balance, and charge balance equations for the system under study. Commercial ITC instruments are supplied with software that includes a number of simple interaction models, for example one binding site, two binding sites, sequential sites, and n-independent binding sites. More complex models, for example those with three or more binding sites, one site with multiple binding mechanisms, linked equilibria, or equilibria involving macromolecular conformational selection through ligand binding, need to be developed on a case-by-case basis by the ITC user. In this paper we provide an algorithm (and a link to our MATLAB program) for the non-linear regression analysis of a multiple binding site model with up to four overlapping binding equilibria. Error analysis demonstrates that fitting ITC data for multiple parameters (e.g. up to nine parameters in the three binding site model) yields thermodynamic parameters with acceptable accuracy. PMID:23262283
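    As a much simpler illustration than the three-site model in the paper, the sketch below fits a single-site binding isotherm by nonlinear regression: the bound fraction follows from the mass-balance quadratic, and ΔG = −RT ln Keq recovers the free energy from the fitted constant. The variable names and synthetic data are assumptions for illustration; this is not the authors' MATLAB code.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    R = 1.987e-3  # gas constant, kcal / (mol K)

    def bound_fraction(L_tot, K, M_tot=10e-6):
        # Single-site model: [ML] is the smaller root of
        # [ML]^2 - (M_tot + L_tot + 1/K)[ML] + M_tot*L_tot = 0
        b = M_tot + L_tot + 1.0 / K
        ml = (b - np.sqrt(b**2 - 4.0 * M_tot * L_tot)) / 2.0
        return ml / M_tot                          # fraction of macromolecule bound

    L = np.linspace(1e-6, 50e-6, 30)               # total ligand (M)
    data = bound_fraction(L, K=2e5) + np.random.normal(0, 0.01, L.size)
    (K_fit,), _ = curve_fit(bound_fraction, L, data, p0=[1e5])
    dG = -R * 298.15 * np.log(K_fit)               # kcal/mol at 25 C
    print(K_fit, dG)
    ```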

  18. Characterizing multiple timescales of stream and storage zone interaction that affect solute fate and transport in streams

    USGS Publications Warehouse

    Choi, Jungyill; Harvey, Judson W.; Conklin, Martha H.

    2000-01-01

    The fate of contaminants in streams and rivers is affected by exchange and biogeochemical transformation in slowly moving or stagnant flow zones that interact with rapid flow in the main channel. In a typical stream, there are multiple types of slowly moving flow zones in which exchange and transformation occur, such as stagnant or recirculating surface water as well as subsurface hyporheic zones. However, most investigators use transport models with just a single storage zone in their modeling studies, which assumes that the effects of multiple storage zones can be lumped together. Our study addressed the following question: Can a single‐storage zone model reliably characterize the effects of physical retention and biogeochemical reactions in multiple storage zones? We extended an existing stream transport model with a single storage zone to include a second storage zone. With the extended model we generated 500 data sets representing transport of nonreactive and reactive solutes in stream systems that have two different types of storage zones with variable hydrologic conditions. The one storage zone model was tested by optimizing the lumped storage parameters to achieve a best fit for each of the generated data sets. Multiple storage processes were categorized as possessing I, additive; II, competitive; or III, dominant storage zone characteristics. The classification was based on the goodness of fit of generated data sets, the degree of similarity in mean retention time of the two storage zones, and the relative distributions of exchange flux and storage capacity between the two storage zones. For most cases (>90%) the one storage zone model described either the effect of the sum of multiple storage processes (category I) or the dominant storage process (category III). Failure of the one storage zone model occurred mainly for category II, that is, when one of the storage zones had a much longer mean retention time (ts ratio > 5.0) and when the dominance of storage capacity and exchange flux occurred in different storage zones. We also used the one storage zone model to estimate a “single” lumped rate constant representing the net removal of a solute by biogeochemical reactions in multiple storage zones. For most cases the lumped rate constant that was optimized by one storage zone modeling estimated the flux‐weighted rate constant for multiple storage zones. Our results explain how the relative hydrologic properties of multiple storage zones (retention time, storage capacity, exchange flux, and biogeochemical reaction rate constant) affect the reliability of lumped parameters determined by a one storage zone transport model. We conclude that stream transport models with a single storage compartment will in most cases reliably characterize the dominant physical processes of solute retention and biogeochemical reactions in streams with multiple storage zones.
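    For reference, a standard formulation of the transient storage equations underlying this class of stream transport models, written here for a conservative solute with two storage zones, is sketched below; the paper's exact notation and numerical implementation may differ.

    ```latex
    % Transient storage model with two storage zones (conservative solute);
    % a standard OTIS-type formulation, not necessarily the cited model's exact notation.
    \begin{align}
    \frac{\partial C}{\partial t} &= -\frac{Q}{A}\frac{\partial C}{\partial x}
      + \frac{1}{A}\frac{\partial}{\partial x}\!\left(A D \frac{\partial C}{\partial x}\right)
      + \alpha_1 \left(C_{s1} - C\right) + \alpha_2 \left(C_{s2} - C\right), \\
    \frac{d C_{sj}}{d t} &= \alpha_j \frac{A}{A_{sj}} \left(C - C_{sj}\right), \qquad j = 1, 2,
    \end{align}
    ```

    where C is the main-channel concentration, C_sj the storage-zone concentrations, Q the discharge, A and A_sj the channel and storage cross-sectional areas, D the dispersion coefficient, and α_j the exchange coefficients; the mean retention time of zone j is t_sj = A_sj / (α_j A).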

  19. Developing multiple-choices test items as tools for measuring the scientific-generic skills on solar system

    NASA Astrophysics Data System (ADS)

    Bhakti, Satria Seto; Samsudin, Achmad; Chandra, Didi Teguh; Siahaan, Parsaoran

    2017-05-01

    The aim of this research is to develop multiple-choice test items as tools for measuring scientific generic skills on the solar system. To achieve this aim, the researchers used the ADDIE model, consisting of Analysis, Design, Development, Implementation, and Evaluation, as the research method. The scientific generic skills were limited to five indicators: (1) indirect observation, (2) awareness of scale, (3) logical inference, (4) causal relations, and (5) mathematical modeling. The participants were 32 students at a junior high school in Bandung. The results show that the constructed multiple-choice test items were declared valid by expert validators and, after testing, are able to measure scientific generic skills on the solar system.

  20. Multiple model analysis with discriminatory data collection (MMA-DDC): A new method for improving measurement selection

    NASA Astrophysics Data System (ADS)

    Kikuchi, C.; Ferre, P. A.; Vrugt, J. A.

    2011-12-01

    Hydrologic models are developed, tested, and refined based on the ability of those models to explain available hydrologic data. The optimization of model performance based upon mismatch between model outputs and real world observations has been extensively studied. However, identification of plausible models is sensitive not only to the models themselves - including model structure and model parameters - but also to the location, timing, type, and number of observations used in model calibration. Therefore, careful selection of hydrologic observations has the potential to significantly improve the performance of hydrologic models. In this research, we seek to reduce prediction uncertainty through optimization of the data collection process. A new tool - multiple model analysis with discriminatory data collection (MMA-DDC) - was developed to address this challenge. In this approach, multiple hydrologic models are developed and treated as competing hypotheses. Potential new data are then evaluated on their ability to discriminate between competing hypotheses. MMA-DDC is well-suited for use in recursive mode, in which new observations are continuously used in the optimization of subsequent observations. This new approach was applied to a synthetic solute transport experiment, in which ranges of parameter values constitute the multiple hydrologic models, and model predictions are calculated using likelihood-weighted model averaging. MMA-DDC was used to determine the optimal location, timing, number, and type of new observations. From comparison with an exhaustive search of all possible observation sequences, we find that MMA-DDC consistently selects observations which lead to the highest reduction in model prediction uncertainty. We conclude that using MMA-DDC to evaluate potential observations may significantly improve the performance of hydrologic models while reducing the cost associated with collecting new data.
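    A minimal conceptual sketch of the core idea, under assumed simplifications (a small discrete set of competing models with Gaussian likelihood weights, and candidate observations ranked by how strongly the weighted models disagree about them); this is illustrative only and not the authors' MMA-DDC code.

    ```python
    import numpy as np

    def likelihood_weights(residuals, sigma=1.0):
        """Gaussian likelihood weights for each competing model given its residuals
        (shape: n_models x n_obs) against the data collected so far."""
        loglik = -0.5 * np.sum((residuals / sigma) ** 2, axis=1)
        w = np.exp(loglik - loglik.max())
        return w / w.sum()

    def rank_candidate_observations(pred, weights):
        """Score each candidate observation by the likelihood-weighted spread of the
        competing models' predictions: larger spread discriminates better.
        pred shape: n_models x n_candidates."""
        mean = weights @ pred
        var = weights @ (pred - mean) ** 2
        return np.argsort(var)[::-1]            # most discriminatory candidates first

    # Toy example: 3 models, 2 observations already collected, 4 candidate measurements
    residuals = np.array([[0.1, -0.2], [0.5, 0.4], [-0.1, 0.1]])
    w = likelihood_weights(residuals)
    candidate_predictions = np.array([[1.0, 2.0, 3.0, 4.0],
                                      [1.1, 2.5, 2.0, 4.0],
                                      [0.9, 1.5, 5.0, 4.1]])
    print(rank_candidate_observations(candidate_predictions, w))
    ```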

  1. Consistent radiative transfer modeling of active and passive observations of precipitation

    NASA Astrophysics Data System (ADS)

    Adams, Ian

    2016-04-01

    Spaceborne platforms such as the Tropical Rainfall Measurement Mission (TRMM) and the Global Precipitation Measurement (GPM) mission exploit a combination of active and passive sensors to provide a greater understanding of the three-dimensional structure of precipitation. While "operationalized" retrieval algorithms require fast forward models, the ability to perform higher fidelity simulations is necessary in order to understand the physics of remote sensing problems by testing assumptions and developing parameterizations for the fast models. To ensure proper synergy between active and passive modeling, forward models must be consistent when modeling the responses of radars and radiometers. This work presents a self-consistent transfer model for simulating radar reflectivities and millimeter wave brightness temperatures for precipitating scenes. To accomplish this, we extended the Atmospheric Radiative Transfer Simulator (ARTS) version 2.3 to solve the radiative transfer equation for active sensors and multiple scattering conditions. Early versions of ARTS (1.1) included a passive Monte Carlo solver, and ARTS is capable of handling atmospheres of up to three dimensions with ellipsoidal planetary geometries. The modular nature of ARTS facilitates extensibility, and the well-developed ray-tracing tools are suited for implementation of Monte Carlo algorithms. Finally, since ARTS handles the full Stokes vector, co- and cross-polarized reflectivity products are possible for scenarios that include nonspherical particles, with or without preferential alignment. The accuracy of the forward model will be demonstrated with precipitation events observed by TRMM and GPM, and the effects of multiple scattering will be detailed. The three-dimensional nature of the radiative transfer model will be useful for understanding the effects of nonuniform beamfill and multiple scattering for spatially heterogeneous precipitation events. The targets of this forward model are GPM (the Dual-wavelength Precipitation Radar (DPR) and GPM Microwave Imager (GMI)).

  2. Managing data from multiple disciplines, scales, and sites to support synthesis and modeling

    USGS Publications Warehouse

    Olson, R. J.; Briggs, J. M.; Porter, J.H.; Mah, Grant R.; Stafford, S.G.

    1999-01-01

    The synthesis and modeling of ecological processes at multiple spatial and temporal scales involves bringing together and sharing data from numerous sources. This article describes a data and information system model that facilitates assembling, managing, and sharing diverse data from multiple disciplines, scales, and sites to support integrated ecological studies. Cross-site scientific-domain working groups coordinate the development of data associated with their particular scientific working group, including decisions about data requirements, data to be compiled, data formats, derived data products, and schedules across the sites. The Web-based data and information system consists of nodes for each working group plus a central node that provides data access, project information, data query, and other functionality. The approach incorporates scientists and computer experts in the working groups and provides incentives for individuals to submit documented data to the data and information system.

  3. Computing Fault Displacements from Surface Deformations

    NASA Technical Reports Server (NTRS)

    Lyzenga, Gregory; Parker, Jay; Donnellan, Andrea; Panero, Wendy

    2006-01-01

    Simplex is a computer program that calculates locations and displacements of subterranean faults from data on Earth-surface deformations. The calculation involves inversion of a forward model (given a point source representing a fault, a forward model calculates the surface deformations) for displacements, and strains caused by a fault located in isotropic, elastic half-space. The inversion involves the use of nonlinear, multiparameter estimation techniques. The input surface-deformation data can be in multiple formats, with absolute or differential positioning. The input data can be derived from multiple sources, including interferometric synthetic-aperture radar, the Global Positioning System, and strain meters. Parameters can be constrained or free. Estimates can be calculated for single or multiple faults. Estimates of parameters are accompanied by reports of their covariances and uncertainties. Simplex has been tested extensively against forward models and against other means of inverting geodetic data and seismic observations. This work
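    As an illustration of inverting a forward model with nonlinear estimation, the sketch below fits the location, depth, and strength of a buried point source to synthetic surface displacements using SciPy least squares. The toy forward model (vertical displacement falling off as d/(r² + d²)^(3/2)) is an assumption for illustration only; it is not the elastic half-space dislocation model that Simplex actually inverts.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def forward(params, x, y):
        # Toy point-source forward model: surface uplift from a source of
        # strength s buried at (x0, y0) and depth d. Illustrative only.
        x0, y0, d, s = params
        r2 = (x - x0) ** 2 + (y - y0) ** 2
        return s * d / (r2 + d ** 2) ** 1.5

    # Synthetic "observed" deformations at scattered surface stations
    rng = np.random.default_rng(0)
    x, y = rng.uniform(-10, 10, 50), rng.uniform(-10, 10, 50)
    truth = np.array([1.0, -2.0, 3.0, 40.0])
    obs = forward(truth, x, y) + rng.normal(0, 0.01, x.size)

    # Nonlinear inversion: minimize residuals between forward model and observations
    fit = least_squares(lambda p: forward(p, x, y) - obs, x0=[0.0, 0.0, 5.0, 10.0])
    print(fit.x)
    ```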

  4. Escaping the snare of chronological growth and launching a free curve alternative: general deviance as latent growth model.

    PubMed

    Wood, Phillip Karl; Jackson, Kristina M

    2013-08-01

    Researchers studying longitudinal relationships among multiple problem behaviors sometimes characterize autoregressive relationships across constructs as indicating "protective" or "launch" factors or as "developmental snares." These terms are used to indicate that initial or intermediary states of one problem behavior subsequently inhibit or promote some other problem behavior. Such models are contrasted with models of "general deviance" over time in which all problem behaviors are viewed as indicators of a common linear trajectory. When fit of the "general deviance" model is poor and fit of one or more autoregressive models is good, this is taken as support for the inhibitory or enhancing effect of one construct on another. In this paper, we argue that researchers consider competing models of growth before comparing deviance and time-bound models. Specifically, we propose use of the free curve slope intercept (FCSI) growth model (Meredith & Tisak, 1990) as a general model to typify change in a construct over time. The FCSI model includes, as nested special cases, several statistical models often used for prospective data, such as linear slope intercept models, repeated measures multivariate analysis of variance, various one-factor models, and hierarchical linear models. When considering models involving multiple constructs, we argue the construct of "general deviance" can be expressed as a single-trait multimethod model, permitting a characterization of the deviance construct over time without requiring restrictive assumptions about the form of growth over time. As an example, prospective assessments of problem behaviors from the Dunedin Multidisciplinary Health and Development Study (Silva & Stanton, 1996) are considered and contrasted with earlier analyses of Hussong, Curran, Moffitt, and Caspi (2008), which supported launch and snare hypotheses. For antisocial behavior, the FCSI model fit better than other models, including the linear chronometric growth curve model used by Hussong et al. For models including multiple constructs, a general deviance model involving a single trait and multimethod factors (or a corresponding hierarchical factor model) fit the data better than either the "snares" alternatives or the general deviance model previously considered by Hussong et al. Taken together, the analyses support the view that linkages and turning points cannot be contrasted with general deviance models absent additional experimental intervention or control.

  5. Escaping the snare of chronological growth and launching a free curve alternative: General deviance as latent growth model

    PubMed Central

    WOOD, PHILLIP KARL; JACKSON, KRISTINA M.

    2014-01-01

    Researchers studying longitudinal relationships among multiple problem behaviors sometimes characterize autoregressive relationships across constructs as indicating “protective” or “launch” factors or as “developmental snares.” These terms are used to indicate that initial or intermediary states of one problem behavior subsequently inhibit or promote some other problem behavior. Such models are contrasted with models of “general deviance” over time in which all problem behaviors are viewed as indicators of a common linear trajectory. When fit of the “general deviance” model is poor and fit of one or more autoregressive models is good, this is taken as support for the inhibitory or enhancing effect of one construct on another. In this paper, we argue that researchers consider competing models of growth before comparing deviance and time-bound models. Specifically, we propose use of the free curve slope intercept (FCSI) growth model (Meredith & Tisak, 1990) as a general model to typify change in a construct over time. The FCSI model includes, as nested special cases, several statistical models often used for prospective data, such as linear slope intercept models, repeated measures multivariate analysis of variance, various one-factor models, and hierarchical linear models. When considering models involving multiple constructs, we argue the construct of “general deviance” can be expressed as a single-trait multimethod model, permitting a characterization of the deviance construct over time without requiring restrictive assumptions about the form of growth over time. As an example, prospective assessments of problem behaviors from the Dunedin Multidisciplinary Health and Development Study (Silva & Stanton, 1996) are considered and contrasted with earlier analyses of Hussong, Curran, Moffitt, and Caspi (2008), which supported launch and snare hypotheses. For antisocial behavior, the FCSI model fit better than other models, including the linear chronometric growth curve model used by Hussong et al. For models including multiple constructs, a general deviance model involving a single trait and multimethod factors (or a corresponding hierarchical factor model) fit the data better than either the “snares” alternatives or the general deviance model previously considered by Hussong et al. Taken together, the analyses support the view that linkages and turning points cannot be contrasted with general deviance models absent additional experimental intervention or control. PMID:23880389

  6. EPA's SHEDS-multimedia model: children's cumulative pyrethroid exposure estimates and evaluation against NHANES biomarker data

    EPA Science Inventory

    The U.S. EPA's SHEDS-Multimedia model was applied to enhance the understanding of children's exposures and doses to multiple pyrethroid pesticides, including major contributing chemicals and pathways. This paper presents combined dietary and residential exposure estimates and cum...

  7. Final state interactions and inclusive nuclear collisions

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Dubey, Rajendra R.

    1993-01-01

    A scattering formalism is developed in a multiple scattering model to describe inclusive momentum distributions for high-energy projectiles. The effects of final state interactions on response functions and momentum distributions are investigated. Calculations for high-energy protons that include shell model response functions are compared with experiments.

  8. Modelling in forest management

    Treesearch

    Mark J. Twery

    2004-01-01

    Forest management has traditionally been considered management of trees for timber. It really includes vegetation management and land management and people management as multiple objectives. As such, forest management is intimately linked with other topics in this volume, most especially those chapters on ecological modelling and human dimensions. The key to...

  9. Predicting daily use of urban forest recreation sites

    Treesearch

    John F. Dwyer

    1988-01-01

    A multiple linear regression model explains 90% of the variance in daily use of an urban recreation site. Explanatory variables include season, day of the week, and weather. The results offer guides for recreation site planning and management as well as suggestions for improving the model.
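    A minimal sketch of such a model with hypothetical data, using ordinary least squares with season, day-of-week, and weather predictors (the variable names and data are assumptions, not the study's dataset):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 200
    df = pd.DataFrame({
        "season": rng.choice(["spring", "summer", "fall", "winter"], n),
        "weekend": rng.integers(0, 2, n),
        "temp_c": rng.normal(18, 8, n),
        "rain_mm": rng.exponential(2, n),
    })
    df["daily_use"] = (300 + 150 * df["weekend"] + 8 * df["temp_c"]
                       - 20 * df["rain_mm"] + rng.normal(0, 40, n))

    model = smf.ols("daily_use ~ C(season) + weekend + temp_c + rain_mm", data=df).fit()
    print(model.rsquared)        # proportion of variance in daily use explained
    ```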

  10. Default Bayes Factors for Model Selection in Regression

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Morey, Richard D.

    2012-01-01

    In this article, we present a Bayes factor solution for inference in multiple regression. Bayes factors are principled measures of the relative evidence from data for various models or positions, including models that embed null hypotheses. In this regard, they may be used to state positive evidence for a lack of an effect, which is not possible…
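    The article develops default (JZS-type) Bayes factors; as a rough and commonly used stand-in, a Bayes factor can be approximated from the BIC difference of two fitted regressions, BF10 ≈ exp((BIC0 − BIC1)/2). The sketch below uses that approximation with hypothetical data; it is not the authors' default-prior construction.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 100
    df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
    df["y"] = 0.4 * df["x1"] + rng.normal(size=n)       # x2 has no true effect

    null = smf.ols("y ~ x1", data=df).fit()             # model without x2
    alt = smf.ols("y ~ x1 + x2", data=df).fit()         # model with x2

    # BIC approximation: BF_10 ~ exp((BIC_null - BIC_alt) / 2)
    bf10 = np.exp((null.bic - alt.bic) / 2)
    print(bf10)   # values < 1 indicate evidence for the null (no effect of x2)
    ```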

  11. Merging for Particle-Mesh Complex Particle Kinetic Modeling of the Multiple Plasma Beams

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.

    2011-01-01

    We suggest a merging procedure for the Particle-Mesh Complex Particle Kinetic (PMCPK) method in the case of inter-penetrating flows (multiple plasma beams). We examine the standard particle-in-cell (PIC) and the PMCPK methods in the case of particle acceleration by shock surfing for a wide range of the control numerical parameters. The plasma dynamics is described by a hybrid (particle-ion, fluid-electron) model. Note that a mesh may be needed when the electromagnetic field must be computed in the model. Our calculations use specified, time-independent electromagnetic fields for the shock, rather than self-consistently generated fields. While the particle-mesh method is a well-verified approach, the CPK method seems to be a good approach for multiscale modeling that includes multiple regions with various particle/fluid plasma behavior. However, the CPK method still needs verification for studying basic plasma phenomena: particle heating and acceleration by collisionless shocks, magnetic field reconnection, beam dynamics, etc.

  12. RVA: A Plugin for ParaView 3.14

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keefer, Donald A.; Shaffer, Eric G.; Storsved, Brynne

    A free software application, RVA, has been developed as a plugin to the US DOE-funded ParaView visualization package, to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed as an open-source plugin to the 64-bit Windows version of ParaView 3.14. RVA was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoir visualization and analysis needs, including: sophisticated connectivity analysis, cross sections through simulation results between selected wells, simplified volumetric calculations, global vertical exaggeration adjustments, ingestion of UTChem simulation results, ingestion of Isatis geostatistical framework models, interrogation of joint geologic and reservoir modeling results, joint visualization and analysis of well history files, location-targeted visualization, advanced correlation analysis, visualization of flow paths, and creation of static images and animations highlighting targeted reservoir features.

  13. RVA: A Plugin for ParaView 3.14

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-09-04

    RVA is a plugin developed for the 64-bit Windows version of the ParaView 3.14 visualization package. RVA is designed to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoir visualization and analysis needs, including: sophisticated connectivity analysis, cross sections through simulation results between selected wells, simplified volumetric calculations, global vertical exaggeration adjustments, ingestion of UTChem simulation results, ingestion of Isatis geostatistical framework models, interrogation of joint geologic and reservoir modeling results, joint visualization and analysis of well history files, location-targeted visualization, advanced correlation analysis, visualization of flow paths, and creation of static images and animations highlighting targeted reservoir features.

  14. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing ranging from sub-millisecond temporal coding in the sound localization circuity of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons.
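    As a small illustration of a Hodgkin-Huxley-type model of macroscopic membrane currents, the sketch below integrates the classic single-compartment HH equations (squid axon parameters) with forward Euler; it is a textbook sketch, not tied to any specific study cited in the review.

    ```python
    import numpy as np

    # Classic Hodgkin-Huxley single-compartment model (squid axon parameters).
    C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3        # uF/cm^2, mS/cm^2
    E_Na, E_K, E_L = 50.0, -77.0, -54.4                # reversal potentials, mV

    def a_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    def b_m(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
    def a_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
    def b_h(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    def a_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    def b_n(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)

    dt, T, I_ext = 0.01, 50.0, 10.0                    # ms, ms, uA/cm^2
    V, m, h, n = -65.0, 0.05, 0.6, 0.32                # initial state near rest
    trace = []
    for _ in range(int(T / dt)):
        I_Na = g_Na * m**3 * h * (V - E_Na)
        I_K = g_K * n**4 * (V - E_K)
        I_L = g_L * (V - E_L)
        V += dt * (I_ext - I_Na - I_K - I_L) / C_m     # membrane equation
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)      # gating-variable kinetics
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        trace.append(V)
    print(max(trace))   # peak voltage of the action potential train (mV)
    ```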

  15. Universal core model for multiple-gate field-effect transistors with short channel and quantum mechanical effects

    NASA Astrophysics Data System (ADS)

    Shin, Yong Hyeon; Bae, Min Soo; Park, Chuntaek; Park, Joung Won; Park, Hyunwoo; Lee, Yong Ju; Yun, Ilgu

    2018-06-01

    A universal core model for multiple-gate (MG) field-effect transistors (FETs) with short channel effects (SCEs) and quantum mechanical effects (QMEs) is proposed. Using a Young's-approximation-based solution of the one-dimensional Poisson equation, the total inversion charge density (Q_inv) in the channel is modeled for double-gate (DG) and surrounding-gate (SG) FETs, following which a universal charge model is derived based on the similarity of the solutions, including for quadruple-gate (QG) FETs. For triple-gate (TG) FETs, the average of the DG and QG models is used. An SCE model is also proposed, considering the potential difference between the channel's surface and center. Finally, a QME model for MG FETs is developed using a quantum correction compact model. The proposed universal core model is validated against three-dimensional numerical simulations in the commercially available ATLAS simulator.

  16. Classification techniques on computerized systems to predict and/or to detect Apnea: A systematic review.

    PubMed

    Pombo, Nuno; Garcia, Nuno; Bousson, Kouamana

    2017-03-01

    Sleep apnea syndrome (SAS), which can significantly decrease quality of life, is a major risk factor for health problems such as cardiovascular disease, sudden death, depression, irritability, hypertension, and learning difficulties. Thus, it is relevant and timely to present a systematic review describing significant applications of computational intelligence to SAS, including their performance, benefits and challenges, and modeling for decision-making across multiple scenarios. This study aims to systematically review the literature on systems for the detection and/or prediction of apnea events using a classification model. The forty-five included studies revealed a combination of classification techniques for the diagnosis of apnea, namely threshold-based (14.75%) and machine learning (ML) models (85.25%). The ML models, clustered in a mind map, include neural networks (44.26%), regression (4.91%), instance-based methods (11.47%), Bayesian algorithms (1.63%), reinforcement learning (4.91%), dimensionality reduction (8.19%), ensemble learning (6.55%), and decision trees (3.27%). A classification model should be auto-adaptive and should not depend on external human action. In addition, the accuracy of the classification models is related to effective feature selection. New high-quality studies based on randomized controlled trials and validation of models using large and diverse samples of data are recommended. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  17. Multiple-output support vector machine regression with feature selection for arousal/valence space emotion assessment.

    PubMed

    Torres-Valencia, Cristian A; Álvarez, Mauricio A; Orozco-Gutiérrez, Alvaro A

    2014-01-01

    Human emotion recognition (HER) allows the assessment of a subject's affective state. Until recently, such emotional states were described in terms of discrete emotions, like happiness or contempt. In order to cover a high range of emotions, researchers in the field have introduced different dimensional spaces for emotion description that allow the characterization of affective states in terms of several variables or dimensions that measure distinct aspects of the emotion. One of the most common of such dimensional spaces is the bidimensional Arousal/Valence space. To the best of our knowledge, all HER systems so far have modelled the dimensions in these spaces independently. In this paper, we study the effect of modelling the output dimensions simultaneously and show experimentally the advantages of modeling them in this way. We consider a multimodal approach by including features from the Electroencephalogram and a few physiological signals. For modelling the multiple outputs, we employ a multiple output regressor based on support vector machines. We also include a stage of feature selection that is developed within an embedded approach known as Recursive Feature Elimination (RFE), proposed initially for SVM. The results show that several features can be eliminated using the multiple output support vector regressor with RFE without affecting the performance of the regressor. From the analysis of the features selected in smaller subsets via RFE, it can be observed that the signals that are most informative for arousal and valence discrimination are the EEG, Electrooculogram/Electromyogram (EOG/EMG) and the Galvanic Skin Response (GSR).
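    A minimal scikit-learn sketch of the general approach (support vector regression with recursive feature elimination); note that the paper's multiple-output SVR models arousal and valence jointly, whereas this simplified sketch falls back to one selector/regressor per output dimension, and the synthetic features only stand in for the EEG and peripheral-signal features.

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.feature_selection import RFE

    rng = np.random.default_rng(3)
    X = rng.normal(size=(120, 30))                     # 120 trials, 30 candidate features
    arousal = X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.1, 120)
    valence = 0.8 * X[:, 1] + 0.4 * X[:, 3] + rng.normal(0, 0.1, 120)
    Y = np.column_stack([arousal, valence])

    # RFE needs an estimator exposing coef_, so a linear-kernel SVR is used here.
    for dim, name in enumerate(["arousal", "valence"]):
        selector = RFE(SVR(kernel="linear"), n_features_to_select=5)
        selector.fit(X, Y[:, dim])
        print(name, np.flatnonzero(selector.support_))   # indices of retained features
    ```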

  18. A Solution to the Cosmic Conundrum including Cosmological Constant and Dark Energy Problems

    NASA Astrophysics Data System (ADS)

    Singh, A.

    2009-12-01

    A comprehensive solution to the cosmic conundrum is presented that also resolves key paradoxes of quantum mechanics and relativity. A simple mathematical model, the Gravity Nullification model (GNM), is proposed that integrates the missing physics of the spontaneous relativistic conversion of mass to energy into the existing physics theories, specifically a simplified general theory of relativity. Mechanistic mathematical expressions are derived for a relativistic universe expansion, which predict both the observed linear Hubble expansion in the nearby universe and the accelerating expansion exhibited by the supernova observations. The integrated model addresses the key questions haunting physics and Big Bang cosmology. It also provides a fresh perspective on the misconceived birth and evolution of the universe, especially the creation and dissolution of matter. The proposed model eliminates singularities from existing models and the need for the incredible and unverifiable assumptions including the superluminous inflation scenario, multiple universes, multiple dimensions, Anthropic principle, and quantum gravity. GNM predicts the observed features of the universe without any explicit consideration of time as a governing parameter.

  19. Practical 3-D Beam Pattern Based Channel Modeling for Multi-Polarized Massive MIMO Systems.

    PubMed

    Aghaeinezhadfirouzja, Saeid; Liu, Hui; Balador, Ali

    2018-04-12

    In this paper, a practical non-stationary three-dimensional (3-D) channel model for massive multiple-input multiple-output (MIMO) systems, considering beam patterns for different antenna elements, is proposed. The beam patterns, obtained using dipole antenna elements with different phase excitations toward different directions of travel (DoTs), contribute various correlation weights for rays related towards/from the cluster, thus providing different elevation angles of arrival (EAoAs) and elevation angles of departure (EAoDs) for each antenna element. The model also includes the movements of the user, which make the channel non-stationary with respect to the clusters at the receiver (RX) on both the time and array axes. In addition, their impacts on 3-D massive MIMO channels are investigated via statistical properties including received spatial correlation. Additionally, the impact of elevation/azimuth angles of arrival on received spatial correlation is discussed. Furthermore, the proposed 3-D channel model's treatment of the azimuth and elevation angles of the polarized antenna is experimentally validated and compared through simulations. The proposed 3-D generic model is verified using relevant measurement data.

  20. Practical 3-D Beam Pattern Based Channel Modeling for Multi-Polarized Massive MIMO Systems †

    PubMed Central

    Aghaeinezhadfirouzja, Saeid; Liu, Hui

    2018-01-01

    In this paper, a practical non-stationary three-dimensional (3-D) channel model for massive multiple-input multiple-output (MIMO) systems, considering beam patterns for different antenna elements, is proposed. The beam patterns, obtained using dipole antenna elements with different phase excitations toward different directions of travel (DoTs), contribute various correlation weights for rays related towards/from the cluster, thus providing different elevation angles of arrival (EAoAs) and elevation angles of departure (EAoDs) for each antenna element. The model also includes the movements of the user, which make the channel non-stationary with respect to the clusters at the receiver (RX) on both the time and array axes. In addition, their impacts on 3-D massive MIMO channels are investigated via statistical properties including received spatial correlation. Additionally, the impact of elevation/azimuth angles of arrival on received spatial correlation is discussed. Furthermore, the proposed 3-D channel model's treatment of the azimuth and elevation angles of the polarized antenna is experimentally validated and compared through simulations. The proposed 3-D generic model is verified using relevant measurement data. PMID:29649177

  1. Multicollinearity is a red herring in the search for moderator variables: A guide to interpreting moderated multiple regression models and a critique of Iacobucci, Schneider, Popovich, and Bakamitsos (2016).

    PubMed

    McClelland, Gary H; Irwin, Julie R; Disatnik, David; Sivan, Liron

    2017-02-01

    Multicollinearity is irrelevant to the search for moderator variables, contrary to the implications of Iacobucci, Schneider, Popovich, and Bakamitsos (Behavior Research Methods, 2016, this issue). Multicollinearity is like the red herring in a mystery novel that distracts the statistical detective from the pursuit of a true moderator relationship. We show multicollinearity is completely irrelevant for tests of moderator variables. Furthermore, readers of Iacobucci et al. might be confused by a number of their errors. We note those errors, but more positively, we describe a variety of methods researchers might use to test and interpret their moderated multiple regression models, including two-stage testing, mean-centering, spotlighting, orthogonalizing, and floodlighting without regard to putative issues of multicollinearity. We cite a number of recent studies in the psychological literature in which the researchers used these methods appropriately to test, to interpret, and to report their moderated multiple regression models. We conclude with a set of recommendations for the analysis and reporting of moderated multiple regression that should help researchers better understand their models and facilitate generalizations across studies.
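    A small statsmodels sketch of the point about mean-centering: the interaction (moderator) coefficient and its test statistic are identical whether or not the predictors are centered, so collinearity between the predictors and their product does not affect the moderator test. The data and variable names here are hypothetical.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    n = 300
    df = pd.DataFrame({"x": rng.normal(size=n), "z": rng.normal(size=n)})
    df["y"] = 0.5 * df["x"] + 0.3 * df["z"] + 0.4 * df["x"] * df["z"] + rng.normal(size=n)

    raw = smf.ols("y ~ x * z", data=df).fit()                       # uncentered predictors

    df_c = df.assign(x=df.x - df.x.mean(), z=df.z - df.z.mean())    # mean-centered predictors
    centered = smf.ols("y ~ x * z", data=df_c).fit()

    # The x:z coefficient and its t-value are unchanged by mean-centering.
    print(raw.params["x:z"], raw.tvalues["x:z"])
    print(centered.params["x:z"], centered.tvalues["x:z"])
    ```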

  2. Efficient exploration of pan-cancer networks by generalized covariance selection and interactive web content

    PubMed Central

    Kling, Teresia; Johansson, Patrik; Sanchez, José; Marinescu, Voichita D.; Jörnsten, Rebecka; Nelander, Sven

    2015-01-01

    Statistical network modeling techniques are increasingly important tools to analyze cancer genomics data. However, current tools and resources are not designed to work across multiple diagnoses and technical platforms, thus limiting their applicability to comprehensive pan-cancer datasets such as The Cancer Genome Atlas (TCGA). To address this, we describe a new data driven modeling method, based on generalized Sparse Inverse Covariance Selection (SICS). The method integrates genetic, epigenetic and transcriptional data from multiple cancers, to define links that are present in multiple cancers, a subset of cancers, or a single cancer. It is shown to be statistically robust and effective at detecting direct pathway links in data from TCGA. To facilitate interpretation of the results, we introduce a publicly accessible tool (cancerlandscapes.org), in which the derived networks are explored as interactive web content, linked to several pathway and pharmacological databases. To evaluate the performance of the method, we constructed a model for eight TCGA cancers, using data from 3900 patients. The model rediscovered known mechanisms and contained interesting predictions. Possible applications include prediction of regulatory relationships, comparison of network modules across multiple forms of cancer and identification of drug targets. PMID:25953855
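    A minimal sketch of sparse inverse covariance selection with scikit-learn's graphical lasso; this is only the generic SICS building block on synthetic data, not the generalized multi-platform, multi-cancer extension described in the paper.

    ```python
    import numpy as np
    from sklearn.covariance import GraphicalLassoCV

    rng = np.random.default_rng(5)
    n_samples, n_genes = 200, 15
    X = rng.normal(size=(n_samples, n_genes))
    X[:, 1] += 0.8 * X[:, 0]              # induce one direct dependency
    X[:, 2] += 0.6 * X[:, 1]              # and a chained one

    model = GraphicalLassoCV().fit(X)     # cross-validated sparsity penalty
    precision = model.precision_          # nonzero off-diagonals suggest direct links
    links = np.argwhere(np.triu(np.abs(precision) > 1e-3, k=1))
    print(links)
    ```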

  3. Children's representations of multiple family relationships: organizational structure and development in early childhood.

    PubMed

    Schermerhorn, Alice C; Cummings, E Mark; Davies, Patrick T

    2008-02-01

    The authors examine mutual family influence processes at the level of children's representations of multiple family relationships, as well as the structure of those representations. From a community sample with 3 waves, each spaced 1 year apart, kindergarten-age children (105 boys and 127 girls) completed a story-stem completion task, tapping representations of multiple family relationships. Structural equation modeling with autoregressive controls indicated that representational processes involving different family relationships were interrelated over time, including links between children's representations of marital conflict and reactions to conflict, between representations of security about marital conflict and parent-child relationships, and between representations of security in father-child and mother-child relationships. Mixed support was found for notions of increasing stability in representations during this developmental period. Results are discussed in terms of notions of transactional family dynamics, including family-wide perspectives on mutual influence processes attributable to multiple family relationships.

  4. The Water Erosion Prediction Project (WEPP) model for saturation excess conditions: application to an agricultural and a forested watershed.

    NASA Astrophysics Data System (ADS)

    Crabtree, B.; Brooks, E.; Ostrowski, K.; Elliot, W. J.; Boll, J.

    2006-12-01

    We incorporated saturation excess overland flow processes in the Water Erosion Prediction Project (WEPP) model for the evaluation of human disturbances in watersheds. In this presentation, we present results of applying the modified WEPP model to two watersheds: an agricultural watershed with mixed land use, and a forested watershed. The agricultural watershed is Paradise Creek, an intensively monitored watershed with continuous climate, flow and sediment data collection at multiple locations. Restoration efforts in Paradise Creek watershed include changing to minimal tillage or no-tillage systems, and implementation of structural practices. The forested watershed is the 28 km² Mica Creek Experimental Watershed (MCEW), where disturbances include clear and partial cutting, and road building. The MCEW has a nested study design, which allows for the analysis of cumulative effects as well as the traditional comparison of treatment versus control. Mica Creek watershed is a high elevation watershed where streamflow is generated mostly by snowmelt. Treatments include road building in 1997, and clearcut and partial-cut logging in 2001. Our results include the simulation of streamflow and sediment delivery at multiple locations within each watershed, and evaluation of the human disturbances.

  5. Network meta-analysis of multiple outcome measures accounting for borrowing of information across outcomes

    PubMed Central

    2014-01-01

    Background Network meta-analysis (NMA) enables simultaneous comparison of multiple treatments while preserving randomisation. When summarising evidence to inform an economic evaluation, it is important that the analysis accurately reflects the dependency structure within the data, as correlations between outcomes may have implication for estimating the net benefit associated with treatment. A multivariate NMA offers a framework for evaluating multiple treatments across multiple outcome measures while accounting for the correlation structure between outcomes. Methods The standard NMA model is extended to multiple outcome settings in two stages. In the first stage, information is borrowed across outcomes as well across studies through modelling the within-study and between-study correlation structure. In the second stage, we make use of the additional assumption that intervention effects are exchangeable between outcomes to predict effect estimates for all outcomes, including effect estimates on outcomes where evidence is either sparse or the treatment had not been considered by any one of the studies included in the analysis. We apply the methods to binary outcome data from a systematic review evaluating the effectiveness of nine home safety interventions on uptake of three poisoning prevention practices (safe storage of medicines, safe storage of other household products, and possession of poison centre control telephone number) in households with children. Analyses are conducted in WinBUGS using Markov Chain Monte Carlo (MCMC) simulations. Results Univariate and the first stage multivariate models produced broadly similar point estimates of intervention effects but the uncertainty around the multivariate estimates varied depending on the prior distribution specified for the between-study covariance structure. The second stage multivariate analyses produced more precise effect estimates while enabling intervention effects to be predicted for all outcomes, including intervention effects on outcomes not directly considered by the studies included in the analysis. Conclusions Accounting for the dependency between outcomes in a multivariate meta-analysis may or may not improve the precision of effect estimates from a network meta-analysis compared to analysing each outcome separately. PMID:25047164

  6. The Swedish version of the Acceptance of Chronic Health Conditions Scale for people with multiple sclerosis: Translation, cultural adaptation and psychometric properties.

    PubMed

    Forslin, Mia; Kottorp, Anders; Kierkegaard, Marie; Johansson, Sverker

    2016-11-11

    To translate and culturally adapt the Acceptance of Chronic Health Conditions (ACHC) Scale for people with multiple sclerosis into Swedish, and to analyse the psychometric properties of the Swedish version. Ten people with multiple sclerosis participated in translation and cultural adaptation of the ACHC Scale; 148 people with multiple sclerosis were included in evaluation of the psychometric properties of the scale. Translation and cultural adaptation were carried out through translation and back-translation, by expert committee evaluation and pre-test with cognitive interviews in people with multiple sclerosis. The psychometric properties of the Swedish version were evaluated using Rasch analysis. The Swedish version of the ACHC Scale was an acceptable equivalent to the original version. Seven of the original 10 items fitted the Rasch model and demonstrated ability to separate between groups. A 5-item version, including 2 items and 3 super-items, demonstrated better psychometric properties, but lower ability to separate between groups. The Swedish version of the ACHC Scale with the original 10 items did not fit the Rasch model. Two solutions, either with 7 items (ACHC-7) or with 2 items and 3 super-items (ACHC-5), demonstrated acceptable psychometric properties. Use of the ACHC-5 Scale with super-items is recommended, since this solution adjusts for local dependency among items.

  7. Multiple-time scales analysis of physiological time series under neural control

    NASA Technical Reports Server (NTRS)

    Peng, C. K.; Hausdorff, J. M.; Havlin, S.; Mietus, J. E.; Stanley, H. E.; Goldberger, A. L.

    1998-01-01

    We discuss multiple-time scale properties of neurophysiological control mechanisms, using heart rate and gait regulation as model systems. We find that scaling exponents can be used as prognostic indicators. Furthermore, detection of more subtle degradation of scaling properties may provide a novel early warning system in subjects with a variety of pathologies including those at high risk of sudden death.
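    One standard way to obtain such scaling exponents from a physiological time series is detrended fluctuation analysis (DFA); the sketch below is a compact, generic first-order DFA implementation, not the authors' code.

    ```python
    import numpy as np

    def dfa_alpha(x, scales):
        """First-order detrended fluctuation analysis: returns the scaling
        exponent alpha (slope of log F(n) versus log n)."""
        y = np.cumsum(x - np.mean(x))                        # integrated profile
        F = []
        for n in scales:
            segments = len(y) // n
            rms = []
            for i in range(segments):
                seg = y[i * n:(i + 1) * n]
                t = np.arange(n)
                trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
                rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
            F.append(np.mean(rms))
        return np.polyfit(np.log(scales), np.log(F), 1)[0]

    # White noise should give alpha close to 0.5
    print(dfa_alpha(np.random.default_rng(6).normal(size=5000),
                    scales=[16, 32, 64, 128, 256]))
    ```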

  8. Aspects of porosity prediction using multivariate linear regression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byrnes, A.P.; Wilson, M.D.

    1991-03-01

    Highly accurate multiple linear regression models have been developed for sandstones of diverse compositions. Porosity reduction or enhancement processes are controlled by the fundamental variables, Pressure (P), Temperature (T), Time (t), and Composition (X), where composition includes mineralogy, size, sorting, fluid composition, etc. The multiple linear regression equation, of which all linear porosity prediction models are subsets, takes the generalized form: Porosity = C0 + C1(P) + C2(T) + C3(X) + C4(t) + C5(PT) + C6(PX) + C7(Pt) + C8(TX) + C9(Tt) + C10(Xt) + C11(PTX) + C12(PXt) + C13(PTt) + C14(TXt) + C15(PTXt). The first four primary variables are often interactive, thus requiring terms involving two or more primary variables (the form shown implies interaction and not necessarily multiplication). The final terms used may also involve simple mathematical transforms such as log X, e^T, X^2, or more complex transformations such as the Time-Temperature Index (TTI). The X term in the equation above represents a suite of compositional variables and, therefore, a fully expanded equation may include a series of terms incorporating these variables. Numerous published bivariate porosity prediction models involving P (or depth) or Tt (TTI) are effective to a degree, largely because of the high degree of colinearity between P and TTI. However, all such bivariate models ignore the unique contributions of P and Tt, as well as various X terms. These simpler models become poor predictors in regions where colinear relations change, where important variables have been ignored, or where the database does not include a sufficient range or weight distribution for the critical variables.
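    A sketch of fitting a model of this general form by ordinary least squares, building the interaction (product) terms explicitly from hypothetical P, T, X, t data; only a subset of the 16 terms is shown, and the coefficients and data are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 500
    P, T, X, t = (rng.uniform(size=n) for _ in range(4))   # hypothetical normalized predictors
    porosity = (30 - 8 * P - 5 * T - 3 * X - 2 * t
                - 4 * P * T - 1.5 * T * t + rng.normal(0, 1, n))

    # Design matrix with an intercept, the main effects, and two interaction terms
    A = np.column_stack([np.ones(n), P, T, X, t, P * T, T * t])
    coeffs, *_ = np.linalg.lstsq(A, porosity, rcond=None)
    print(np.round(coeffs, 2))   # C0, C1..C4, and the two interaction coefficients
    ```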

  9. Locally adaptive, spatially explicit projection of US population for 2030 and 2050.

    PubMed

    McKee, Jacob J; Rose, Amy N; Bright, Edward A; Huynh, Timmy; Bhaduri, Budhendra L

    2015-02-03

    Localized adverse events, including natural hazards, epidemiological events, and human conflict, underscore the criticality of quantifying and mapping current population. Building on the spatial interpolation technique previously developed for high-resolution population distribution data (LandScan Global and LandScan USA), we have constructed an empirically informed spatial distribution of projected population of the contiguous United States for 2030 and 2050, depicting one of many possible population futures. Whereas most current large-scale, spatially explicit population projections typically rely on a population gravity model to determine areas of future growth, our projection model departs from these by accounting for multiple components that affect population distribution. Modeled variables, which included land cover, slope, distances to larger cities, and a moving average of current population, were locally adaptive and geographically varying. The resulting weighted surface was used to determine which areas had the greatest likelihood for future population change. Population projections of county level numbers were developed using a modified version of the US Census's projection methodology, with the US Census's official projection as the benchmark. Applications of our model include incorporating multiple various scenario-driven events to produce a range of spatially explicit population futures for suitability modeling, service area planning for governmental agencies, consequence assessment, mitigation planning and implementation, and assessment of spatially vulnerable populations.
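
    A minimal sketch of the core idea, combining locally varying layers into a growth-likelihood surface and then allocating a county-level control total proportionally, is given below in Python. The layers, weights, and control total are invented for illustration; the actual model's covariates and locally adaptive weighting are far richer.

      import numpy as np

      # Hypothetical raster layers on the same grid: land-cover suitability, slope,
      # distance to a larger city, and a moving average of current population.
      rng = np.random.default_rng(8)
      shape = (50, 50)
      suitability = rng.uniform(size=shape)
      slope = rng.uniform(0, 30, size=shape)
      dist_to_city = rng.uniform(0, 100, size=shape)
      pop_smooth = rng.gamma(2.0, 50.0, size=shape)

      def normalize(layer, invert=False):
          z = (layer - layer.min()) / (layer.max() - layer.min())
          return 1 - z if invert else z

      # Weighted likelihood-of-growth surface (illustrative weights)
      weight_surface = (0.35 * normalize(suitability)
                        + 0.15 * normalize(slope, invert=True)        # steeper = less likely
                        + 0.20 * normalize(dist_to_city, invert=True)
                        + 0.30 * normalize(pop_smooth))

      # Allocate a county-level projected increment proportionally to the weights
      county_growth = 12000
      allocation = county_growth * weight_surface / weight_surface.sum()
      print(allocation.sum())   # equals the county control total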

  10. Childhood trauma and eating psychopathology: a mediating role for dissociation and emotion dysregulation?

    PubMed

    Moulton, Stuart J; Newman, Emily; Power, Kevin; Swanson, Vivien; Day, Kenny

    2015-01-01

    The present study examined the relationship between different forms of childhood trauma and eating psychopathology using a multiple mediation model that included emotion dysregulation and dissociation as hypothesised mediators. 142 female undergraduate psychology students studying at two British Universities participated in this cross-sectional study. Participants completed measures of childhood trauma (emotional abuse, physical abuse, sexual abuse, emotional neglect and physical neglect), eating psychopathology, dissociation and emotion dysregulation. Multiple mediation analysis was conducted to investigate the study's proposed model. Results revealed that the multiple mediation model significantly predicted eating psychopathology. Additionally, both emotion dysregulation and dissociation were found to be significant mediators between childhood trauma and eating psychopathology. A specific indirect effect was observed between childhood emotional abuse and eating psychopathology through emotion dysregulation. Findings support previous research linking childhood trauma to eating psychopathology. They indicate that multiple forms of childhood trauma should be assessed for individuals with eating disorders. The possible maintaining role of emotion regulation processes should also be considered in the treatment of eating disorders. Copyright © 2014 Elsevier Ltd. All rights reserved.
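
    The multiple mediation logic (indirect effect = a x b along each mediator path) can be sketched with two ordinary regressions, as below in Python. The simulated scores and effect sizes are hypothetical, and the published analysis additionally used bootstrapped confidence intervals for the indirect effects.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 142
      trauma = rng.normal(size=n)                      # hypothetical childhood-trauma score
      dysreg = 0.5 * trauma + rng.normal(size=n)       # mediator 1: emotion dysregulation
      dissoc = 0.3 * trauma + rng.normal(size=n)       # mediator 2: dissociation
      eating = 0.4 * dysreg + 0.2 * dissoc + 0.1 * trauma + rng.normal(size=n)

      def indirect_effects(x, m1, m2, y):
          """a*b indirect effects of x on y through each mediator."""
          a1 = sm.OLS(m1, sm.add_constant(x)).fit().params[1]
          a2 = sm.OLS(m2, sm.add_constant(x)).fit().params[1]
          b = sm.OLS(y, sm.add_constant(np.column_stack([x, m1, m2]))).fit().params
          return a1 * b[2], a2 * b[3]                  # (through m1, through m2)

      print(indirect_effects(trauma, dysreg, dissoc, eating))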

  11. Theoretical Studies on InGaAs/InAlAs SAGCM Avalanche Photodiodes

    NASA Astrophysics Data System (ADS)

    Cao, Siyu; Zhao, Yue; ur Rehman, Sajid; Feng, Shuai; Zuo, Yuhua; Li, Chuanbo; Zhang, Lichun; Cheng, Buwen; Wang, Qiming

    2018-05-01

    In this paper, we provide detailed insight into InGaAs/InAlAs separate absorption, grading, charge, and multiplication avalanche photodiodes (SAGCM APDs) and build a theoretical model of these APDs. Through theoretical analysis and two-dimensional (2D) simulation, the influence of the charge layer and of the tunneling effect on the APDs is fully understood. The design of the charge layer (including doping level and thickness) can be calculated by our predictive model for different multiplication thicknesses. We find that as the thickness of the charge layer increases, the suitable doping level range in the charge layer decreases. Compared to a thinner charge layer, the performance of an APD with a thicker charge layer varies significantly with doping concentration deviations of only a few percent. Moreover, the generation rate (G btt) of band-to-band tunneling is calculated, and the influence of the tunneling effect on the avalanche field is analyzed. We confirm that the avalanche field and multiplication factor (M n) in the multiplication layer will decrease because of the tunneling effect. The theoretical model and analysis are based on InGaAs/InAlAs APDs; however, they are applicable to other APD material systems as well.

  12. Comparison of CAM-Chem with Trace Gas Measurements from Airborne Field Campaigns from 2009-2016.

    NASA Astrophysics Data System (ADS)

    Schauffler, S.; Atlas, E. L.; Kinnison, D. E.; Lamarque, J. F.; Saiz-Lopez, A.; Navarro, M. A.; Donets, V.; Blake, D. R.; Blake, N. J.

    2016-12-01

    Trace gas measurements collected during seven field campaigns, two with multiple deployments, will be compared with the NCAR CAM-Chem model to evaluate the model performance over multiple years. The campaigns include HIPPO (2009-2011), pole-to-pole observations in the Pacific on the NSF/NCAR GV over multiple seasons; SEAC4RS (Aug./Sept. 2013) in the central and southern U.S. and western Gulf of Mexico on the NASA ER-2 and DC8; ATTREX (2011-2015) on the NASA Global Hawk over multiple seasons and locations; CONTRAST (Jan/Feb 2014) in the western Pacific on the NSF/NCAR GV; VIRGAS (Oct. 2015) in the south central US and western Gulf of Mexico on the NASA WB-57; ORCAS (Jan/Feb 2016) over the Southern Ocean on the NSF/NCAR GV; and POSIDON (Oct. 2016) in the western Pacific on the NASA WB-57. We will focus on comparisons with the model along the flight tracks and will also examine comparisons of vertical distributions and various tracer-tracer correlations.

  13. Effect of Multiple Scattering on the Compton Recoil Current Generated in an EMP, Revisited

    DOE PAGES

    Farmer, William A.; Friedman, Alex

    2015-06-18

    Multiple scattering has historically been treated in EMP modeling through the obliquity factor. The validity of this approach is examined here. A simplified model problem, which correctly captures cyclotron motion, Doppler shifting due to the electron motion, and multiple scattering is first considered. The simplified problem is solved three ways: the obliquity factor, Monte-Carlo, and Fokker-Planck finite-difference. Because of the Doppler effect, skewness occurs in the distribution. It is demonstrated that the obliquity factor does not correctly capture this skewness, but the Monte-Carlo and Fokker-Planck finite-difference approaches do. Here, the obliquity factor and Fokker-Planck finite-difference approaches are then compared in a fuller treatment, which includes the initial Klein-Nishina distribution of the electrons, and the momentum dependence of both drag and scattering. It is found that, in general, the obliquity factor is adequate for most situations. However, as the gamma energy increases and the Klein-Nishina becomes more peaked in the forward direction, skewness in the distribution causes greater disagreement between the obliquity factor and a more accurate model of multiple scattering.

  14. A Bayesian joint probability modeling approach for seasonal forecasting of streamflows at multiple sites

    NASA Astrophysics Data System (ADS)

    Wang, Q. J.; Robertson, D. E.; Chiew, F. H. S.

    2009-05-01

    Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors, such as antecedent streamflows, El Niño-Southern Oscillation indices, and other climate indicators. Bayesian inference of model parameters and uncertainties is implemented using Markov chain Monte Carlo sampling, leading to joint probabilistic forecasts of streamflows at multiple sites. The model provides a parametric structure for quantifying relationships between variables, including intersite correlations. The Box-Cox transformed multivariate normal distribution has considerable flexibility for modeling a wide range of predictors and predictands. The Bayesian inference formulated allows the use of data that contain nonconcurrent and missing records. The model flexibility and data-handling ability mean that the BJP modeling approach is potentially of wide practical application. The paper also presents a number of statistical measures and graphical methods for verification of probabilistic forecasts of continuous variables. Results for streamflows at three river gauges in the Murrumbidgee River catchment in southeast Australia show that the BJP modeling approach has good forecast quality and that the fitted model is consistent with observed data.
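
    To make the core idea concrete, transform each margin with Box-Cox, fit a joint multivariate normal, then condition on the observed predictor to obtain a probabilistic forecast, here is a minimal Python sketch. The data are synthetic, a single predictor is used, and the full BJP method's Bayesian parameter uncertainty (via MCMC) and missing-data handling are omitted.

      import numpy as np
      from scipy import stats

      # Hypothetical data: columns = predictor (antecedent flow) and two site streamflows
      rng = np.random.default_rng(3)
      raw = np.exp(rng.multivariate_normal([2.0, 1.5, 1.7],
                                           [[1.0, 0.6, 0.5],
                                            [0.6, 1.0, 0.7],
                                            [0.5, 0.7, 1.0]], size=300))

      # Box-Cox transform each margin, then fit a joint multivariate normal
      transformed, lambdas = zip(*(stats.boxcox(raw[:, j]) for j in range(raw.shape[1])))
      Z = np.column_stack(transformed)
      mu, cov = Z.mean(axis=0), np.cov(Z, rowvar=False)

      # Forecast: condition the joint normal on an observed predictor value (here 8.0)
      z0 = stats.boxcox(np.array([8.0]), lmbda=lambdas[0])[0]
      cond_mean = mu[1:] + cov[1:, 0] / cov[0, 0] * (z0 - mu[0])
      cond_cov = cov[1:, 1:] - np.outer(cov[1:, 0], cov[0, 1:]) / cov[0, 0]
      print(cond_mean, np.diag(cond_cov))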

  15. Faculty Sufficiency and AACSB Accreditation Compliance within a Global University: A Mathematical Modeling Approach

    ERIC Educational Resources Information Center

    Boronico, Jess; Murdy, Jim; Kong, Xinlu

    2014-01-01

    This manuscript proposes a mathematical model to address faculty sufficiency requirements towards assuring overall high quality management education at a global university. Constraining elements include full-time faculty coverage by discipline, location, and program, across multiple campus locations subject to stated service quality standards of…

  16. Project Physics Tests 5, Models of the Atom.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    Test items relating to Project Physics Unit 5 are presented in this booklet. Included are 70 multiple-choice and 23 problem-and-essay questions. Concepts of atomic model are examined on aspects of relativistic corrections, electron emission, photoelectric effects, Compton effect, quantum theories, electrolysis experiments, atomic number and mass,…

  17. Examining the Bricks and Mortar of Socioeconomic Status: An Empirical Comparison of Measurement Methods

    ERIC Educational Resources Information Center

    Markle, Ross Edward

    2010-01-01

    The impact of socioeconomic status (SES) on educational outcomes has been widely demonstrated in the fields of sociology, psychology, and educational research. Across these fields however, measurement models of SES vary, including single indicators (parental income, education, and occupation), multiple indicators, hierarchical models, and most…

  18. DEVELOPMENT AND PEER REVIEW OF TIME-TO-EFFECT MODELS FOR THE ANALYSIS OF NEUROTOXICITY AND OTHER TIME DEPENDENT DATA

    EPA Science Inventory

    Neurobehavioral studies pose unique challenges for dose-response modeling, including small sample size and relatively large intra-subject variation, repeated measurements over time, multiple endpoints with both continuous and ordinal scales, and time dependence of risk characteri...

  19. Creating Shared Value through Service-Learning in Management Education

    ERIC Educational Resources Information Center

    Nikolova, Natalia; Andersen, Lisa

    2017-01-01

    Service-learning has gained strong interest among educators as a model of experiential education through community engagement. Its potential to contribute to multiple stakeholders, including students, community partners, faculty, and university, is well recognized. While research has focused on elements of this teaching model that contribute to…

  20. An Evaluation of Curriculum Materials Based Upon the Socio-Scientific Reasoning Model.

    ERIC Educational Resources Information Center

    Henkin, Gayle; And Others

    To address the need to develop a scientifically literate citizenry, the socio-scientific reasoning model was created to guide curriculum development. Goals of this developmental approach include increasing: (1) students' skills in dealing with problems containing multiple interacting variables; (2) students' decision-making skills incorporating a…

  1. Examining Multiple Parenting Behaviors on Young Children's Dietary Fat Consumption

    ERIC Educational Resources Information Center

    Eisenberg, Christina M.; Ayala, Guadalupe X.; Crespo, Noe C.; Lopez, Nanette V.; Zive, Michelle Murphy; Corder, Kirsten; Wood, Christine; Elder, John P.

    2012-01-01

    Objective: To understand the association between parenting and children's dietary fat consumption, this study tested a comprehensive model of parenting that included parent household rules, parent modeling of rules, parent mediated behaviors, and parent support. Design: Cross-sectional. Setting: Baseline data from the "MOVE/me Muevo"…

  2. Purpose and Pedagogy: A Conceptual Model for an ePortfolio

    ERIC Educational Resources Information Center

    Buyarski, Catherine A.; Aaron, Robert W.; Hansen, Michele J.; Hollingsworth, Cynthia D.; Johnson, Charles A.; Kahn, Susan; Landis, Cynthia M.; Pedersen, Joan S.; Powell, Amy A.

    2015-01-01

    This conceptual model emerged from the need to balance multiple purposes and perspectives associated with developing an ePortfolio designed to promote student development and success. A comprehensive review of literature from various disciplines, theoretical frameworks, and scholarship, including self-authorship, reflection, ePortfolio pedagogy,…

  3. The importance of data curation on QSAR Modeling - PHYSPROP open data as a case study. (QSAR 2016)

    EPA Science Inventory

    During the last few decades many QSAR models and tools have been developed at the US EPA, including the widely used EPISuite. During this period the arsenal of computational capabilities supporting cheminformatics has broadened dramatically with multiple software packages. These ...

  4. Rethinking Validation in Complex High-Stakes Assessment Contexts

    ERIC Educational Resources Information Center

    Koch, Martha J.; DeLuca, Christopher

    2012-01-01

    In this article we rethink validation within the complex contexts of high-stakes assessment. We begin by considering the utility of existing models for validation and argue that these models tend to overlook some of the complexities inherent to assessment use, including the multiple interpretations of assessment purposes and the potential…

  5. Models, Measurements, and Local Decisions: Assessing and ...

    EPA Pesticide Factsheets

    This presentation includes a combination of modeling and measurement results to characterize near-source air quality in Newark, New Jersey, with consideration of how this information could be used to inform decision making to reduce the risk of health impacts. Decisions could include either exposure or emissions reduction and could involve a host of stakeholders, including residents, academics, NGOs, and local and federal agencies. This presentation includes results from the C-PORT modeling system and from a citizen science project in the local area. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  6. An investigation of multidisciplinary complex health care interventions - steps towards an integrative treatment model in the rehabilitation of People with Multiple Sclerosis

    PubMed Central

    2012-01-01

    Background The Danish Multiple Sclerosis Society initiated a large-scale bridge building and integrative treatment project to take place from 2004–2010 at a specialized Multiple Sclerosis (MS) hospital. In this project, a team of five conventional health care practitioners and five alternative practitioners was set up to work together in developing and offering individualized treatments to 200 people with MS. The purpose of this paper is to present results from the six-year treatment collaboration process regarding the development of an integrative treatment model. Discussion The collaborative work towards an integrative treatment model for people with MS involved six steps: 1) Working with an initial model 2) Unfolding the different treatment philosophies 3) Discussing the elements of the Intervention-Mechanism-Context-Outcome-scheme (the IMCO-scheme) 4) Phrasing the common assumptions for an integrative MS program theory 5) Developing the integrative MS program theory 6) Building the integrative MS treatment model. The model includes important elements of the different treatment philosophies represented in the team and thereby describes a common understanding of the complexity of the courses of treatment. Summary An integrative team of practitioners has developed an integrative model for combined treatments of people with Multiple Sclerosis. The model unites different treatment philosophies and focuses on process-oriented factors and the strengthening of the patients’ resources and competences on a physical, an emotional and a cognitive level. PMID:22524586

  7. A Generic Authentication LoA Derivation Model

    NASA Astrophysics Data System (ADS)

    Yao, Li; Zhang, Ning

    One way of achieving more fine-grained access control is to link an authentication level of assurance (LoA) derived from a requester’s authentication instance to the authorisation decision made for that requester. To realise this vision, there is a need for a LoA derivation model that supports the use and quantification of multiple LoA-affecting attributes and analyses their composite effect on a given authentication instance. This paper reports the design of such a model, namely a generic LoA derivation model (GEA-LoADM). GEA-LoADM takes into account multiple authentication attributes along with their relationships, abstracts the composite effect of the multiple attributes into a generic value, the authentication LoA, and provides algorithms for the run-time derivation of LoA. The algorithms are tailored to reflect the relationships among the attributes involved in an authentication instance. The model has a number of valuable properties, including flexibility and extensibility; it can be applied to different application contexts and supports easy addition of new attributes and removal of obsolete ones.

  8. An integrated data model to estimate spatiotemporal occupancy, abundance, and colonization dynamics.

    PubMed

    Williams, Perry J; Hooten, Mevin B; Womble, Jamie N; Esslinger, George G; Bower, Michael R; Hefley, Trevor J

    2017-02-01

    Ecological invasions and colonizations occur dynamically through space and time. Estimating the distribution and abundance of colonizing species is critical for efficient management or conservation. We describe a statistical framework for simultaneously estimating spatiotemporal occupancy and abundance dynamics of a colonizing species. Our method accounts for several issues that are common when modeling spatiotemporal ecological data including multiple levels of detection probability, multiple data sources, and computational limitations that occur when making fine-scale inference over a large spatiotemporal domain. We apply the model to estimate the colonization dynamics of sea otters (Enhydra lutris) in Glacier Bay, in southeastern Alaska. © 2016 by the Ecological Society of America.

  9. Subtle Nonlinearity in Popular Album Charts

    NASA Astrophysics Data System (ADS)

    Bentley, R. Alexander; Maschner, Herbert D. G.

    Large-scale patterns of culture change may be explained by models of self organized criticality, or alternatively, by multiplicative processes. We speculate that popular album activity may be similar to critical models of extinction in that interconnected agents compete to survive within a limited space. Here we investigate whether popular music albums as listed on popular album charts display evidence of self-organized criticality, including a self-affine time series of activity and power-law distributions of lifetimes and exit activity in the chart. We find it difficult to distinguish between multiplicative growth and critical model hypotheses for these data. However, aspects of criticality may be masked by the selective sampling that a "Top 200" listing necessarily implies.

  10. Application of constraint-based satellite mission planning model in forest fire monitoring

    NASA Astrophysics Data System (ADS)

    Guo, Bingjun; Wang, Hongfei; Wu, Peng

    2017-10-01

    In this paper, a constraint-based satellite mission planning model is established based on the idea of constraint satisfaction. It includes target, request, observation, satellite, payload and other elements, with constraints linking them. The optimization goal of the model is to make full use of time and resources and to improve the efficiency of target observation. A greedy algorithm is used in solving the model to produce the observation plan and the data transmission plan. Two simulation experiments are designed and carried out: routine monitoring of global forest fires and emergency monitoring of forest fires in Australia. The simulation results show that the model and algorithm perform well and that the model has good emergency response capability. Efficient and reasonable plans can be worked out with this model to meet users' needs in complex cases involving multiple payloads, multiple targets and variable priorities.
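
    A toy version of such a greedy planning step, ranking requests by priority and fitting each into its visibility window around already scheduled observations, is sketched below in Python. The Request fields, priorities, and windows are invented; the real model also handles payload, resource, and data-transmission constraints.

      from dataclasses import dataclass

      @dataclass
      class Request:
          target: str
          priority: int        # higher = more urgent
          start: float         # visibility window start (hours)
          end: float           # visibility window end (hours)
          duration: float      # required observation time (hours)

      def greedy_plan(requests):
          """Pick observations in priority order, skipping any that cannot fit
          around already-scheduled slots (a minimal stand-in for the paper's solver)."""
          plan = []
          for req in sorted(requests, key=lambda r: (-r.priority, r.start)):
              start = req.start
              # push the start time past any conflicting scheduled observation
              for _, s, e in sorted(plan, key=lambda p: p[1]):
                  if start < e and start + req.duration > s:
                      start = e
              if start + req.duration <= req.end:
                  plan.append((req.target, start, start + req.duration))
          return sorted(plan, key=lambda p: p[1])

      requests = [Request("fire_A", 3, 0.0, 4.0, 1.0),
                  Request("fire_B", 5, 0.5, 2.5, 1.0),
                  Request("routine", 1, 0.0, 6.0, 2.0)]
      print(greedy_plan(requests))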

  11. Multiscale sagebrush rangeland habitat modeling in southwest Wyoming

    USGS Publications Warehouse

    Homer, Collin G.; Aldridge, Cameron L.; Meyer, Debra K.; Coan, Michael J.; Bowen, Zachary H.

    2009-01-01

    Sagebrush-steppe ecosystems in North America have experienced dramatic elimination and degradation since European settlement. As a result, sagebrush-steppe dependent species have experienced drastic range contractions and population declines. Coordinated ecosystem-wide research, integrated with monitoring and management activities, would improve the ability to maintain existing sagebrush habitats. However, current data only identify resource availability locally, with rigorous spatial tools and models that accurately model and map sagebrush habitats over large areas still unavailable. Here we report on an effort to produce a rigorous large-area sagebrush-habitat classification and inventory with statistically validated products and estimates of precision in the State of Wyoming. This research employs a combination of significant new tools, including (1) modeling sagebrush rangeland as a series of independent continuous field components that can be combined and customized by any user at multiple spatial scales; (2) collecting ground-measured plot data on 2.4-meter imagery in the same season the satellite imagery is acquired; (3) effective modeling of ground-measured data on 2.4-meter imagery to maximize subsequent extrapolation; (4) acquiring multiple seasons (spring, summer, and fall) of an additional two spatial scales of imagery (30 meter and 56 meter) for optimal large-area modeling; (5) using regression tree classification technology that optimizes data mining of multiple image dates, ratios, and bands with ancillary data to extrapolate ground training data to coarser resolution sensors; and (6) employing rigorous accuracy assessment of model predictions to enable users to understand the inherent uncertainties. First-phase results modeled eight rangeland components (four primary targets and four secondary targets) as continuous field predictions. The primary targets included percent bare ground, percent herbaceousness, percent shrub, and percent litter. The four secondary targets included percent sagebrush (Artemisia spp.), percent big sagebrush (Artemisia tridentata), percent Wyoming sagebrush (Artemisia tridentata wyomingensis), and sagebrush height (centimeters). Results were validated by an independent accuracy assessment with root mean square error (RMSE) values ranging from 6.38 percent for bare ground to 2.99 percent for sagebrush at the QuickBird scale and RMSE values ranging from 12.07 percent for bare ground to 6.34 percent for sagebrush at the full Landsat scale. Subsequent project phases are now in progress, with plans to deliver products that improve accuracies of existing components, model new components, complete models over larger areas, track changes over time (from 1988 to 2007), and ultimately model wildlife population trends against these changes. We believe these results offer significant improvement in sagebrush rangeland quantification at multiple scales and offer users products that have been rigorously validated.

  12. Treatment of Fragile X Syndrome with a Neuroactive Steroid

    DTIC Science & Technology

    2014-08-01

    Figure 1) and GABA agonists (Figures 2 and 3). Currently, there are animal models of FXS that include the Fmr1-KO mouse and the Drosophila melanogaster ... the Drosophila (fruit fly) model of FXS that the GABAA system including multiple receptors is dramatically down-regulated. Ganaxolone is a drug that...810 males.14 The expansion of the trinucleotide sequence results in lowered FMRP levels. The premutation expansion results in a two- to eightfold

  13. Comparison of aerodynamic models for Vertical Axis Wind Turbines

    NASA Astrophysics Data System (ADS)

    Simão Ferreira, C.; Aagaard Madsen, H.; Barone, M.; Roscher, B.; Deglaire, P.; Arduin, I.

    2014-06-01

    Multi-megawatt Vertical Axis Wind Turbines (VAWTs) are experiencing an increased interest for floating offshore applications. However, VAWT development is hindered by the lack of fast, accurate and validated simulation models. This work compares six different numerical models for VAWTs: a multiple streamtube model, a double-multiple streamtube model, the actuator cylinder model, a 2D potential flow panel model, a 3D unsteady lifting line model, and a 2D conformal mapping unsteady vortex model. The comparison covers rotor configurations with two NACA0015 blades, for several tip speed ratios, rotor solidity and fixed pitch angle, including heavily loaded rotors, in inviscid flow. The results show that the streamtube models are inaccurate, and that correct predictions of rotor power and rotor thrust are an effect of error cancellation which only occurs at specific configurations. The other four models, which explicitly model the wake as a system of vorticity, show mostly differences due to the instantaneous or time-averaged formulation of the loading and flow, for which further research is needed.

  14. Applying a Multiple Group Causal Indicator Modeling Framework to the Reading Comprehension Skills of Third, Seventh, and Tenth Grade Students

    PubMed Central

    Tighe, Elizabeth L.; Wagner, Richard K.; Schatschneider, Christopher

    2015-01-01

    This study demonstrates the utility of applying a causal indicator modeling framework to investigate important predictors of reading comprehension in third, seventh, and tenth grade students. The results indicated that a 4-factor multiple indicator multiple cause (MIMIC) model of reading comprehension provided adequate fit at each grade level. This model included latent predictor constructs of decoding, verbal reasoning, nonverbal reasoning, and working memory and accounted for a large portion of the reading comprehension variance (73% to 87%) across grade levels. Verbal reasoning contributed the most unique variance to reading comprehension at all grade levels. In addition, we fit a multiple group 4-factor MIMIC model to investigate the relative stability (or variability) of the predictor contributions to reading comprehension across development (i.e., grade levels). The results revealed that the contributions of verbal reasoning, nonverbal reasoning, and working memory to reading comprehension were stable across the three grade levels. Decoding was the only predictor that could not be constrained to be equal across grade levels. The contribution of decoding skills to reading comprehension was higher in third grade and then remained relatively stable between seventh and tenth grade. These findings illustrate the feasibility of using MIMIC models to explain individual differences in reading comprehension across the development of reading skills. PMID:25821346

  15. [Feasibility of device closure for multiple atrial septal defects using 3D printing and ultrasound-guided intervention technique].

    PubMed

    Qiu, X; Lü, B; Xu, N; Yan, C W; Ouyang, W B; Liu, Y; Zhang, F W; Yue, Z Q; Pang, K J; Pan, X B

    2017-04-25

    Objective: To investigate the feasibility of trans-catheter closure of multiple atrial septal defects (ASD) monitored by trans-thoracic echocardiography (TTE) under the guidance of a 3D-printed heart model. Methods: Between April and August 2016, a total of 21 patients (8 male and 13 female) with multiple ASD in Fuwai Hospital of the Chinese Academy of Medical Sciences underwent CT scan and 3-dimensional echocardiography to produce a heart disease model by 3D printing. The best occlusion program was determined through simulation testing on the model. Percutaneous device closure of multiple ASD was performed following the predetermined program guided by TTE. Clinical follow-up including electrocardiogram and TTE was arranged at 1 month after the procedure. Results: The trans-catheter procedure was successful in all 21 patients using a single atrial septal occluder. Mild residual shunt was found in 5 patients in the immediate postoperative period and disappeared in 3 of them during postoperative follow-up. There was no death, vascular damage, arrhythmia, device migration, thromboembolism, or valvular dysfunction during the follow-up period. Conclusion: The use of a 3D-printed heart model provides a useful reference for trans-catheter device closure of multiple ASD achieved through ultrasound-guided intervention technique, which appears to be safe and feasible with good outcomes at short-term follow-up.

  16. Multiple Imputation of a Randomly Censored Covariate Improves Logistic Regression Analysis.

    PubMed

    Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A

    2016-01-01

    Randomly censored covariates arise frequently in epidemiologic studies. The most commonly used methods, including complete case and single imputation or substitution, suffer from inefficiency and bias. They make strong parametric assumptions or they consider limit of detection censoring only. We employ multiple imputation, in conjunction with semi-parametric modeling of the censored covariate, to overcome these shortcomings and to facilitate robust estimation. We develop a multiple imputation approach for randomly censored covariates within the framework of a logistic regression model. We use the non-parametric estimate of the covariate distribution or the semiparametric Cox model estimate in the presence of additional covariates in the model. We evaluate this procedure in simulations, and compare its operating characteristics to those from the complete case analysis and a survival regression approach. We apply the procedures to an Alzheimer's study of the association between amyloid positivity and maternal age of onset of dementia. Multiple imputation achieves lower standard errors and higher power than the complete case approach under heavy and moderate censoring and is comparable under light censoring. The survival regression approach achieves the highest power among all procedures, but does not produce interpretable estimates of association. Multiple imputation offers a favorable alternative to complete case analysis and ad hoc substitution methods in the presence of randomly censored covariates within the framework of logistic regression.
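
    The workflow, impute the censored covariate several times, fit the logistic regression to each completed data set, and pool with Rubin's rules, can be sketched as follows in Python. The data are simulated, and the crude hot-deck draw below stands in for the paper's nonparametric/semiparametric (Kaplan-Meier or Cox-based) imputation distribution.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      n = 400
      age_onset = rng.normal(70, 8, n)                 # true (possibly unobserved) covariate
      censor = rng.normal(72, 8, n)
      observed = np.minimum(age_onset, censor)
      is_censored = age_onset > censor
      y = rng.binomial(1, 1 / (1 + np.exp(-0.05 * (age_onset - 70))))   # binary outcome

      def impute_once(obs, cens):
          """Draw each censored value from the uncensored values above its censoring point
          (a crude stand-in for a survival-distribution-based draw)."""
          out = obs.copy()
          uncensored = obs[~cens]
          for i in np.where(cens)[0]:
              pool = uncensored[uncensored > obs[i]]
              out[i] = rng.choice(pool) if len(pool) else obs[i]
          return out

      M = 20
      betas, variances = [], []
      for _ in range(M):
          x = impute_once(observed, is_censored)
          fit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
          betas.append(fit.params[1])
          variances.append(fit.bse[1] ** 2)

      # Rubin's rules: pooled estimate and total variance
      beta_bar = np.mean(betas)
      total_var = np.mean(variances) + (1 + 1 / M) * np.var(betas, ddof=1)
      print(beta_bar, np.sqrt(total_var))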

  17. Building effective working relationships across culturally and ethnically diverse communities.

    PubMed

    Hosley, Cheryl A; Gensheimer, Linda; Yang, Mai

    2003-01-01

    Amherst H. Wilder Foundation's Social Adjustment Program for Southeast Asians is implementing two collaborative, best practice, mental health and substance abuse prevention service models in Minnesota. It faced several issues in effectively bridging multiple cultural groups, including building a diverse collaborative team, involving families and youth, reconciling cultural variation in meeting styles, and making best practice models culturally appropriate. Researchers and program staff used multiple strategies to address these challenges and build successful partnerships. Through shared goals, flexibility, and a willingness to explore and address challenges, collaboratives can promote stronger relationships across cultural communities and improve their service delivery systems.

  18. "Every Child Ready": Exposure to a Comprehensive Instructional Model Improves Students' Growth Trajectories in Multiple Early Learning Domains

    ERIC Educational Resources Information Center

    Carlson, Abby G.; Curby, Timothy W.; Brown, Chavaughn A.; Truong, Felicia R.

    2017-01-01

    The current study investigates the impact of Every Child Ready (ECR), a comprehensive instructional model that includes: "What to teach, how to teach and how to know instruction is effective." The ECR instructional model is designed to provide high quality instruction to children via a play-based, thematic curriculum. Participants…

  19. Introductory Biology Students’ Conceptual Models and Explanations of the Origin of Variation

    PubMed Central

    Shaw, Neil; Momsen, Jennifer; Reinagel, Adam; Le, Paul; Taqieddin, Ranya; Long, Tammy

    2014-01-01

    Mutation is the key molecular mechanism generating phenotypic variation, which is the basis for evolution. In an introductory biology course, we used a model-based pedagogy that enabled students to integrate their understanding of genetics and evolution within multiple case studies. We used student-generated conceptual models to assess understanding of the origin of variation. By midterm, only a small percentage of students articulated complete and accurate representations of the origin of variation in their models. Targeted feedback was offered through activities requiring students to critically evaluate peers’ models. At semester's end, a substantial proportion of students significantly improved their representation of how variation arises (though one-third still did not include mutation in their models). Students’ written explanations of the origin of variation were mostly consistent with their models, although less effective than models in conveying mechanistic reasoning. This study contributes evidence that articulating the genetic origin of variation is particularly challenging for learners and may require multiple cycles of instruction, assessment, and feedback. To support meaningful learning of the origin of variation, we advocate instruction that explicitly integrates multiple scales of biological organization, assessment that promotes and reveals mechanistic and causal reasoning, and practice with explanatory models with formative feedback. PMID:25185235

  20. Evidence synthesis to inform model-based cost-effectiveness evaluations of diagnostic tests: a methodological review of health technology assessments.

    PubMed

    Shinkins, Bethany; Yang, Yaling; Abel, Lucy; Fanshawe, Thomas R

    2017-04-14

    Evaluations of diagnostic tests are challenging because of the indirect nature of their impact on patient outcomes. Model-based health economic evaluations of tests allow different types of evidence from various sources to be incorporated and enable cost-effectiveness estimates to be made beyond the duration of available study data. To parameterize a health-economic model fully, all the ways a test impacts on patient health must be quantified, including but not limited to diagnostic test accuracy. We assessed all UK NIHR HTA reports published May 2009-July 2015. Reports were included if they evaluated a diagnostic test, included a model-based health economic evaluation and included a systematic review and meta-analysis of test accuracy. From each eligible report we extracted information on the following topics: 1) what evidence aside from test accuracy was searched for and synthesised, 2) which methods were used to synthesise test accuracy evidence and how did the results inform the economic model, 3) how/whether threshold effects were explored, 4) how the potential dependency between multiple tests in a pathway was accounted for, and 5) for evaluations of tests targeted at the primary care setting, how evidence from differing healthcare settings was incorporated. The bivariate or HSROC model was implemented in 20/22 reports that met all inclusion criteria. Test accuracy data for health economic modelling was obtained from meta-analyses completely in four reports, partially in fourteen reports and not at all in four reports. Only 2/7 reports that used a quantitative test gave clear threshold recommendations. All 22 reports explored the effect of uncertainty in accuracy parameters but most of those that used multiple tests did not allow for dependence between test results. 7/22 tests were potentially suitable for primary care but the majority found limited evidence on test accuracy in primary care settings. The uptake of appropriate meta-analysis methods for synthesising evidence on diagnostic test accuracy in UK NIHR HTAs has improved in recent years. Future research should focus on other evidence requirements for cost-effectiveness assessment, threshold effects for quantitative tests and the impact of multiple diagnostic tests.

  1. A Comparison between Multiple Regression Models and CUN-BAE Equation to Predict Body Fat in Adults

    PubMed Central

    Fuster-Parra, Pilar; Bennasar-Veny, Miquel; Tauler, Pedro; Yañez, Aina; López-González, Angel A.; Aguiló, Antoni

    2015-01-01

    Background Because the accurate measurement of body fat (BF) is difficult, several prediction equations have been proposed. The aim of this study was to compare different multiple regression models to predict BF, including the recently reported CUN-BAE equation. Methods Multiple regression models using body mass index (BMI) and body adiposity index (BAI) as predictors of BF were compared. These models were also compared with the CUN-BAE equation. For all the analyses, a sample including all the participants and another one including only the overweight and obese subjects were considered. The BF reference measure was made using Bioelectrical Impedance Analysis. Results The simplest models, including only BMI or BAI as independent variables, showed that BAI is a better predictor of BF. However, adding the variable sex to both models made BMI a better predictor than the BAI. For both the whole group of participants and the group of overweight and obese participants, using simple models (BMI, age and sex as variables) allowed obtaining similar correlations with BF as when the more complex CUN-BAE was used (ρ = 0.87 vs. ρ = 0.86 for the whole sample and ρ = 0.88 vs. ρ = 0.89 for overweight and obese subjects, the second value being the one for CUN-BAE). Conclusions There are simpler models than the CUN-BAE equation that fit BF as well as CUN-BAE does. Therefore, it could be considered that CUN-BAE overfits. Using a simple linear regression model, the BAI, as the only variable, predicts BF better than BMI. However, when the sex variable is introduced, BMI becomes the indicator of choice to predict BF. PMID:25821960

  2. A comparison between multiple regression models and CUN-BAE equation to predict body fat in adults.

    PubMed

    Fuster-Parra, Pilar; Bennasar-Veny, Miquel; Tauler, Pedro; Yañez, Aina; López-González, Angel A; Aguiló, Antoni

    2015-01-01

    Because the accurate measurement of body fat (BF) is difficult, several prediction equations have been proposed. The aim of this study was to compare different multiple regression models to predict BF, including the recently reported CUN-BAE equation. Multiple regression models using body mass index (BMI) and body adiposity index (BAI) as predictors of BF were compared. These models were also compared with the CUN-BAE equation. For all the analyses, a sample including all the participants and another one including only the overweight and obese subjects were considered. The BF reference measure was made using Bioelectrical Impedance Analysis. The simplest models, including only BMI or BAI as independent variables, showed that BAI is a better predictor of BF. However, adding the variable sex to both models made BMI a better predictor than the BAI. For both the whole group of participants and the group of overweight and obese participants, using simple models (BMI, age and sex as variables) allowed obtaining similar correlations with BF as when the more complex CUN-BAE was used (ρ = 0.87 vs. ρ = 0.86 for the whole sample and ρ = 0.88 vs. ρ = 0.89 for overweight and obese subjects, the second value being the one for CUN-BAE). There are simpler models than the CUN-BAE equation that fit BF as well as CUN-BAE does. Therefore, it could be considered that CUN-BAE overfits. Using a simple linear regression model, the BAI, as the only variable, predicts BF better than BMI. However, when the sex variable is introduced, BMI becomes the indicator of choice to predict BF.
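
    The model comparison described in the two records above amounts to fitting a handful of linear regressions and comparing Spearman correlations between fitted and measured BF. The Python sketch below reproduces that workflow on simulated data; the coefficients, sample size, and variable distributions are hypothetical and are not the study's data.

      import numpy as np
      import statsmodels.api as sm
      from scipy.stats import spearmanr

      # Hypothetical data standing in for the study's BIA-measured body fat sample
      rng = np.random.default_rng(5)
      n = 500
      sex = rng.integers(0, 2, n)                      # 0 = male, 1 = female
      age = rng.uniform(18, 70, n)
      bmi = rng.normal(27, 4, n)
      bai = rng.normal(28, 5, n)
      bf = 1.2 * bmi + 0.23 * age - 10.8 * (1 - sex) - 5.4 + rng.normal(0, 3, n)

      def fit_and_correlate(y, X):
          """Fit OLS and report the Spearman rho between fitted and observed BF."""
          fit = sm.OLS(y, sm.add_constant(X)).fit()
          return spearmanr(fit.fittedvalues, y)[0]

      print("BAI only:        ", fit_and_correlate(bf, bai))
      print("BMI only:        ", fit_and_correlate(bf, bmi))
      print("BMI + age + sex: ", fit_and_correlate(bf, np.column_stack([bmi, age, sex])))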

  3. Motion control of musculoskeletal systems with redundancy.

    PubMed

    Park, Hyunjoo; Durand, Dominique M

    2008-12-01

    Motion control of musculoskeletal systems for functional electrical stimulation (FES) is a challenging problem due to the inherent complexity of the systems. These include being highly nonlinear, strongly coupled, time-varying, time-delayed, and redundant. The redundancy in particular makes it difficult to find an inverse model of the system for control purposes. We have developed a control system for multiple input multiple output (MIMO) redundant musculoskeletal systems with little prior information. The proposed method separates the steady-state properties from the dynamic properties. The dynamic control uses a steady-state inverse model and is implemented with both a PID controller for disturbance rejection and an artificial neural network (ANN) feedforward controller for fast trajectory tracking. A mechanism to control the sum of the muscle excitation levels is also included. To test the performance of the proposed control system, a two degree of freedom ankle-subtalar joint model with eight muscles was used. The simulation results show that separation of steady-state and dynamic control allow small output tracking errors for different reference trajectories such as pseudo-step, sinusoidal and filtered random signals. The proposed control method also demonstrated robustness against system parameter and controller parameter variations. A possible application of this control algorithm is FES control using multiple contact cuff electrodes where mathematical modeling is not feasible and the redundancy makes the control of dynamic movement difficult.
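
    The split between a steady-state inverse (feedforward) and a PID loop (feedback) can be illustrated on a toy single-joint plant, as in the Python sketch below. The first-order plant, gains, and reference are all invented; the paper's system is a redundant multi-muscle model with an ANN feedforward rather than an exact inverse.

      import numpy as np

      class PID:
          def __init__(self, kp, ki, kd, dt):
              self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
              self.integral, self.prev_err = 0.0, 0.0
          def step(self, err):
              self.integral += err * self.dt
              deriv = (err - self.prev_err) / self.dt
              self.prev_err = err
              return self.kp * err + self.ki * self.integral + self.kd * deriv

      # Toy first-order "joint" plant: angle' = -a*angle + b*u
      a, b, dt = 2.0, 4.0, 0.01
      inverse_steady_state = lambda target: a * target / b   # feedforward from the steady-state inverse
      pid = PID(kp=8.0, ki=2.0, kd=0.1, dt=dt)

      angle = 0.0
      for k in range(500):
          target = 0.3 * np.sin(2 * np.pi * 0.5 * k * dt)    # sinusoidal reference trajectory
          u = inverse_steady_state(target) + pid.step(target - angle)
          angle += dt * (-a * angle + b * u)
      print("final tracking error:", target - angle)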

  4. Preclinical studies in support of defibrotide for the treatment of multiple myeloma and other neoplasias.

    PubMed

    Mitsiades, Constantine S; Rouleau, Cecile; Echart, Cinara; Menon, Krishna; Teicher, Beverly; Distaso, Maria; Palumbo, Antonio; Boccadoro, Mario; Anderson, Kenneth C; Iacobelli, Massimo; Richardson, Paul G

    2009-02-15

    Defibrotide, an orally bioavailable polydisperse oligonucleotide, has promising activity in hepatic veno-occlusive disease, a stem cell transplantation-related toxicity characterized by microangiopathy. The antithrombotic properties of defibrotide and its minimal hemorrhagic risk could serve for treatment of cancer-associated thrombotic complications. Given its cytoprotective effect on endothelium, we investigated whether defibrotide protects tumor cells from cytotoxic antitumor agents. Further, given its antiadhesive properties, we evaluated whether defibrotide modulates the protection conferred to multiple myeloma cells by bone marrow stromal cells. Defibrotide lacks significant single-agent in vitro cytotoxicity on multiple myeloma or solid tumor cells and does not attenuate their in vitro response to dexamethasone, bortezomib, immunomodulatory thalidomide derivatives, and conventional chemotherapeutics, including melphalan and cyclophosphamide. Importantly, defibrotide enhances in vivo chemosensitivity of multiple myeloma and mammary carcinoma xenografts in animal models. In cocultures of multiple myeloma cells with bone marrow stromal cells in vitro, defibrotide enhances the multiple myeloma cell sensitivity to melphalan and dexamethasone, and decreases multiple myeloma-bone marrow stromal cell adhesion and its sequelae, including nuclear factor-kappaB activation in multiple myeloma and bone marrow stromal cells, and associated cytokine production. Moreover, defibrotide inhibits expression and/or function of key mediators of multiple myeloma interaction with bone marrow stromal cell and endothelium, including heparanase, angiogenic cytokines, and adhesion molecules. Defibrotide's in vivo chemosensitizing properties and lack of direct in vitro activity against tumor cells suggest that it favorably modulates antitumor interactions between bone marrow stromal cells and endothelia in the tumor microenvironment. These data support clinical studies of defibrotide in combination with conventional and novel therapies to potentially improve patient outcome in multiple myeloma and other malignancies.

  5. Improving ecosystem-scale modeling of evapotranspiration using ecological mechanisms that account for compensatory responses following disturbance

    NASA Astrophysics Data System (ADS)

    Millar, David J.; Ewers, Brent E.; Mackay, D. Scott; Peckham, Scott; Reed, David E.; Sekoni, Adewale

    2017-09-01

    Mountain pine beetle outbreaks in western North America have led to extensive forest mortality, justifiably generating interest in improving our understanding of how this type of ecological disturbance affects hydrological cycles. While observational studies and simulations have been used to elucidate the effects of mountain pine beetle mortality on hydrological fluxes, an ecologically mechanistic model of forest evapotranspiration (ET) evaluated against field data has yet to be developed. In this work, we use the Terrestrial Regional Ecosystem Exchange Simulator (TREES) to incorporate the ecohydrological impacts of mountain pine beetle disturbance on ET for a lodgepole pine-dominated forest equipped with an eddy covariance tower. An existing degree-day model was incorporated that predicted the life cycle of mountain pine beetles, along with an empirically derived submodel that allowed sap flux to decline as a function of temperature-dependent blue stain fungal growth. The eddy covariance footprint was divided into multiple cohorts for multiple growing seasons, including representations of recently attacked trees and the compensatory effects of regenerating understory, using two different spatial scaling methods. Our results showed that using a multiple cohort approach matched eddy covariance-measured ecosystem-scale ET fluxes well, and showed improved performance compared to model simulations assuming a binary framework of only areas of live and dead overstory. Cumulative growing season ecosystem-scale ET fluxes were 8-29% greater using the multicohort approach during years in which beetle attacks occurred, highlighting the importance of including compensatory ecological mechanisms in ET models.
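
    A degree-day life-cycle submodel of the kind mentioned here accumulates thermal time above a base temperature and advances developmental stage at thresholds. The Python sketch below shows the bookkeeping only; the base temperature, stage thresholds, and temperature trace are illustrative, not the published parameter values.

      import numpy as np

      def degree_days(daily_mean_temp, base_temp=5.6):
          """Accumulate growing degree-days above a base temperature
          (the base value here is illustrative, not the published threshold)."""
          return np.cumsum(np.maximum(daily_mean_temp - base_temp, 0.0))

      # Hypothetical summer temperature trace; development advances once accumulated
      # degree-days pass illustrative stage thresholds.
      temps = 10 + 8 * np.sin(np.linspace(0, np.pi, 120))
      dd = degree_days(temps)
      thresholds = {"egg": 100, "larva": 400, "pupa": 700, "adult": 900}
      for stage, need in thresholds.items():
          day = int(np.argmax(dd >= need)) if dd[-1] >= need else None
          print(stage, "reached on day", day)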

  6. High Spectral Resolution Lidar Measurements of Multiple Scattering

    NASA Technical Reports Server (NTRS)

    Eloranta, E. W.; Piironen, P.

    1996-01-01

    The University of Wisconsin High Spectral Resolution Lidar (HSRL) provides unambiguous measurements of backscatter cross section, backscatter phase function, depolarization, and optical depth. This is accomplished by dividing the lidar return into separate particulate and molecular contributions. The molecular return is then used as a calibration target. We have modified the HSRL to use an I2 molecular absorption filter to separate aerosol and molecular signals. This allows measurement in dense clouds. Useful profiles extend above the cloud base until the two way optical depth reaches values between 5 and 6; beyond this, photon counting errors become large. In order to observe multiple scattering, the HSRL includes a channel which records the combined aerosol and molecular lidar return simultaneously with the spectrometer channel measurements of optical properties. This paper describes HSRL multiple scattering measurements from both water and ice clouds. These include signal strengths and depolarizations as a function of receiver field of view. All observations include profiles of extinction and backscatter cross sections. Measurements are also compared to predictions of a multiple scattering model based on small angle approximations.
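
    The separation of aerosol and molecular contributions rests on two channels with different, known sensitivities to each component, which can be inverted as a small linear system. The Python sketch below shows that inversion with made-up calibration constants and signal counts; real HSRL processing also handles range dependence, photon-counting noise, and the I2 filter's exact transmission.

      import numpy as np

      # Two channels: the combined channel sees aerosol + molecular returns; the I2-filtered
      # channel suppresses the narrow aerosol line and passes a fraction of each component.
      # The channel efficiencies below are illustrative calibration constants only.
      eff = np.array([[1.00, 1.00],     # combined channel: aerosol, molecular
                      [0.02, 0.65]])    # I2 channel: residual aerosol, molecular passband

      def separate(signal_combined, signal_i2):
          """Invert the 2x2 calibration matrix to recover aerosol and molecular signals."""
          return np.linalg.solve(eff, np.array([signal_combined, signal_i2]))

      aerosol, molecular = separate(5.0e4, 1.3e4)
      backscatter_ratio = (aerosol + molecular) / molecular
      print(aerosol, molecular, backscatter_ratio)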

  7. Institutional Transformation Version 2.5 Modeling and Planning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villa, Daniel; Mizner, Jack H.; Passell, Howard D.

    Reducing the resource consumption and emissions of large institutions is an important step toward a sustainable future. Sandia National Laboratories' (SNL) Institutional Transformation (IX) project vision is to provide tools that enable planners to make well-informed decisions concerning sustainability, resource conservation, and emissions reduction across multiple sectors. The building sector has been the primary focus so far because it is the largest consumer of resources for SNL. The IX building module allows users to define the evolution of many buildings over time. The module has been created so that it can be generally applied to any set of DOE-2 (http://doe2.com) building models that have been altered to include parameters and expressions required by energy conservation measures (ECM). Once building models have been appropriately prepared, they are checked into a Microsoft Access (r) database. Each building can be represented by many models. This enables the capability to keep a continuous record of models in the past, which are replaced with different models as changes occur to the building. In addition to this, the building module has the capability to apply climate scenarios through applying different weather files to each simulation year. Once the database has been configured, a user interface in Microsoft Excel (r) is used to create scenarios with one or more ECMs. The capability to include central utility buildings (CUBs) that service more than one building with chilled water has been developed. A utility has been created that joins multiple building models into a single model. After using the utility, several manual steps are required to complete the process. Once this CUB model has been created, the individual contributions of each building are still tracked through meters. Currently, 120 building models from SNL's New Mexico and California campuses have been created. This includes all buildings at SNL greater than 10,000 sq. ft., representing 80% of the energy consumption at SNL. SNL has been able to leverage this model to estimate energy savings potential of many competing ECMs. The results helped high level decision makers to create energy reduction goals for SNL. These resources also have multiple applications for use of the models as individual buildings. In addition to the building module, a solar module built in Powersim Studio (r) allows planners to evaluate the potential photovoltaic (PV) energy generation potential for flat plate PV, concentrating solar PV, and concentration solar thermal technologies at multiple sites across SNL's New Mexico campus. Development of the IX modeling framework was a unique collaborative effort among planners and engineers in SNL's facilities division; scientists and computer modelers in SNL's research and development division; faculty from Arizona State University; and energy modelers from Bridger and Paxton Consulting Engineers Incorporated.

  8. Capacity Expansion Modeling for Storage Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Elaine; Stoll, Brady; Mai, Trieu

    2017-04-03

    The Resource Planning Model (RPM) is a capacity expansion model designed for regional power systems and high levels of renewable generation. Recent extensions capture value-stacking for storage technologies, including batteries and concentrating solar power with storage. After estimating per-unit capacity value and curtailment reduction potential, RPM co-optimizes investment decisions and reduced-form dispatch, accounting for planning reserves; energy value, including arbitrage and curtailment reduction; and three types of operating reserves. Multiple technology cost scenarios are analyzed to determine level of deployment in the Western Interconnection under various conditions.

  9. Factors Influencing Amount of Weekly Exercise Time in Colorectal Cancer Survivors.

    PubMed

    Chou, Yun-Jen; Lai, Yeur-Hur; Lin, Been-Ren; Liang, Jin-Tung; Shun, Shiow-Ching

    Performing regular exercise of at least 150 minutes weekly has benefits for colorectal cancer survivors. However, barriers inhibit these survivors from performing regular exercise. The aim of this study was to explore exercise behaviors and significant factors influencing weekly exercise time of more than 150 minutes in colorectal cancer survivors. A cross-sectional study design was used to recruit participants in Taiwan. Guided by the ecological model of health behavior, exercise barriers were assessed including intrapersonal, interpersonal, and environment-related barriers. A multiple logistic regression was used to explore the factors associated with the amount of weekly exercise. Among 321 survivors, 57.0% of them had weekly exercise times of more than 150 minutes. The results identified multiple levels of significant factors related to weekly exercise times including intrapersonal factors (occupational status, functional status, pain, interest in exercise, and beliefs about the importance of exercise) and exercise barriers related to environmental factors (lack of time and bad weather). No interpersonal factors were found to be significant. Colorectal cancer survivors experienced low levels of physical and psychological distress. Multiple levels of significant factors related to exercise time including intrapersonal factors as well as exercise barriers related to environmental factors should be considered. Healthcare providers should discuss with their patients how to perform exercise programs; the discussion should address multiple levels of the ecological model such as any pain problems, functional status, employment status, and time limitations, as well as community environment.

  10. Data Mining Methods Applied to Flight Operations Quality Assurance Data: A Comparison to Standard Statistical Methods

    NASA Technical Reports Server (NTRS)

    Stolzer, Alan J.; Halford, Carl

    2007-01-01

    In a previous study, multiple regression techniques were applied to Flight Operations Quality Assurance-derived data to develop parsimonious model(s) for fuel consumption on the Boeing 757 airplane. The present study examined several data mining algorithms, including neural networks, on the fuel consumption problem and compared them to the multiple regression results obtained earlier. Using regression methods, parsimonious models were obtained that explained approximately 85% of the variation in fuel flow. In general, data mining methods were more effective in predicting fuel consumption. Classification and Regression Tree methods reported correlation coefficients of .91 to .92, and General Linear Models and Multilayer Perceptron neural networks reported correlation coefficients of about .99. These data mining models show great promise for use in further examining large FOQA databases for operational and safety improvements.
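
    The comparison described, multiple regression versus tree-based and neural-network regressors on the same prediction task, follows the pattern sketched below in Python. The synthetic features and target stand in for the FOQA-derived parameters, so the scores mean nothing beyond illustrating the workflow.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.tree import DecisionTreeRegressor
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import cross_val_score

      # Hypothetical flight parameters standing in for FOQA-derived features
      rng = np.random.default_rng(6)
      n = 2000
      X = rng.uniform(size=(n, 5))                     # e.g. altitude, speed, weight, temp, thrust
      fuel_flow = 3 + 2*X[:, 0] + X[:, 1]**2 + np.sin(3*X[:, 2]) + rng.normal(0, 0.1, n)

      for name, model in [("linear regression", LinearRegression()),
                          ("regression tree", DecisionTreeRegressor(max_depth=6)),
                          ("MLP", MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000))]:
          r2 = cross_val_score(model, X, fuel_flow, cv=5, scoring="r2").mean()
          print(f"{name:18s} cross-validated R^2 = {r2:.3f}")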

  11. Detection of person borne IEDs using multiple cooperative sensors

    NASA Astrophysics Data System (ADS)

    MacIntosh, Scott; Deming, Ross; Hansen, Thorkild; Kishan, Neel; Tang, Ling; Shea, Jing; Lang, Stephen

    2011-06-01

    The use of multiple cooperative sensors for the detection of person-borne IEDs is investigated. The purpose of the effort is to evaluate the performance benefits of adding multiple sensor data streams into an aided threat detection algorithm, and to provide a quantitative analysis of which sensor data combinations improve overall detection performance. Testing includes both mannequins and human subjects with simulated suicide bomb devices of various configurations, materials, sizes and metal content. Aided threat recognition algorithms are being developed to test the detection performance of individual sensors against combined, fused sensor inputs. Sensors investigated include active and passive millimeter wave imaging systems, passive infrared, 3-D profiling sensors and acoustic imaging. The paper describes the experimental set-up and outlines the methodology behind a decision fusion algorithm based on the concept of a "body model".

  12. Mark-recapture with multiple, non-invasive marks.

    PubMed

    Bonner, Simon J; Holmberg, Jason

    2013-09-01

    Non-invasive marks, including pigmentation patterns, acquired scars, and genetic markers, are often used to identify individuals in mark-recapture experiments. If animals in a population can be identified from multiple, non-invasive marks then some individuals may be counted twice in the observed data. Analyzing the observed histories without accounting for these errors will provide incorrect inference about the population dynamics. Previous approaches to this problem include modeling data from only one mark and combining estimators obtained from each mark separately assuming that they are independent. Motivated by the analysis of data from the ECOCEAN online whale shark (Rhincodon typus) catalog, we describe a Bayesian method to analyze data from multiple, non-invasive marks that is based on the latent-multinomial model of Link et al. (2010, Biometrics 66, 178-185). Further to this, we describe a simplification of the Markov chain Monte Carlo algorithm of Link et al. (2010, Biometrics 66, 178-185) that leads to more efficient computation. We present results from the analysis of the ECOCEAN whale shark data and from simulation studies comparing our method with the previous approaches. © 2013, The International Biometric Society.

  13. Software Framework for Advanced Power Plant Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Widmann; Sorin Munteanu; Aseem Jain

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  14. Multiple-Input Subject-Specific Modeling of Plasma Glucose Concentration for Feedforward Control.

    PubMed

    Kotz, Kaylee; Cinar, Ali; Mei, Yong; Roggendorf, Amy; Littlejohn, Elizabeth; Quinn, Laurie; Rollins, Derrick K

    2014-11-26

    The ability to accurately develop subject-specific input causation models for blood glucose concentration (BGC) for large input sets can have a significant impact on tightening control for insulin-dependent diabetes. More specifically, for Type 1 diabetics (T1Ds), it can lead to an effective artificial pancreas (i.e., an automatic control system that delivers exogenous insulin) under extreme changes in critical disturbances. These disturbances include food consumption, activity variations, and physiological stress changes. Thus, this paper presents a free-living, outpatient, multiple-input modeling method for BGC with strong causation attributes that is stable and guards against overfitting to provide an effective modeling approach for feedforward control (FFC). This approach is a Wiener block-oriented methodology, which has unique attributes for meeting critical requirements for effective, long-term FFC.

  15. Dinucleotide controlled null models for comparative RNA gene prediction.

    PubMed

    Gesell, Tanja; Washietl, Stefan

    2008-05-27

    Comparative prediction of RNA structures can be used to identify functional noncoding RNAs in genomic screens. It was shown recently by Babak et al. [BMC Bioinformatics. 8:33] that RNA gene prediction programs can be biased by the genomic dinucleotide content, in particular those programs using a thermodynamic folding model including stacking energies. As a consequence, there is need for dinucleotide-preserving control strategies to assess the significance of such predictions. While there have been randomization algorithms for single sequences for many years, the problem has remained challenging for multiple alignments and there is currently no algorithm available. We present a program called SISSIz that simulates multiple alignments of a given average dinucleotide content. Meeting additional requirements of an accurate null model, the randomized alignments are on average of the same sequence diversity and preserve local conservation and gap patterns. We make use of a phylogenetic substitution model that includes overlapping dependencies and site-specific rates. Using fast heuristics and a distance based approach, a tree is estimated under this model which is used to guide the simulations. The new algorithm is tested on vertebrate genomic alignments and the effect on RNA structure predictions is studied. In addition, we directly combined the new null model with the RNAalifold consensus folding algorithm giving a new variant of a thermodynamic structure based RNA gene finding program that is not biased by the dinucleotide content. SISSIz implements an efficient algorithm to randomize multiple alignments preserving dinucleotide content. It can be used to get more accurate estimates of false positive rates of existing programs, to produce negative controls for the training of machine learning based programs, or as standalone RNA gene finding program. Other applications in comparative genomics that require randomization of multiple alignments can be considered. SISSIz is available as open source C code that can be compiled for every major platform and downloaded here: http://sourceforge.net/projects/sissiz.
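
    A toy illustration of the dinucleotide-control idea only, not SISSIz: a single random sequence is generated whose expected dinucleotide content matches a reference sequence via a first-order Markov chain. SISSIz itself randomizes whole alignments under a phylogenetic substitution model while preserving conservation and gap patterns.

    ```python
    # Generate a random sequence with (approximately) the same dinucleotide
    # content as a reference, via a first-order Markov chain over nucleotides.
    import random
    from collections import defaultdict

    def dinucleotide_chain(reference, length, seed=0):
        rng = random.Random(seed)
        counts = defaultdict(lambda: defaultdict(int))
        for a, b in zip(reference, reference[1:]):
            counts[a][b] += 1
        seq = [rng.choice(reference)]
        for _ in range(length - 1):
            successors = counts[seq[-1]]
            if not successors:                     # unseen context: restart
                seq.append(rng.choice(reference))
                continue
            bases, weights = zip(*successors.items())
            seq.append(rng.choices(bases, weights=weights)[0])
        return "".join(seq)

    print(dinucleotide_chain("ACGTGCGCGTATATCGCG", 30))
    ```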

  16. Stochastic analog neutron transport with TRIPOLI-4 and FREYA: Bayesian uncertainty quantification for neutron multiplicity counting

    DOE PAGES

    Verbeke, J. M.; Petit, O.

    2016-06-01

    From nuclear safeguards to homeland security applications, the need for the better modeling of nuclear interactions has grown over the past decades. Current Monte Carlo radiation transport codes compute average quantities with great accuracy and performance; however, performance and averaging come at the price of limited interaction-by-interaction modeling. These codes often lack the capability of modeling interactions exactly: for a given collision, energy is not conserved, energies of emitted particles are uncorrelated, and multiplicities of prompt fission neutrons and photons are uncorrelated. Many modern applications require more exclusive quantities than averages, such as the fluctuations in certain observables (e.g., the neutron multiplicity) and correlations between neutrons and photons. In an effort to meet this need, the radiation transport Monte Carlo code TRIPOLI-4® was modified to provide a specific mode that models nuclear interactions in a full analog way, replicating as much as possible the underlying physical process. Furthermore, the computational model FREYA (Fission Reaction Event Yield Algorithm) was coupled with TRIPOLI-4 to model complete fission events. As a result, FREYA automatically includes fluctuations as well as correlations resulting from conservation of energy and momentum.

  17. Software forecasting as it is really done: A study of JPL software engineers

    NASA Technical Reports Server (NTRS)

    Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.

    1993-01-01

    This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol Analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting life cycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include: the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.

  18. Note on Professor Sizer's Paper.

    ERIC Educational Resources Information Center

    Balderston, Frederick E.

    1979-01-01

    Issues suggested by John Sizer's paper, an overview of the assessment of institutional performance, include: the efficient-frontier approach, multiple-criterion decision-making models, performance analysis approached as path analysis, and assessment of academic quality. (JMD)

  19. Recent Updates to the System Advisor Model (SAM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiOrio, Nicholas A

    The System Advisor Model (SAM) is a mature suite of techno-economic models for many renewable energy technologies that can be downloaded for free as a desktop application or software development kit. SAM is used for system-level modeling, including generating performance projections. One recent addition is the release of the code as an open source project on GitHub. Other additions that will be covered include the ability to download data directly into SAM from the National Solar Radiation Database (NSRDB) and updates to a user-interface macro that assists with PV system sizing. A brief update on SAM's battery model and its integration with the detailed photovoltaic model will also be discussed. Finally, an outline of planned work for the next year will be presented, including the addition of a bifacial model, support for multiple MPPT inputs for detailed inverter modeling, and the addition of a model for inverter thermal behavior.

  20. Parallel updating and weighting of multiple spatial maps for visual stability during whole body motion

    PubMed Central

    Medendorp, W. P.

    2015-01-01

    It is known that the brain uses multiple reference frames to code spatial information, including eye-centered and body-centered frames. When we move our body in space, these internal representations are no longer in register with external space, unless they are actively updated. Whether the brain updates multiple spatial representations in parallel, or whether it restricts its updating mechanisms to a single reference frame from which other representations are constructed, remains an open question. We developed an optimal integration model to simulate the updating of visual space across body motion in multiple or single reference frames. To test this model, we designed an experiment in which participants had to remember the location of a briefly presented target while being translated sideways. The behavioral responses were in agreement with a model that uses a combination of eye- and body-centered representations, weighted according to the reliability in which the target location is stored and updated in each reference frame. Our findings suggest that the brain simultaneously updates multiple spatial representations across body motion. Because both representations are kept in sync, they can be optimally combined to provide a more precise estimate of visual locations in space than based on single-frame updating mechanisms. PMID:26490289
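
    A minimal sketch of the reliability-weighted combination idea in one dimension (an assumed form, not the authors' exact model): eye- and body-centered estimates of the remembered target location are fused with inverse-variance weights, so the combined estimate is more precise than either single-frame estimate.

    ```python
    # Inverse-variance weighting of two reference-frame estimates.
    def combine(eye_est, eye_var, body_est, body_var):
        w_eye, w_body = 1.0 / eye_var, 1.0 / body_var
        fused = (w_eye * eye_est + w_body * body_est) / (w_eye + w_body)
        fused_var = 1.0 / (w_eye + w_body)   # smaller than either input variance
        return fused, fused_var

    # Example: updated estimates (cm) after a sideways whole-body translation.
    print(combine(eye_est=12.0, eye_var=4.0, body_est=9.0, body_var=1.0))
    ```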

  1. Subject-specific body segment parameter estimation using 3D photogrammetry with multiple cameras

    PubMed Central

    Morris, Mark; Sellers, William I.

    2015-01-01

    Inertial properties of body segments, such as mass, centre of mass or moments of inertia, are important parameters when studying movements of the human body. However, these quantities are not directly measurable. Current approaches include using regression models, which have limited accuracy; geometric models with lengthy measuring procedures; or acquiring and post-processing MRI scans of participants. We propose a geometric methodology based on 3D photogrammetry using multiple cameras to provide subject-specific body segment parameters while minimizing the interaction time with the participants. A low-cost body scanner was built using multiple cameras and 3D point cloud data generated using structure from motion photogrammetric reconstruction algorithms. The point cloud was manually separated into body segments, and convex hulling applied to each segment to produce the required geometric outlines. The accuracy of the method can be adjusted by choosing the number of subdivisions of the body segments. The body segment parameters of six participants (four male and two female) are presented using the proposed method. The multi-camera photogrammetric approach is expected to be particularly suited for studies including populations for which regression models are not available in the literature and where other geometric techniques or MRI scanning are not applicable due to time or ethical constraints. PMID:25780778

  2. Subject-specific body segment parameter estimation using 3D photogrammetry with multiple cameras.

    PubMed

    Peyer, Kathrin E; Morris, Mark; Sellers, William I

    2015-01-01

    Inertial properties of body segments, such as mass, centre of mass or moments of inertia, are important parameters when studying movements of the human body. However, these quantities are not directly measurable. Current approaches include using regression models, which have limited accuracy; geometric models with lengthy measuring procedures; or acquiring and post-processing MRI scans of participants. We propose a geometric methodology based on 3D photogrammetry using multiple cameras to provide subject-specific body segment parameters while minimizing the interaction time with the participants. A low-cost body scanner was built using multiple cameras and 3D point cloud data generated using structure from motion photogrammetric reconstruction algorithms. The point cloud was manually separated into body segments, and convex hulling applied to each segment to produce the required geometric outlines. The accuracy of the method can be adjusted by choosing the number of subdivisions of the body segments. The body segment parameters of six participants (four male and two female) are presented using the proposed method. The multi-camera photogrammetric approach is expected to be particularly suited for studies including populations for which regression models are not available in the literature and where other geometric techniques or MRI scanning are not applicable due to time or ethical constraints.
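
    A sketch under stated assumptions, not the authors' pipeline: segment volume and mass are estimated from a convex hull fitted to an (here randomly generated) segment point cloud, assuming uniform density; the mean of the hull vertices is used only as a rough stand-in for the true volume centroid.

    ```python
    # Convex-hull-based segment parameters from a 3D point cloud.
    import numpy as np
    from scipy.spatial import ConvexHull

    def segment_parameters(points, density=1000.0):    # density in kg/m^3
        hull = ConvexHull(points)                       # points: (N, 3) in metres
        volume = hull.volume                            # m^3
        mass = density * volume
        centroid = points[hull.vertices].mean(axis=0)   # rough centre estimate
        return volume, mass, centroid

    pts = np.random.default_rng(0).normal(size=(500, 3)) * 0.1   # fake segment
    print(segment_parameters(pts))
    ```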

  3. Suicidal ideation among suburban adolescents: The influence of school bullying and other mediating risk factors.

    PubMed

    Lardier, David T; Barrios, Veronica R; Garcia-Reid, Pauline; Reid, Robert J

    2016-10-01

    Prior research has identified multiple factors that influence suicidal ideation (SI) among bullied youth. The effects of school bullying on SI cannot be considered in isolation. In this study, we examined the influence of school bullying on SI, through a constellation of risks, which include depressive and anxiety symptoms, family conflict, and alcohol, tobacco, and other drug (ATOD) use. We also provide recommendations for therapists working with bullied youth. Our sample consisted of 488 adolescents (ages 10-18 years) from a northern New Jersey, United States suburban community. Students were recruited through the district's physical education and health classes. Students responded to multiple measures, which included family cohesion/conflict, ATOD use, mental health indicators, SI, and school bullying experiences. Following preliminary analyses, several logistic regression models were used to assess the direct influence of bullying on SI, as well as the unique effects of family conflict, depressive and anxiety symptoms, and substance use. In addition, a parallel multiple mediating model with the PROCESS macro in SPSS was used to further assess mediating effects. Logistic regression results indicated that school bullying increased the odds of SI among males and females and that when mediating variables were added to the model, bullying no longer had a significant influence on SI. Overall, these results display that for both males and females, school bullying was a significant contributor to SI. Results from the parallel multiple mediating model further illustrated the mediating effects that family conflict, depression, and ATOD use had between bullying and SI. Some variation was noted based on gender. This study draws attention to the multiple experiences associated with school bullying on SI, and how these results may differ by gender. The results of this study are particularly important for those working directly and indirectly with bullied youth. Therapists that engage bullied youth need to consider the multiple spheres of influence that may increase SI among male and female clients. To holistically and adequately assess SI among bullied youth, therapists must also consider how these mechanisms vary between gender groups.

  4. Towards a Multi-Stakeholder-Driven Model for Excellence in Higher Education Curriculum Development

    ERIC Educational Resources Information Center

    Meyer, M. H.; Bushney, M. J.

    2008-01-01

    A multi-stakeholder-driven model for excellence in higher education curriculum development has been developed. It is based on the assumption that current efforts to curriculum development take place within a framework of limited stakeholder consultation. A total of 18 multiple stakeholders are identified, including learners, alumni, government,…

  5. The Measurement and Cost of Removing Unexplained Gender Differences in Faculty Salaries.

    ERIC Educational Resources Information Center

    Becker, William E.; Toutkoushian, Robert K.

    1995-01-01

    In assessing sex-discrimination suit damages, debate rages over the type and number of variables included in a single-equation model of the salary-determination process. This article considers single- and multiple-equation models, providing 36 different damage calculations. For University of Minnesota data, equalization cost hinges on the…

  6. Fiscal Viability, Conjunctive and Compensatory Models, and Career-Ladder Decisions: An Empirical Investigation.

    ERIC Educational Resources Information Center

    Mehrens, William A.; And Others

    A study was undertaken to explore cost-effective ways of making career ladder teacher evaluation system decisions based on fewer measures, assessing the relationship of observational variables to other data and final decisions, and comparison of compensatory and conjunctive decision models. Data included multiple scores from eight data sources in…

  7. An Empirical Investigation of a Theoretically Based Measure of Perceived Wellness

    ERIC Educational Resources Information Center

    Harari, Marc J.; Waehler, Charles A.; Rogers, James R.

    2005-01-01

    The Perceived Wellness Survey (PWS; T. Adams, 1995; T. Adams, J. Bezner, & M. Steinhardt, 1997) is a recently developed instrument intended to operationalize the comprehensive Perceived Wellness Model (T. Adams, J. Bezner, & M. Steinhardt, 1997), an innovative model that attempts to include the balance of multiple life activities in its evaluation…

  8. A simulation study to determine the attenuation and bias in health risk estimates due to exposure measurement error in bi-pollutant models

    EPA Science Inventory

    To understand the combined health effects of exposure to ambient air pollutant mixtures, it is becoming more common to include multiple pollutants in epidemiologic models. However, the complex spatial and temporal pattern of ambient pollutant concentrations and related exposures ...

  9. A Socioecological Model of Rape Survivors' Decisions to Aid in Case Prosecution

    ERIC Educational Resources Information Center

    Anders, Mary C.; Christopher, F. Scott

    2011-01-01

    The purpose of our study was to identify factors underlying rape survivors' post-assault prosecution decisions by testing a decision model that included the complex relations between the multiple social ecological systems within which rape survivors are embedded. We coded 440 police rape cases for characteristics of the assault and characteristics…

  10. Development and Application of a Human PBPK Model for Bromodichloromethane (BDCM) to Investigate Impacts of Multi-Route Exposure

    EPA Science Inventory

    Due to its presence in water as a volatile disinfection byproduct, BDCM, which is mutagenic and a rodent carcinogen, poses a risk for exposure via multiple routes. We developed a refined human PBPK model for BDCM (including new chemical-specific human parameters) to evaluate the...

  11. Video Modeling for Children and Adolescents with Autism Spectrum Disorder: A Meta-Analysis

    ERIC Educational Resources Information Center

    Thompson, Teresa Lynn

    2014-01-01

    The objective of this research was to conduct a meta-analysis to examine existing research studies on video modeling as an effective teaching tool for children and adolescents diagnosed with Autism Spectrum Disorder (ASD). Study eligibility criteria included (a) single case research design using multiple baselines, alternating treatment designs,…

  12. Effects of multiple scattering and surface albedo on the photochemistry of the troposphere

    NASA Technical Reports Server (NTRS)

    Augustsson, T. R.; Tiwari, S. N.

    1981-01-01

    The effect of treatment of incoming solar radiation on the photochemistry of the troposphere is discussed. A one-dimensional photochemical model of the troposphere containing the species of the nitrogen, oxygen, carbon, hydrogen, and sulfur families was developed. The vertical flux is simulated by use of the parameterized eddy diffusion coefficients. The photochemical model is coupled to a radiative transfer model that calculates the radiation field due to the incoming solar radiation which initiates much of the photochemistry of the troposphere. Vertical profiles of tropospheric species were compared with the Leighton approximation and the radiative transfer, matrix inversion model. The radiative transfer code includes the effects of multiple scattering due to molecules and aerosols, pure absorption, and surface albedo on the transfer of incoming solar radiation. It is indicated that significant differences exist for several key photolysis frequencies and species number density profiles between the Leighton approximation and the profiles generated with the radiative transfer, matrix inversion technique. Most species show enhanced vertical profiles when the more realistic treatment of the incoming solar radiation field is included.

  13. Effects of multiple scattering and surface albedo on the photochemistry of the troposphere. Final report, period ending 30 Nov 1981

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustsson, T.R.; Tiwari, S.N.

    The effect of treatment of incoming solar radiation on the photochemistry of the troposphere is discussed. A one-dimensional photochemical model of the troposphere containing the species of the nitrogen, oxygen, carbon, hydrogen, and sulfur families was developed. The vertical flux is simulated by use of the parameterized eddy diffusion coefficients. The photochemical model is coupled to a radiative transfer model that calculates the radiation field due to the incoming solar radiation which initiates much of the photochemistry of the troposphere. Vertical profiles of tropospheric species were compared with the Leighton approximation and the radiative transfer, matrix inversion model. The radiative transfer code includes the effects of multiple scattering due to molecules and aerosols, pure absorption, and surface albedo on the transfer of incoming solar radiation. It is indicated that significant differences exist for several key photolysis frequencies and species number density profiles between the Leighton approximation and the profiles generated with the radiative transfer, matrix inversion technique. Most species show enhanced vertical profiles when the more realistic treatment of the incoming solar radiation field is included.

  14. Dry wind tunnel system

    NASA Technical Reports Server (NTRS)

    Chen, Ping-Chih (Inventor)

    2013-01-01

    This invention is a ground flutter testing system without a wind tunnel, called the Dry Wind Tunnel (DWT) System. The DWT system consists of Ground Vibration Test (GVT) hardware, multiple-input multiple-output (MIMO) force controller software, and real-time unsteady aerodynamic force generation software that is developed from an aerodynamic reduced-order model (ROM). The ground flutter test using the DWT System operates on a real structural model; therefore, no scaled-down structural model, which is required by the conventional wind tunnel flutter test, is involved. Furthermore, the impact of the structural nonlinearities on the aeroelastic stability can be included automatically. Moreover, the aeroservoelastic characteristics of the aircraft can be easily measured by simply including the flight control system in the loop. In addition, the unsteady aerodynamics generated computationally is interference-free from the wind tunnel walls. Finally, the DWT System can be conveniently and inexpensively carried out as a post-GVT test with the same hardware, only with some possible rearrangement of the shakers and the inclusion of additional sensors.

  15. A systematic review of models used in cost-effectiveness analyses of preventing osteoporotic fractures.

    PubMed

    Si, L; Winzenberg, T M; Palmer, A J

    2014-01-01

    This review examined the evolution of health economic models used in evaluations of clinical approaches to preventing osteoporotic fractures. Models have improved, with medical continuance becoming increasingly recognized as a contributor to health and economic outcomes, as well as advancements in epidemiological data. Model-based health economic evaluation studies are increasingly used to investigate the cost-effectiveness of osteoporotic fracture preventions and treatments. The objective of this study was to carry out a systematic review of the evolution of health economic models used in the evaluation of osteoporotic fracture preventions. Electronic searches within MEDLINE and EMBASE were carried out using a predefined search strategy. Inclusion and exclusion criteria were used to select relevant studies. Reference lists of included studies were searched to identify any potential study that was not captured in our electronic search. Data on country, interventions, type of fracture prevention, evaluation perspective, type of model, time horizon, fracture sites, expressed costs, types of costs included, and effectiveness measurement were extracted. Seventy-four models were described in 104 publications, of which 69% were European. Earlier models focused mainly on hip, vertebral, and wrist fractures, but later models included multiple fracture sites (humerus, pelvis, tibia, and other fractures). Modeling techniques have evolved from simple decision trees, through deterministic Markov processes, to individual patient simulation models accounting for uncertainty in multiple parameters. Treatment continuance has been increasingly taken into account in the models in the last decade. Models have evolved in their complexity and emphasis, with medical continuance becoming increasingly recognized as a contributor to health and economic outcomes. This evolution may be driven in part by the desire to capture all the important differentiating characteristics of medications under scrutiny, as well as the advancement in epidemiological data relevant to osteoporosis fractures.

  16. Multiple linear regression and regression with time series error models in forecasting PM10 concentrations in Peninsular Malaysia.

    PubMed

    Ng, Kar Yong; Awang, Norhashidah

    2018-01-06

    Frequent haze occurrences in Malaysia have made the management of PM10 (particulate matter with aerodynamic diameter less than 10 μm) pollution a critical task. This requires knowledge of factors associated with PM10 variation and good forecasts of PM10 concentrations. Hence, this paper demonstrates the prediction of 1-day-ahead daily average PM10 concentrations based on predictor variables including meteorological parameters and gaseous pollutants. Three different models were built: a multiple linear regression (MLR) model with lagged predictor variables (MLR1), an MLR model with lagged predictor variables and PM10 concentrations (MLR2), and a regression with time series error (RTSE) model. The findings revealed that humidity, temperature, wind speed, wind direction, carbon monoxide and ozone were the main factors explaining the PM10 variation in Peninsular Malaysia. Comparison among the three models showed that the MLR2 model was on the same level as the RTSE model in terms of forecasting accuracy, while the MLR1 model was the worst.
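
    A hedged sketch of the two model families compared in the study, fitted here to synthetic data with invented variable names: an MLR on 1-day-lagged predictors, and the same regression with AR(1) errors fitted via a state-space model.

    ```python
    # MLR with lagged predictors vs regression with time series (AR(1)) errors.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(2)
    n = 500
    dates = pd.date_range("2015-01-01", periods=n, freq="D")
    exog = pd.DataFrame({"humidity": rng.uniform(60, 95, n),
                         "temperature": rng.uniform(24, 34, n),
                         "wind_speed": rng.uniform(0.5, 6.0, n)}, index=dates)
    pm10 = (80 - 0.3 * exog["humidity"] + 2.0 * exog["temperature"]
            - 3.0 * exog["wind_speed"] + rng.normal(0, 5, n))

    exog_lag = exog.shift(1).dropna()                 # 1-day-lagged predictors
    y = pm10.loc[exog_lag.index]

    mlr = sm.OLS(y, sm.add_constant(exog_lag)).fit()                   # MLR1-style
    rtse = SARIMAX(y, exog=exog_lag, order=(1, 0, 0)).fit(disp=False)  # RTSE-style
    print(round(mlr.rsquared, 3), round(rtse.aic, 1))
    ```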

  17. Modeling Biogeochemical Cycling of Heavy Metals in Lake Coeur d'Alene Sediments

    NASA Astrophysics Data System (ADS)

    Sengor, S. S.; Spycher, N.; Belding, E.; Curthoys, K.; Ginn, T. R.

    2005-12-01

    Mining of precious metals since the late 1800s has left Lake Coeur d'Alene (LCdA) sediments heavily enriched with toxic metals, including Cd, Cu, Pb, and Zn. Indigenous microbes, however, are capable of catalyzing reactions that detoxify the benthic and aqueous lake environments, and thus constitute an important driving component in the biogeochemical cycles of these metals. Here we report on the development of a quantitative model of transport, fate, exposure and effects of toxic compounds on benthic microbial communities at LCdA. First, chemical data from the LCdA area have been compiled from multiple sources to investigate trends in chemical occurrence, as well as to define model boundary conditions. The model is structured as a 1-D diffusive reactive transport model to simulate the spatial and temporal distribution of metals through the benthic sediments. Inorganic reaction processes included in the model are aqueous speciation, surface complexation, mineral precipitation/dissolution and abiotic redox reactions. Simulations with and without surface complexation are carried out to evaluate the effect of sorption and the conservative behaviour of metals within the benthic sediments under abiotic and purely diffusive transport. The 1-D inorganic diffusive transport model is then coupled to a biotic reaction network including consortium biodegradation kinetics with multiple electron acceptors, product toxicity, and energy partitioning. Multiyear simulations are performed, with water column chemistry established as a boundary condition from extant data, to explore the role of biogeochemical dynamics on benthic fluxes of metals in the long term.
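
    A toy sketch of the transport piece only, with invented parameters: explicit finite-difference diffusion of one dissolved metal through a 1-D sediment column with a first-order sorption sink. The study's model additionally couples speciation, surface complexation, redox chemistry and biotic kinetics.

    ```python
    # 1-D diffusion with a first-order sorption sink (explicit scheme).
    import numpy as np

    nz, dz, dt = 50, 0.01, 50.0       # 50 cells of 1 cm; 50 s time step
    D, k_sorb = 1e-9, 1e-7            # diffusivity (m^2/s), sorption rate (1/s)
    c = np.zeros(nz)                  # dissolved metal concentration (mol/m^3)
    c_top = 1.0                       # fixed water-column boundary value

    for _ in range(100_000):          # roughly 58 days of simulated time
        c[0] = c_top                                        # top boundary
        lap = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dz**2
        lap[0] = lap[-1] = 0.0                              # interior cells only
        c += dt * (D * lap - k_sorb * c)
        c[-1] = c[-2]                                       # zero-gradient bottom

    print(c[:10].round(4))            # profile in the uppermost sediments
    ```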

  18. Listening to food workers: Factors that impact proper health and hygiene practice in food service

    PubMed Central

    Clegg Smith, Katherine; Neff, Roni A.; Pollack, Keshia M.; Ensminger, Margaret

    2015-01-01

    Background: Foodborne disease is a significant problem worldwide. Research exploring sources of outbreaks indicates a pronounced role for food workers' improper health and hygiene practice. Objective: To investigate food workers' perceptions of factors that impact proper food safety practice. Method: Interviews with food service workers in Baltimore, MD, USA discussing food safety practices and factors that impact implementation in the workplace. A social ecological model organizes multiple levels of influence on health and hygiene behavior. Results: Issues raised by interviewees include factors across the five levels of the social ecological model, and confirm findings from previous work. Interviews also reveal many factors not highlighted in prior work, including issues with food service policies and procedures, working conditions (e.g., pay and benefits), community resources, and state and federal policies. Conclusion: Food safety interventions should adopt an ecological orientation that accounts for factors at multiple levels, including workers' social and structural context, that impact food safety practice. PMID:26243248

  19. Modulation by Melatonin of the Pathogenesis of Inflammatory Autoimmune Diseases

    PubMed Central

    Lin, Gu-Jiun; Huang, Shing-Hwa; Chen, Shyi-Jou; Wang, Chih-Hung; Chang, Deh-Ming; Sytwu, Huey-Kang

    2013-01-01

    Melatonin is the major secretory product of the pineal gland during the night and has multiple activities including the regulation of circadian and seasonal rhythms, and antioxidant and anti-inflammatory effects. It also possesses the ability to modulate immune responses by regulation of the T helper 1/2 balance and cytokine production. Autoimmune diseases, which result from the activation of immune cells by autoantigens released from normal tissues, affect around 5% of the population. Activation of autoantigen-specific immune cells leads to subsequent damage of target tissues by these activated cells. Melatonin therapy has been investigated in several animal models of autoimmune disease, where it has a beneficial effect in a number of models excepting rheumatoid arthritis, and has been evaluated in clinical autoimmune diseases including rheumatoid arthritis and ulcerative colitis. This review summarizes and highlights the role and the modulatory effects of melatonin in several inflammatory autoimmune diseases including multiple sclerosis, systemic lupus erythematosus, rheumatoid arthritis, type 1 diabetes mellitus, and inflammatory bowel disease. PMID:23727938

  20. Advanced wireless mobile collaborative sensing network for tactical and strategic missions

    NASA Astrophysics Data System (ADS)

    Xu, Hao

    2017-05-01

    In this paper, an advanced wireless mobile collaborative sensing network is developed. By properly combining wireless sensor networks, emerging mobile robots and multi-antenna sensing/communication techniques, we demonstrate the superiority of the developed sensing network. Concretely, heterogeneous mobile robots, including unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs), are equipped with multi-modal sensors and wireless transceiver antennas. Through real-time collaborative formation control, multiple mobile robots can form the formation that provides the most accurate sensing results. Forming a team of multiple mobile robots can also construct a multiple-input multiple-output (MIMO) communication system that provides a reliable, high-performance communication network.

  1. Modelling multi-hazard hurricane damages on an urbanized coast with a Bayesian Network approach

    USGS Publications Warehouse

    van Verseveld, H.C.W.; Van Dongeren, A. R.; Plant, Nathaniel G.; Jäger, W.S.; den Heijer, C.

    2015-01-01

    Hurricane flood impacts to residential buildings in coastal zones are caused by a number of hazards, such as inundation, overflow currents, erosion, and wave attack. However, traditional hurricane damage models typically make use of stage-damage functions, where the stage is related to flooding depth only. Moreover, these models are deterministic and do not consider the large amount of uncertainty associated with both the processes themselves and with the predictions. This uncertainty becomes increasingly important when multiple hazards (flooding, wave attack, erosion, etc.) are considered simultaneously. This paper focusses on establishing relationships between observed damage and multiple hazard indicators in order to make better probabilistic predictions. The concept consists of (1) determining Local Hazard Indicators (LHIs) from a hindcasted storm with use of a nearshore morphodynamic model, XBeach, and (2) coupling these LHIs and building characteristics to the observed damages. We chose a Bayesian Network approach in order to make this coupling and used the LHIs ‘Inundation depth’, ‘Flow velocity’, ‘Wave attack’, and ‘Scour depth’ to represent flooding, current, wave impacts, and erosion related hazards. The coupled hazard model was tested against four thousand damage observations from a case site at the Rockaway Peninsula, NY, that was impacted by Hurricane Sandy in late October, 2012. The model was able to accurately distinguish ‘Minor damage’ from all other outcomes 95% of the time and could distinguish areas that were affected by the storm, but not severely damaged, 68% of the time. For the most heavily damaged buildings (‘Major Damage’ and ‘Destroyed’), projections of the expected damage underestimated the observed damage. The model demonstrated that including multiple hazards doubled the prediction skill, with Log-Likelihood Ratio test (a measure of improved accuracy and reduction in uncertainty) scores between 0.02 and 0.17 when only one hazard is considered and a score of 0.37 when multiple hazards are considered simultaneously. The LHIs with the most predictive skill were ‘Inundation depth’ and ‘Wave attack’. The Bayesian Network approach has several advantages over the market-standard stage-damage functions: the predictive capacity of multiple indicators can be combined; probabilistic predictions can be obtained, which include uncertainty; and quantitative as well as descriptive information can be used simultaneously.
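
    A simplified stand-in for the coupling step, not the authors' Bayesian Network: damage observations (synthetic here, with assumed column names) are discretized and used to estimate a conditional probability table of damage class given two Local Hazard Indicators.

    ```python
    # P(damage class | inundation bin, wave-attack bin) from observation records.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    n = 4000
    obs = pd.DataFrame({"inundation_depth_m": rng.gamma(2.0, 0.5, n),
                        "wave_attack_m": rng.gamma(1.5, 0.4, n)})
    score = (obs["inundation_depth_m"] + 1.5 * obs["wave_attack_m"]
             + rng.normal(0, 0.5, n))
    obs["damage_class"] = pd.cut(score, bins=[-1, 1.0, 2.5, 100],
                                 labels=["minor", "major", "destroyed"])

    obs["inund_bin"] = pd.cut(obs["inundation_depth_m"], [0, 0.5, 1.5, 100],
                              labels=["low", "mid", "high"], include_lowest=True)
    obs["wave_bin"] = pd.cut(obs["wave_attack_m"], [0, 0.3, 1.0, 100],
                             labels=["low", "mid", "high"], include_lowest=True)

    cpt = (obs.groupby(["inund_bin", "wave_bin"], observed=True)["damage_class"]
              .value_counts(normalize=True)
              .unstack(fill_value=0.0))
    print(cpt.round(2))
    ```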

  2. Maturity of hospital information systems: Most important influencing factors.

    PubMed

    Vidal Carvalho, João; Rocha, Álvaro; Abreu, António

    2017-07-01

    Maturity models facilitate organizational management, including information systems management, with hospital organizations no exception. This article puts forth a study carried out with a group of experts in the field of hospital information systems management with a view to identifying the main influencing factors to be included in an encompassing maturity model for hospital information systems management. This study is based on the results of a literature review, which identified maturity models in the health field and relevant influencing factors. The development of this model is justified to the extent that the available maturity models for the hospital information systems management field reveal multiple limitations, including lack of detail, absence of tools to determine their maturity and lack of characterization for stages of maturity structured by different influencing factors.

  3. Spray combustion model improvement study, 1

    NASA Technical Reports Server (NTRS)

    Chen, C. P.; Kim, Y. M.; Shang, H. M.

    1993-01-01

    This study involves the development of numerical and physical modeling in spray combustion. These modeling efforts are mainly motivated to improve the physical submodels of turbulence, combustion, atomization, dense spray effects, and group vaporization. The present mathematical formulation can be easily implemented in any time-marching multiple pressure correction methodology, such as the MAST code. A sequence of validation cases includes nonevaporating, evaporating, and burning dense sprays.

  4. A conceptual model of the role of complexity in the care of patients with multiple chronic conditions.

    PubMed

    Grembowski, David; Schaefer, Judith; Johnson, Karin E; Fischer, Henry; Moore, Susan L; Tai-Seale, Ming; Ricciardi, Richard; Fraser, James R; Miller, Donald; LeRoy, Lisa

    2014-03-01

    Effective healthcare for people with multiple chronic conditions (MCC) is a US priority, but the inherent complexity makes both research and delivery of care particularly challenging. As part of AHRQ Multiple Chronic Conditions Research Network (MCCRN) efforts, the Network developed a conceptual model to guide research in this area. To synthesize methodological and topical issues relevant to MCC patient care into a framework that can improve the delivery of care and advance future research about caring for patients with MCC. The Network synthesized essential constructs for MCC research identified from roundtable discussion, input from expert advisors, and previously published models. The AHRQ MCCRN conceptual model defines complexity as the gap between patient needs and healthcare services, taking into account both the multiple considerations that affect the needs of MCC patients, as well as the contextual factors that influence service delivery. The model reframes processes and outcomes to include not only clinical care quality and experience, but also patient health, well being, and quality of life. The single-condition paradigm for treating needs one-by-one falls apart and highlights the need for care systems to address dynamic patient needs. Defining complexity in terms of the misalignment between patient needs and services offers new insights in how to research and develop solutions to patient care needs.

  5. Quantifying properties of hot and dense QCD matter through systematic model-to-data comparison

    DOE PAGES

    Bernhard, Jonah E.; Marcy, Peter W.; Coleman-Smith, Christopher E.; ...

    2015-05-22

    We systematically compare an event-by-event heavy-ion collision model to data from the CERN Large Hadron Collider. Using a general Bayesian method, we probe multiple model parameters including fundamental quark-gluon plasma properties such as the specific shear viscosity η/s, calibrate the model to optimally reproduce experimental data, and extract quantitative constraints for all parameters simultaneously. Furthermore, the method is universal and easily extensible to other data and collision models.
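
    A toy sketch of the calibration idea only, not the actual analysis: a single parameter (standing in for η/s) is given a uniform prior, a Gaussian likelihood is evaluated against two invented "observables", and the posterior is computed on a grid. The real study calibrates many parameters jointly against emulated model output.

    ```python
    # Grid-based Bayesian calibration of one model parameter (numbers invented).
    import numpy as np

    def model_observables(p):
        # Hypothetical smooth response of two observables to the parameter.
        return np.array([0.30 - 0.5 * p, 0.10 + 0.2 * p])

    data = np.array([0.25, 0.12])
    sigma = np.array([0.02, 0.01])

    grid = np.linspace(0.0, 0.4, 401)                # uniform prior support
    log_like = np.array([-0.5 * np.sum(((model_observables(p) - data) / sigma) ** 2)
                         for p in grid])
    post = np.exp(log_like - log_like.max())
    post /= post.sum()

    print("posterior mean:", round(float((grid * post).sum()), 3))
    ```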

  6. Soil Cd, Cr, Cu, Ni, Pb and Zn sorption and retention models using SVM: Variable selection and competitive model.

    PubMed

    González Costa, J J; Reigosa, M J; Matías, J M; Covelo, E F

    2017-09-01

    The aim of this study was to model the sorption and retention of Cd, Cu, Ni, Pb and Zn in soils. To that extent, the sorption and retention of these metals were studied and the soil characterization was performed separately. Multiple stepwise regression was used to produce multivariate models with linear techniques and with support vector machines, all of which included 15 explanatory variables characterizing soils. When the R-squared values are represented, two different groups are noticed. Cr, Cu and Pb sorption and retention show a higher R-squared; the most explanatory variables being humified organic matter, Al oxides and, in some cases, cation-exchange capacity (CEC). The other group of metals (Cd, Ni and Zn) shows a lower R-squared, and clays are the most explanatory variables, including a percentage of vermiculite and slime. In some cases, quartz, plagioclase or hematite percentages also show some explanatory capacity. Support Vector Machine (SVM) regression shows that the different models are not as regular as in multiple regression in terms of number of variables, the regression for nickel adsorption being the one with the highest number of variables in its optimal model. On the other hand, there are cases where the most explanatory variables are the same for two metals, as it happens with Cd and Cr adsorption. A similar adsorption mechanism is thus postulated. These patterns of the introduction of variables in the model allow us to create explainability sequences. Those which are the most similar to the selectivity sequences obtained by Covelo (2005) are Mn oxides in multiple regression and change capacity in SVM. Among all the variables, the only one that is explanatory for all the metals after applying the maximum parsimony principle is the percentage of sand in the retention process. In the competitive model arising from the aforementioned sequences, the most intense competitiveness for the adsorption and retention of different metals appears between Cr and Cd, Cu and Zn in multiple regression; and between Cr and Cd in SVM regression. Copyright © 2017 Elsevier B.V. All rights reserved.
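
    An illustrative sketch only, on synthetic soil data with invented variable names: support vector regression for one metal combined with a simple greedy forward-selection loop, standing in for the stepwise multivariate procedure described above.

    ```python
    # SVR with greedy forward selection of soil variables (illustrative).
    import numpy as np
    import pandas as pd
    from sklearn.svm import SVR
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    n = 150
    soils = pd.DataFrame({"humified_om": rng.uniform(0, 10, n),
                          "al_oxides": rng.uniform(0, 5, n),
                          "cec": rng.uniform(2, 40, n),
                          "clay_pct": rng.uniform(5, 60, n),
                          "sand_pct": rng.uniform(10, 80, n)})
    y = (3.0 * soils["humified_om"] + 1.5 * soils["al_oxides"]
         + 0.1 * soils["cec"] + rng.normal(0, 1.0, n))

    candidates, selected, best_score = list(soils.columns), [], -np.inf
    improved = True
    while improved and candidates:
        improved = False
        for var in list(candidates):
            model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
            score = cross_val_score(model, soils[selected + [var]], y,
                                    cv=5, scoring="r2").mean()
            if score > best_score + 1e-3:          # keep clear improvements only
                best_score, best_var, improved = score, var, True
        if improved:
            selected.append(best_var)
            candidates.remove(best_var)

    print(selected, round(best_score, 3))
    ```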

  7. Animal and in silico models for the study of sarcomeric cardiomyopathies

    PubMed Central

    Duncker, Dirk J.; Bakkers, Jeroen; Brundel, Bianca J.; Robbins, Jeff; Tardiff, Jil C.; Carrier, Lucie

    2015-01-01

    Over the past decade, our understanding of cardiomyopathies has improved dramatically, due to improvements in screening and detection of gene defects in the human genome as well as a variety of novel animal models (mouse, zebrafish, and drosophila) and in silico computational models. These novel experimental tools have created a platform that is highly complementary to the naturally occurring cardiomyopathies in cats and dogs that had been available for some time. A fully integrative approach, which incorporates all these modalities, is likely required for significant steps forward in understanding the molecular underpinnings and pathogenesis of cardiomyopathies. Finally, novel technologies, including CRISPR/Cas9, which have already been proved to work in zebrafish, are currently being employed to engineer sarcomeric cardiomyopathy in larger animals, including pigs and non-human primates. In the mouse, the increased speed with which these techniques can be employed to engineer precise ‘knock-in’ models that previously took years to make via multiple rounds of homologous recombination-based gene targeting promises multiple and precise models of human cardiac disease for future study. Such novel genetically engineered animal models recapitulating human sarcomeric protein defects will help bridging the gap to translate therapeutic targets from small animal and in silico models to the human patient with sarcomeric cardiomyopathy. PMID:25600962

  8. A European model and case studies for aggregate exposure assessment of pesticides.

    PubMed

    Kennedy, Marc C; Glass, C Richard; Bokkers, Bas; Hart, Andy D M; Hamey, Paul Y; Kruisselbrink, Johannes W; de Boer, Waldo J; van der Voet, Hilko; Garthwaite, David G; van Klaveren, Jacob D

    2015-05-01

    Exposures to plant protection products (PPPs) are assessed using risk analysis methods to protect public health. Traditionally, single sources, such as food or individual occupational sources, have been addressed. In reality, individuals can be exposed simultaneously to multiple sources. Improved regulation therefore requires the development of new tools for estimating the population distribution of exposures aggregated within an individual. A new aggregate model is described, which allows individual users to include as much, or as little, information as is available or relevant for their particular scenario. Depending on the inputs provided by the user, the outputs can range from simple deterministic values through to probabilistic analyses including characterisations of variability and uncertainty. Exposures can be calculated for multiple compounds, routes and sources of exposure. The aggregate model links to the cumulative dietary exposure model developed in parallel and is implemented in the web-based software tool MCRA. Case studies are presented to illustrate the potential of this model, with inputs drawn from existing European data sources and models. These cover exposures to UK arable spray operators, Italian vineyard spray operators, Netherlands users of a consumer spray and UK bystanders/residents. The model could also be adapted to handle non-PPP compounds. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
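
    A conceptual sketch of the aggregation step, not the MCRA implementation: per-individual exposures from several routes and sources are sampled and summed in a Monte Carlo loop so that upper percentiles of the aggregate distribution can be reported. All distributions and parameters are invented.

    ```python
    # Monte Carlo aggregation of exposure across routes for one compound.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000                                       # simulated individuals

    dietary = rng.lognormal(-2.0, 0.8, n)                          # mg/kg bw/day
    dermal_operator = rng.lognormal(-3.0, 1.0, n) * rng.binomial(1, 0.05, n)
    inhalation_resident = rng.lognormal(-4.5, 0.7, n)

    aggregate = dietary + dermal_operator + inhalation_resident
    print("median:", np.median(aggregate))
    print("99.9th percentile:", np.percentile(aggregate, 99.9))
    ```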

  9. A Steady State and Quasi-Steady Interface Between the Generalized Fluid System Simulation Program and the SINDA/G Thermal Analysis Program

    NASA Technical Reports Server (NTRS)

    Schallhorn, Paul; Majumdar, Alok; Tiller, Bruce

    2001-01-01

    A general-purpose, one-dimensional fluid flow code is currently being interfaced with the thermal analysis program SINDA/G. The flow code, GFSSP, is capable of analyzing steady-state and transient flow in a complex network. The flow code is capable of modeling several physical phenomena, including compressibility effects, phase changes, body forces (such as gravity and centrifugal forces) and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development is conducted in multiple phases. This paper describes the first phase of the interface, which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling.

  10. System, method and apparatus for generating phrases from a database

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W. (Inventor)

    2004-01-01

    Phrase generation is a method of generating sequences of terms, such as phrases, that may occur within a database of subsets containing sequences of terms, such as text. A database is provided and a relational model of the database is created. A query is then input. The query includes a term or a sequence of terms or multiple individual terms or multiple sequences of terms or combinations thereof. Next, several sequences of terms that are contextually related to the query are assembled from contextual relations in the model of the database. The sequences of terms are then sorted and output. Phrase generation can also be an iterative process used to produce sequences of terms from a relational model of a database.
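
    A minimal sketch of the idea, not the patented system: a simple contextual model of term sequences is built from a small text "database", and a query term is extended with the terms most strongly associated with it in context.

    ```python
    # Contextual phrase generation from a toy text database.
    from collections import Counter, defaultdict

    docs = ["engine fuel flow anomaly detected during climb",
            "fuel flow sensor reading anomaly on engine two",
            "climb performance reduced with high fuel flow"]

    follows = defaultdict(Counter)          # term -> counts of following terms
    for doc in docs:
        terms = doc.split()
        for a, b in zip(terms, terms[1:]):
            follows[a][b] += 1

    def generate_phrase(query, length=3):
        phrase = [query]
        while len(phrase) < length and follows[phrase[-1]]:
            phrase.append(follows[phrase[-1]].most_common(1)[0][0])
        return " ".join(phrase)

    print(generate_phrase("fuel"))          # e.g. "fuel flow anomaly"
    ```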

  11. Rocket engine diagnostics using qualitative modeling techniques

    NASA Technical Reports Server (NTRS)

    Binder, Michael; Maul, William; Meyer, Claudia; Sovie, Amy

    1992-01-01

    Researchers at NASA Lewis Research Center are presently developing qualitative modeling techniques for automated rocket engine diagnostics. A qualitative model of a turbopump interpropellant seal system has been created. The qualitative model describes the effects of seal failures on the system steady-state behavior. This model is able to diagnose the failure of particular seals in the system based on anomalous temperature and pressure values. The anomalous values input to the qualitative model are generated using numerical simulations. Diagnostic test cases include both single and multiple seal failures.

  12. Rocket engine diagnostics using qualitative modeling techniques

    NASA Technical Reports Server (NTRS)

    Binder, Michael; Maul, William; Meyer, Claudia; Sovie, Amy

    1992-01-01

    Researchers at NASA Lewis Research Center are presently developing qualitative modeling techniques for automated rocket engine diagnostics. A qualitative model of a turbopump interpropellant seal system was created. The qualitative model describes the effects of seal failures on the system steady state behavior. This model is able to diagnose the failure of particular seals in the system based on anomalous temperature and pressure values. The anomalous values input to the qualitative model are generated using numerical simulations. Diagnostic test cases include both single and multiple seal failures.
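
    A toy rule-based sketch of the qualitative-diagnosis idea described in the two records above (an assumed form, not the NASA model): each seal failure is given a signature of qualitative deviations in sensed temperatures and pressures, and observed anomalies are matched against those signatures, so multiple simultaneous failures can be reported. Sensor and seal names are hypothetical.

    ```python
    # Match observed qualitative anomalies against seal-failure signatures.
    FAILURE_SIGNATURES = {
        "primary oxidizer seal": {"drain_pressure": "high", "drain_temp": "high"},
        "intermediate seal":     {"drain_pressure": "low",  "purge_pressure": "low"},
        "secondary fuel seal":   {"drain_temp": "high",     "purge_pressure": "high"},
    }

    def diagnose(observed):
        """Return seals whose full signature is contained in the observations."""
        return [seal for seal, signature in FAILURE_SIGNATURES.items()
                if all(observed.get(k) == v for k, v in signature.items())]

    print(diagnose({"drain_pressure": "high", "drain_temp": "high",
                    "purge_pressure": "high"}))    # two matching failure modes
    ```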

  13. Predictive Seagrass Habitat Model

    EPA Science Inventory

    Restoration of ecosystem services provided by seagrass habitats in estuaries requires a firm understanding of the modes of action of multiple interacting stressors including nutrients, climate change, coastal land-use change, and habitat modification. We explored the application...

  14. Optimum Vehicle Component Integration with InVeST (Integrated Vehicle Simulation Testbed)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, W; Paddack, E; Aceves, S

    2001-12-27

    We have developed an Integrated Vehicle Simulation Testbed (InVeST). InVeST is based on the concept of Co-simulation, and it allows the development of virtual vehicles that can be analyzed and optimized as an overall integrated system. The virtual vehicle is defined by selecting different vehicle components from a component library. Vehicle component models can be written in multiple programming languages running on different computer platforms. At the same time, InVeST provides full protection for proprietary models. Co-simulation is a cost-effective alternative to competing methodologies, such as developing a translator or selecting a single programming language for all vehicle components. InVeST has been recently demonstrated using a transmission model and a transmission controller model. The transmission model was written in SABER and ran on a Sun/Solaris workstation, while the transmission controller was written in MATRIXx and ran on a PC running Windows NT. The demonstration was successfully performed. Future plans include the applicability of Co-simulation and InVeST to analysis and optimization of multiple complex systems, including those of Intelligent Transportation Systems.

  15. Mixed raster content (MRC) model for compound image compression

    NASA Astrophysics Data System (ADS)

    de Queiroz, Ricardo L.; Buckley, Robert R.; Xu, Ming

    1998-12-01

    This paper will describe the Mixed Raster Content (MRC) method for compressing compound images, containing both binary text and continuous-tone images. A single compression algorithm that simultaneously meets the requirements for both text and image compression has been elusive. MRC takes a different approach. Rather than using a single algorithm, MRC uses a multi-layered imaging model for representing the results of multiple compression algorithms, including ones developed specifically for text and for images. As a result, MRC can combine the best of existing or new compression algorithms and offer different quality-compression ratio tradeoffs. The algorithms used by MRC set the lower bound on its compression performance. Compared to existing algorithms, MRC has some image-processing overhead to manage multiple algorithms and the imaging model. This paper will develop the rationale for the MRC approach by describing the multi-layered imaging model in light of a rate-distortion trade-off. Results will be presented comparing images compressed using MRC, JPEG and state-of-the-art wavelet algorithms such as SPIHT. MRC has been approved or proposed as an architectural model for several standards, including ITU Color Fax, IETF Internet Fax, and JPEG 2000.

  16. Generation of animation sequences of three dimensional models

    NASA Technical Reports Server (NTRS)

    Poi, Sharon (Inventor); Bell, Brad N. (Inventor)

    1990-01-01

    The invention is directed toward a method and apparatus for generating an animated sequence through the movement of three-dimensional graphical models. A plurality of pre-defined graphical models are stored and manipulated in response to interactive commands or by means of a pre-defined command file. The models may be combined as part of a hierarchical structure to represent physical systems without need to create a separate model which represents the combined system. System motion is simulated through the introduction of translation, rotation and scaling parameters upon a model within the system. The motion is then transmitted down through the system hierarchy of models in accordance with hierarchical definitions and joint movement limitations. The present invention also calls for a method of editing hierarchical structure in response to interactive commands or a command file such that a model may be included, deleted, copied or moved within multiple system model hierarchies. The present invention also calls for the definition of multiple viewpoints or cameras which may exist as part of a system hierarchy or as an independent camera. The simulated movement of the models and systems is graphically displayed on a monitor and a frame is recorded by means of a video controller. Multiple movement and hierarchy manipulations are then recorded as a sequence of frames which may be played back as an animation sequence on a video cassette recorder.
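
    A conceptual sketch, not the patented implementation: a small hierarchy of model nodes in which a transform applied to a parent propagates to its children, which is the core mechanism behind the motion "transmitted down through the system hierarchy" described above.

    ```python
    # Hierarchical transform propagation with 4x4 homogeneous matrices.
    import numpy as np

    class Node:
        def __init__(self, name, parent=None):
            self.name, self.parent, self.children = name, parent, []
            self.local = np.eye(4)                     # local transform
            if parent:
                parent.children.append(self)

        def world(self):
            if self.parent is None:
                return self.local
            return self.parent.world() @ self.local    # parent motion propagates

    def translate(node, dx, dy, dz):
        t = np.eye(4)
        t[:3, 3] = [dx, dy, dz]
        node.local = node.local @ t

    arm = Node("arm")
    hand = Node("hand", parent=arm)
    translate(arm, 1.0, 0.0, 0.0)        # moving the arm also moves the hand
    print(hand.world()[:3, 3])           # -> [1. 0. 0.]
    ```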

  17. [Patient-centred prescription model to improve adequate prescription and therapeutic adherence in patients with multiple disorders].

    PubMed

    Espaulella-Panicot, Joan; Molist-Brunet, Núria; Sevilla-Sánchez, Daniel; González-Bueno, Javier; Amblàs-Novellas, Jordi; Solà-Bonada, Núria; Codina-Jané, Carles

    Patients with multiple disorders who are on multiple medications often present clinical complexity, defined as a situation of uncertainty conditioned by difficulties in establishing a situational diagnosis and decision-making. The patient-centred care approach in this population group seems to be one of the best therapeutic options. In this context, the preparation of an individualised therapeutic plan is the most relevant practical element, where the pharmacological plan maintains an important role. There has recently been a significant increase in knowledge in the area of adequacy of prescription and adherence. In this context, a model must be found that incorporates this knowledge into professionals' clinical practice. Person-centred prescription is a medication review model that includes different strategies in a single intervention. It is performed by a multidisciplinary team, and allows them to adapt the pharmacological plan of patients with clinical complexity. Copyright © 2017 SEGG. Published by Elsevier España, S.L.U. All rights reserved.

  18. Method, system, and computer-readable medium for determining performance characteristics of an object undergoing one or more arbitrary aging conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gering, Kevin L.

    A method, system, and computer-readable medium are described for characterizing performance loss of an object undergoing an arbitrary aging condition. Baseline aging data may be collected from the object for at least one known baseline aging condition over time, baseline multiple sigmoid model parameters may be determined from the baseline data, and performance loss of the object may be determined over time through multiple sigmoid model parameters associated with the object undergoing the arbitrary aging condition, using a differential deviation-from-baseline approach relative to the baseline multiple sigmoid model parameters. The system may include an object, monitoring hardware configured to sample performance characteristics of the object, and a processor coupled to the monitoring hardware. The processor is configured to determine performance loss for the arbitrary aging condition from a comparison of the performance characteristics of the object deviating from baseline performance characteristics associated with a baseline aging condition.
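
    For illustration, the sketch below fits a sum of two sigmoids to synthetic baseline aging data, which is the flavor of "multiple sigmoid model" the record refers to; the two-term form, parameter names, and synthetic data are assumptions of this sketch rather than the patented formulation.

```python
import numpy as np
from scipy.optimize import curve_fit

def multiple_sigmoid(t, a1, k1, t1, a2, k2, t2):
    """Sum of two sigmoids as a minimal stand-in for a multiple sigmoid
    model of cumulative performance loss over time."""
    return (a1 / (1 + np.exp(-k1 * (t - t1)))
            + a2 / (1 + np.exp(-k2 * (t - t2))))

# Fit baseline aging data; a new aging condition would then be compared
# against this baseline curve (the deviation-from-baseline idea).
t_base = np.linspace(0, 100, 50)
loss_base = multiple_sigmoid(t_base, 0.1, 0.1, 30, 0.2, 0.05, 80) \
            + 0.005 * np.random.default_rng(0).normal(size=t_base.size)
params, _ = curve_fit(multiple_sigmoid, t_base, loss_base,
                      p0=[0.1, 0.1, 30, 0.2, 0.05, 80])
```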

  19. Predictors of transitions from single to multiple job holding: Results of a longitudinal study among employees aged 45-64 in the Netherlands.

    PubMed

    Bouwhuis, Stef; Geuskens, Goedele A; Boot, Cécile R L; Bongers, Paulien M; van der Beek, Allard J

    2017-08-01

    The aim was to construct prediction models for transitions to combination multiple job holding (MJH) (multiple jobs as an employee) and hybrid MJH (being an employee and self-employed) among employees aged 45-64. A total of 5187 employees in the Netherlands completed online questionnaires annually between 2010 and 2013. We applied logistic regression analyses with a backward elimination strategy to construct prediction models. Transitions to combination MJH and hybrid MJH were best predicted by a combination of factors including: demographics, health and mastery, work characteristics, work history, skills and knowledge, social factors, and financial factors. Not having a permanent contract and a poor household financial situation predicted both transitions. Some predictors only predicted combination MJH, e.g., working part-time, or hybrid MJH, e.g., work-home interference. A wide variety of factors predict combination MJH and/or hybrid MJH. The prediction model approach allowed for the identification of predictors that have not been previously studied. © 2017 Wiley Periodicals, Inc.

  20. Ship Detection Based on Multiple Features in Random Forest Model for Hyperspectral Images

    NASA Astrophysics Data System (ADS)

    Li, N.; Ding, L.; Zhao, H.; Shi, J.; Wang, D.; Gong, X.

    2018-04-01

    A novel method for detecting ships is proposed that aims to make full use of both the spatial and spectral information in hyperspectral images. First, a band with a high signal-to-noise ratio in the near-infrared or short-wave infrared range is used to segment land and sea with the Otsu threshold segmentation method. Second, multiple features, including spectral and texture features, are extracted from the hyperspectral images: principal components analysis (PCA) is used to extract spectral features, while the Grey Level Co-occurrence Matrix (GLCM) is used to extract texture features. Finally, a Random Forest (RF) model is introduced to detect ships based on the extracted features. To illustrate the effectiveness of the method, we carry out experiments on EO-1 data, comparing a single feature with different combinations of multiple features. Compared with the traditional single-feature method and a Support Vector Machine (SVM) model, the proposed method can stably detect ships against complex backgrounds and can effectively improve the detection accuracy of ships.
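
    A simplified sketch of the described pipeline (Otsu sea/land segmentation, PCA spectral features, GLCM texture features, Random Forest classification). The array shapes, patch handling, and parameter choices are assumptions made for illustration, and scikit-image >= 0.19 is assumed for the graycomatrix/graycoprops names.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.feature import graycomatrix, graycoprops
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

def train_ship_detector(cube, nir_band, patches, labels):
    """cube: (rows, cols, bands) hyperspectral image; patches: list of
    (r0, r1, c0, c1) windows with one label per patch (ship / not ship)."""
    # 1) sea/land mask from a high-SNR NIR/SWIR band via Otsu thresholding
    band = cube[:, :, nir_band]
    sea_mask = band < threshold_otsu(band)

    # 2) spectral features: PCA of each patch's mean spectrum
    spectra = np.array([cube[r0:r1, c0:c1].reshape(-1, cube.shape[2]).mean(0)
                        for r0, r1, c0, c1 in patches])
    spec_feat = PCA(n_components=3).fit_transform(spectra)

    # 3) texture features: GLCM contrast/homogeneity on the NIR band
    tex_feat = []
    for r0, r1, c0, c1 in patches:
        patch = (band[r0:r1, c0:c1] / band.max() * 255).astype(np.uint8)
        glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256)
        tex_feat.append([graycoprops(glcm, 'contrast')[0, 0],
                         graycoprops(glcm, 'homogeneity')[0, 0]])

    # 4) Random Forest on the stacked spectral + texture features
    X = np.hstack([spec_feat, np.array(tex_feat)])
    return RandomForestClassifier(n_estimators=200).fit(X, labels), sea_mask
```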

  1. Sensor fusion V; Proceedings of the Meeting, Boston, MA, Nov. 15-17, 1992

    NASA Technical Reports Server (NTRS)

    Schenker, Paul S. (Editor)

    1992-01-01

    Topics addressed include 3D object perception, human-machine interface in multisensor systems, sensor fusion architecture, fusion of multiple and distributed sensors, interface and decision models for sensor fusion, computational networks, simple sensing for complex action, multisensor-based control, and metrology and calibration of multisensor systems. Particular attention is given to controlling 3D objects by sketching 2D views, the graphical simulation and animation environment for flexible structure robots, designing robotic systems from sensorimotor modules, cylindrical object reconstruction from a sequence of images, an accurate estimation of surface properties by integrating information using Bayesian networks, an adaptive fusion model for a distributed detection system, multiple concurrent object descriptions in support of autonomous navigation, robot control with multiple sensors and heuristic knowledge, and optical array detectors for image sensors calibration. (No individual items are abstracted in this volume)

  2. Study of optoelectronic switch for satellite-switched time-division multiple access

    NASA Technical Reports Server (NTRS)

    Su, Shing-Fong; Jou, Liz; Lenart, Joe

    1987-01-01

    The use of optoelectronic switching for satellite switched time division multiple access will improve the isolation and reduce the crosstalk of an IF switch matrix. The results are presented of a study on optoelectronic switching. Tasks include literature search, system requirements study, candidate switching architecture analysis, and switch model optimization. The results show that the power divider and crossbar switching architectures are good candidates for an IF switch matrix.

  3. Development of in Vivo Biomarkers for Progressive Tau Pathology after Traumatic Brain Injury

    DTIC Science & Technology

    2015-02-01

    Athletes in contact sports who have sustained multiple concussive traumatic brain injuries are at high risk for delayed, progressive neurological and ... or 'punch drunk' syndrome. US military personnel and others who have sustained multiple concussive traumatic brain injuries ... To date, none of the attempts to model progressive tau pathology after repetitive concussive TBI in mice has been optimal. Ongoing efforts include

  4. Simulation of multiple scattering in a medium with an anisotropic scattering pattern

    NASA Astrophysics Data System (ADS)

    Kuzmin, V. L.; Val'kov, A. Yu.

    2017-03-01

    Multiple backscattering from layers with various thicknesses, including the case of half-space, is numerically simulated and a comparative analysis is performed for systems with the anisotropy of scattering described by the Henyey-Greenstein and Rayleigh-Gans phase functions. It is shown that the intensity of backscattering depends on the form of the phase function; the difference between the intensities obtained within the two models increases with anisotropy.

  5. Relative efficiency of joint-model and full-conditional-specification multiple imputation when conditional models are compatible: The general location model.

    PubMed

    Seaman, Shaun R; Hughes, Rachael A

    2018-06-01

    Estimating the parameters of a regression model of interest is complicated by missing data on the variables in that model. Multiple imputation is commonly used to handle these missing data. Joint model multiple imputation and full-conditional specification multiple imputation are known to yield imputed data with the same asymptotic distribution when the conditional models of full-conditional specification are compatible with that joint model. We show that this asymptotic equivalence of imputation distributions does not imply that joint model multiple imputation and full-conditional specification multiple imputation will also yield asymptotically equally efficient inference about the parameters of the model of interest, nor that they will be equally robust to misspecification of the joint model. When the conditional models used by full-conditional specification multiple imputation are linear, logistic and multinomial regressions, these are compatible with a restricted general location joint model. We show that multiple imputation using the restricted general location joint model can be substantially more asymptotically efficient than full-conditional specification multiple imputation, but this typically requires very strong associations between variables. When associations are weaker, the efficiency gain is small. Moreover, full-conditional specification multiple imputation is shown to be potentially much more robust than joint model multiple imputation to misspecification of the restricted general location model when there is substantial missingness in the outcome variable.
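
    For readers wanting to experiment with full-conditional-specification-style imputation, the sketch below uses scikit-learn's IterativeImputer to cycle conditional regressions over a small synthetic data set with values missing at random. It covers only continuous variables with Bayesian ridge conditionals, so it does not reproduce the linear/logistic/multinomial setup or the restricted general location joint model analysed in the paper.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import BayesianRidge

# Synthetic data with ~20% of entries missing at random.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[rng.random(X.shape) < 0.2] = np.nan

# Full-conditional-specification flavour: each variable with missing values is
# regressed on the others, cycling until convergence; repeating with different
# seeds and posterior sampling yields m = 5 imputed data sets.
imputations = [IterativeImputer(estimator=BayesianRidge(),
                                sample_posterior=True,
                                random_state=m).fit_transform(X)
               for m in range(5)]
```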

  6. Examination of various turbulence models for application in liquid rocket thrust chambers

    NASA Technical Reports Server (NTRS)

    Hung, R. J.

    1991-01-01

    There is a large variety of turbulence models available. These models include direct numerical simulation, large eddy simulation, Reynolds stress/flux model, zero equation model, one equation model, two equation k-epsilon model, multiple-scale model, etc. Each turbulence model contains different physical assumptions and requirements. The natures of turbulence are randomness, irregularity, diffusivity and dissipation. The capabilities of the turbulence models, including physical strength, weakness, limitations, as well as numerical and computational considerations, are reviewed. Recommendations are made for the potential application of a turbulence model in thrust chamber and performance prediction programs. The full Reynolds stress model is recommended. In a workshop specifically called for the assessment of turbulence models for applications in liquid rocket thrust chambers, most of the experts present were also in favor of recommending the Reynolds stress model.

  7. Cometary atmospheres: Modeling the spatial distribution of observed neutral radicals

    NASA Technical Reports Server (NTRS)

    Combi, M. R.

    1985-01-01

    Progress on modeling the spatial distributions of cometary radicals is described. The Monte Carlo particle-trajectory model was generalized to include the full time dependencies of initial comet expansion velocities, nucleus vaporization rates, photochemical lifetimes and photon emission rates which enter the problem through the comet's changing heliocentric distance and velocity. The effects of multiple collisions in the transition zone from collisional coupling to true free flow were also included. Currently available observations of the spatial distributions of the neutral radicals, as well as the latest available photochemical data, were re-evaluated. Preliminary exploratory model results testing the effects of various processes on observable spatial distributions are also discussed.

  8. Properties of heuristic search strategies

    NASA Technical Reports Server (NTRS)

    Vanderbrug, G. J.

    1973-01-01

    A directed graph is used to model the search space of a state space representation with single input operators, an AND/OR graph is used for problem reduction representations, and a theorem proving graph is used for state space representations with multiple input operators. These three graph models and heuristic strategies for searching them are surveyed. The completeness, admissibility, and optimality properties of search strategies which use the evaluation function f = (1 - omega)g + (omega)h are presented and interpreted using a representation of the search process in the plane. The use of multiple output operators to imply dependent successors, and thus obtain a formalism which includes all three types of representations, is discussed.
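
    A minimal best-first search ordered by the evaluation function f = (1 - omega)g + (omega)h discussed above; setting omega = 0 recovers uniform-cost search and omega = 1 greedy best-first search. The graph interface (a successors function returning (node, cost) pairs, hashable nodes) is an assumption of this sketch.

```python
import heapq
import itertools

def weighted_search(start, goal, successors, h, omega=0.5):
    """Best-first search ordered by f = (1 - omega)*g + omega*h."""
    counter = itertools.count()                       # tie-breaker for the heap
    frontier = [(omega * h(start), 0.0, next(counter), start, [start])]
    best_g = {}
    while frontier:
        f, g, _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if best_g.get(node, float('inf')) <= g:
            continue                                  # already expanded more cheaply
        best_g[node] = g
        for nxt, cost in successors(node):
            g2 = g + cost
            f2 = (1 - omega) * g2 + omega * h(nxt)
            heapq.heappush(frontier, (f2, g2, next(counter), nxt, path + [nxt]))
    return None
```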

  9. Line spring model and its applications to part-through crack problems in plates and shells

    NASA Technical Reports Server (NTRS)

    Erdogan, Fazil; Aksel, Bulent

    1988-01-01

    The line spring model is described and extended to cover the problem of interaction of multiple internal and surface cracks in plates and shells. The shape functions for various related crack geometries obtained from the plane strain solution and the results of some multiple crack problems are presented. The problems considered include coplanar surface cracks on the same or opposite sides of a plate, nonsymmetrically located coplanar internal elliptic cracks, and in a very limited way the surface and corner cracks in a plate of finite width and a surface crack in a cylindrical shell with fixed end.

  10. Line Spring Model and Its Applications to Part-Through Crack Problems in Plates and Shells

    NASA Technical Reports Server (NTRS)

    Erdogan, F.; Aksel, B.

    1986-01-01

    The line spring model is described and extended to cover the problem of interaction of multiple internal and surface cracks in plates and shells. The shape functions for various related crack geometries obtained from the plane strain solution and the results of some multiple crack problems are presented. The problems considered include coplanar surface cracks on the same or opposite sides of a plate, nonsymmetrically located coplanar internal elliptic cracks, and in a very limited way the surface and corner cracks in a plate of finite width and a surface crack in a cylindrical shell with fixed end.

  11. Partial Validation of Multibody Program to Optimize Simulated Trajectories II (POST II) Parachute Simulation With Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Ben; Queen, Eric M.

    2002-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed. This capability uses the Program to Optimize Simulated Trajectories II (POST II). Previously, POST II had the ability to simulate multiple bodies without interacting forces. The current implementation is used for the simulation of parachute trajectories, in which the parachute and suspended bodies can be treated as rigid bodies. An arbitrary set of connecting lines can be included in the model and are treated as massless spring-dampers. This paper discusses details of the connection line modeling and results of several test cases used to validate the capability.
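
    A sketch of the massless spring-damper treatment of a connecting line: tension is proportional to the stretch beyond the unstretched length plus a damping term, and a slack line carries no force. The function signature and the tension-only behaviour are illustrative assumptions, not the POST II implementation.

```python
import numpy as np

def line_force(x_a, x_b, v_a, v_b, rest_len, k, c):
    """Force on body A from a massless spring-damper connection line running
    from A (position x_a, velocity v_a) to B (x_b, v_b)."""
    d = x_b - x_a
    length = np.linalg.norm(d)
    if length <= rest_len:                 # slack line transmits no force
        return np.zeros(3)
    u = d / length                         # unit vector from A toward B
    stretch_rate = np.dot(v_b - v_a, u)    # rate of change of line length
    tension = k * (length - rest_len) + c * stretch_rate
    return max(tension, 0.0) * u           # lines pull, never push
```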

  12. Statistical hadronization with exclusive channels in e+e- annihilation

    DOE PAGES

    Ferroni, L.; Becattini, F.

    2012-01-01

    We present a systematic analysis of exclusive hadronic channels in e+e- collisions at centre-of-mass energies between 2.1 and 2.6 GeV within the statistical hadronization model. Because of the low multiplicities involved, calculations have been carried out in the full microcanonical ensemble, including conservation of energy-momentum, angular momentum, parity, isospin, and all relevant charges. We show that the data are in overall good agreement with the model for an energy density of about 0.5 GeV/fm^3 and an extra strangeness suppression parameter γ_S ≈ 0.7, essentially the same values found with fits to inclusive multiplicities at higher energy.

  13. The FORE-SCE model: a practical approach for projecting land cover change using scenario-based modeling

    USGS Publications Warehouse

    Sohl, Terry L.; Sayler, Kristi L.; Drummond, Mark A.; Loveland, Thomas R.

    2007-01-01

    A wide variety of ecological applications require spatially explicit, historic, current, and projected land use and land cover data. The U.S. Land Cover Trends project is analyzing contemporary (1973–2000) land-cover change in the conterminous United States. The newly developed FORE-SCE model used Land Cover Trends data and theoretical, statistical, and deterministic modeling techniques to project future land cover change through 2020 for multiple plausible scenarios. Projected proportions of future land use were initially developed, and then sited on the lands with the highest potential for supporting that land use and land cover using a statistically based stochastic allocation procedure. Three scenarios of 2020 land cover were mapped for the western Great Plains in the US. The model provided realistic, high-resolution, scenario-based land-cover products suitable for multiple applications, including studies of climate and weather variability, carbon dynamics, and regional hydrology.

  14. From puddles to planet: modeling approaches to vector-borne diseases at varying resolution and scale.

    PubMed

    Eckhoff, Philip A; Bever, Caitlin A; Gerardin, Jaline; Wenger, Edward A; Smith, David L

    2015-08-01

    Since the original Ross-Macdonald formulations of vector-borne disease transmission, there has been a broad proliferation of mathematical models of vector-borne disease, but many of these models retain most to all of the simplifying assumptions of the original formulations. Recently, there has been a new expansion of mathematical frameworks that contain explicit representations of the vector life cycle including aquatic stages, multiple vector species, host heterogeneity in biting rate, realistic vector feeding behavior, and spatial heterogeneity. In particular, there are now multiple frameworks for spatially explicit dynamics with movements of vector, host, or both. These frameworks are flexible and powerful, but require additional data to take advantage of these features. For a given question posed, utilizing a range of models with varying complexity and assumptions can provide a deeper understanding of the answers derived from models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Risk assessment of pesticides and other stressors in bees: Principles, data gaps and perspectives from the European Food Safety Authority.

    PubMed

    Rortais, Agnès; Arnold, Gérard; Dorne, Jean-Lou; More, Simon J; Sperandio, Giorgio; Streissl, Franz; Szentes, Csaba; Verdonck, Frank

    2017-06-01

    Current approaches to risk assessment in bees do not take into account co-exposures from multiple stressors. The European Food Safety Authority (EFSA) is deploying resources and efforts to move towards a holistic risk assessment approach of multiple stressors in bees. This paper describes the general principles of pesticide risk assessment in bees, including recent developments at EFSA dealing with risk assessment of single and multiple pesticide residues and biological hazards. The EFSA Guidance Document on the risk assessment of plant protection products in bees highlights the need for the inclusion of an uncertainty analysis, other routes of exposures and multiple stressors such as chemical mixtures and biological agents. The EFSA risk assessment on the survival, spread and establishment of the small hive beetle, Aethina tumida, an invasive alien species, is provided with potential insights for other bee pests such as the Asian hornet, Vespa velutina. Furthermore, data gaps are identified at each step of the risk assessment, and recommendations are made for future research that could be supported under the framework of Horizon 2020. Finally, the recent work conducted at EFSA is presented, under the overarching MUST-B project ("EU efforts towards the development of a holistic approach for the risk assessment on MUltiple STressors in Bees") comprising a toolbox for harmonised data collection under field conditions and a mechanistic model to assess effects from pesticides and other stressors such as biological agents and beekeeping management practices, at the colony level and in a spatially complex landscape. Future perspectives at EFSA include the development of a data model to collate high quality data to calibrate and validate the model to be used as a regulatory tool. Finally, the evidence collected within the framework of MUST-B will support EFSA's activities on the development of a holistic approach to the risk assessment of multiple stressors in bees. In conclusion, EFSA calls for collaborative action at the EU level to establish a common and open access database to serve multiple purposes and different stakeholders. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  16. Identification and Correction of Additive and Multiplicative Spatial Biases in Experimental High-Throughput Screening.

    PubMed

    Mazoure, Bogdan; Caraus, Iurie; Nadon, Robert; Makarenkov, Vladimir

    2018-06-01

    Data generated by high-throughput screening (HTS) technologies are prone to spatial bias. Traditionally, bias correction methods used in HTS assume either a simple additive or, more recently, a simple multiplicative spatial bias model. These models do not, however, always provide an accurate correction of measurements in wells located at the intersection of rows and columns affected by spatial bias. The measurements in these wells depend on the nature of interaction between the involved biases. Here, we propose two novel additive and two novel multiplicative spatial bias models accounting for different types of bias interactions. We describe a statistical procedure that allows for detecting and removing different types of additive and multiplicative spatial biases from multiwell plates. We show how this procedure can be applied by analyzing data generated by the four HTS technologies (homogeneous, microorganism, cell-based, and gene expression HTS), the three high-content screening (HCS) technologies (area, intensity, and cell-count HCS), and the only small-molecule microarray technology available in the ChemBank small-molecule screening database. The proposed methods are included in the AssayCorrector program, implemented in R, and available on CRAN.
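
    As an illustration of correcting a simple additive row/column bias (one of the bias types the paper generalizes), the sketch below applies a median-polish-style adjustment to a plate matrix; the multiplicative case would divide out the estimated effects instead. This is not the AssayCorrector procedure, just a minimal stand-in.

```python
import numpy as np

def remove_additive_bias(plate, n_iter=10):
    """Median-polish-style correction for an additive row/column bias model:
    alternately subtract row and column medians while preserving the
    overall plate level."""
    corrected = plate.astype(float).copy()
    for _ in range(n_iter):
        row_med = np.median(corrected, axis=1, keepdims=True)
        corrected -= row_med - np.median(row_med)      # remove row effects
        col_med = np.median(corrected, axis=0, keepdims=True)
        corrected -= col_med - np.median(col_med)      # remove column effects
    return corrected
```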

  17. Less is more? Assessing the validity of the ICD-11 model of PTSD across multiple trauma samples

    PubMed Central

    Hansen, Maj; Hyland, Philip; Armour, Cherie; Shevlin, Mark; Elklit, Ask

    2015-01-01

    Background In the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the symptom profile of posttraumatic stress disorder (PTSD) was expanded to include 20 symptoms. An alternative model of PTSD is outlined in the proposed 11th edition of the International Classification of Diseases (ICD-11) that includes just six symptoms. Objectives and method The objectives of the current study are: 1) to independently investigate the fit of the ICD-11 model of PTSD, and three DSM-5-based models of PTSD, across seven different trauma samples (N=3,746) using confirmatory factor analysis; 2) to assess the concurrent validity of the ICD-11 model of PTSD; and 3) to determine if there are significant differences in diagnostic rates between the ICD-11 guidelines and the DSM-5 criteria. Results The ICD-11 model of PTSD was found to provide excellent model fit in six of the seven trauma samples, and tests of factorial invariance showed that the model performs equally well for males and females. DSM-5 models provided poor fit of the data. Concurrent validity was established as the ICD-11 PTSD factors were all moderately to strongly correlated with scores of depression, anxiety, dissociation, and aggression. Levels of association were similar for ICD-11 and DSM-5 suggesting that explanatory power is not affected due to the limited number of items included in the ICD-11 model. Diagnostic rates were significantly lower according to ICD-11 guidelines compared to the DSM-5 criteria. Conclusions The proposed factor structure of the ICD-11 model of PTSD appears valid across multiple trauma types, possesses good concurrent validity, and is more stringent in terms of diagnosis compared to the DSM-5 criteria. PMID:26450830

  18. Less is more? Assessing the validity of the ICD-11 model of PTSD across multiple trauma samples.

    PubMed

    Hansen, Maj; Hyland, Philip; Armour, Cherie; Shevlin, Mark; Elklit, Ask

    2015-01-01

    In the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the symptom profile of posttraumatic stress disorder (PTSD) was expanded to include 20 symptoms. An alternative model of PTSD is outlined in the proposed 11th edition of the International Classification of Diseases (ICD-11) that includes just six symptoms. The objectives of the current study are: 1) to independently investigate the fit of the ICD-11 model of PTSD, and three DSM-5-based models of PTSD, across seven different trauma samples (N=3,746) using confirmatory factor analysis; 2) to assess the concurrent validity of the ICD-11 model of PTSD; and 3) to determine if there are significant differences in diagnostic rates between the ICD-11 guidelines and the DSM-5 criteria. The ICD-11 model of PTSD was found to provide excellent model fit in six of the seven trauma samples, and tests of factorial invariance showed that the model performs equally well for males and females. DSM-5 models provided poor fit of the data. Concurrent validity was established as the ICD-11 PTSD factors were all moderately to strongly correlated with scores of depression, anxiety, dissociation, and aggression. Levels of association were similar for ICD-11 and DSM-5 suggesting that explanatory power is not affected due to the limited number of items included in the ICD-11 model. Diagnostic rates were significantly lower according to ICD-11 guidelines compared to the DSM-5 criteria. The proposed factor structure of the ICD-11 model of PTSD appears valid across multiple trauma types, possesses good concurrent validity, and is more stringent in terms of diagnosis compared to the DSM-5 criteria.

  19. Multi-Layer Identification of Highly-Potent ABCA1 Up-Regulators Targeting LXRβ Using Multiple QSAR Modeling, Structural Similarity Analysis, and Molecular Docking.

    PubMed

    Chen, Meimei; Yang, Fafu; Kang, Jie; Yang, Xuemei; Lai, Xinmei; Gao, Yuxing

    2016-11-29

    In this study, in silico approaches, including multiple QSAR modeling, structural similarity analysis, and molecular docking, were applied to develop QSAR classification models as a fast screening tool for identifying highly-potent ABCA1 up-regulators targeting LXRβ based on a series of new flavonoids. Initially, four modeling approaches, including linear discriminant analysis, support vector machine, radial basis function neural network, and classification and regression trees, were applied to construct different QSAR classification models. The statistical results indicated that these four kinds of QSAR models were powerful tools for screening highly potent ABCA1 up-regulators. Then, a consensus QSAR model was developed by combining the predictions from these four models. To discover new ABCA1 up-regulators at maximum accuracy, the compounds in the ZINC database that fulfilled the requirement of structural similarity of 0.7 compared to a known potent ABCA1 up-regulator were subjected to the consensus QSAR model, which led to the discovery of 50 compounds. Finally, they were docked into the LXRβ binding site to understand their role in up-regulating ABCA1 expression. The excellent binding modes and docking scores of 10 hit compounds suggested they were highly-potent ABCA1 up-regulators targeting LXRβ. Overall, this study provided an effective strategy to discover highly potent ABCA1 up-regulators.
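
    A minimal sketch of combining several classifiers into a consensus model by majority vote, in the spirit of the four-model consensus described above; the RBF neural network is approximated here by an RBF-kernel SVM, and the synthetic data and hyperparameters are assumptions of this sketch.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import VotingClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a descriptor matrix with active/inactive labels.
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# Consensus by hard (majority) voting over four base classifiers.
consensus = VotingClassifier(
    estimators=[('lda', LinearDiscriminantAnalysis()),
                ('svm', SVC(kernel='linear')),
                ('rbf', SVC(kernel='rbf')),          # proxy for an RBF network
                ('cart', DecisionTreeClassifier(max_depth=5))],
    voting='hard').fit(X, y)

print(consensus.predict(X[:5]))
```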

  20. Coastal Evolution Modeling at Multiple Scales in Regional Sediment Management Applications

    DTIC Science & Technology

    2011-05-01

    run-up height (including setup), ∆h is the surge level (including tide elevation relative to mean sea level (MSL)), and zD is the dune toe elevation ... interactive shoreline, dune, and inlet evolution, on the scale of hundreds of years, a regional and long-term perspective. The regional model ... side by subscript r. Dune Erosion: As waves run up on the beach and reach the foot of the dune, the dune will be subject to erosion. If it is assumed

  1. Effects of land cover, topography, and built structure on seasonal water quality at multiple spatial scales.

    PubMed

    Pratt, Bethany; Chang, Heejun

    2012-03-30

    The relationship among land cover, topography, built structure and stream water quality in the Portland Metro region of Oregon and Clark County, Washington areas, USA, is analyzed using ordinary least squares (OLS) and geographically weighted regression (GWR) multiple regression models. Two scales of analysis, a sectional watershed and a buffer, offered a local and a global investigation of the sources of stream pollutants. Model accuracy, measured by R^2 values, fluctuated according to the scale, season, and regression method used. While most wet season water quality parameters are associated with urban land covers, most dry season water quality parameters are related to topographic features such as elevation and slope. GWR models, which take into consideration local relations of spatial autocorrelation, had stronger results than OLS regression models. In the multiple regression models, sectioned watershed results were consistently better than the sectioned buffer results, except for dry season pH and stream temperature parameters. This suggests that while riparian land cover does have an effect on water quality, a wider contributing area needs to be included in order to account for distant sources of pollutants. Copyright © 2012 Elsevier B.V. All rights reserved.
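
    To illustrate how GWR differs from OLS, the sketch below fits a separate weighted least-squares regression at each sample location using Gaussian kernel weights on distance. The kernel form, fixed bandwidth, and array layout are assumptions of this sketch, not the calibration used in the study.

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Bare-bones geographically weighted regression.
    coords: (n, 2) locations; X: (n, p) predictors; y: (n,) response."""
    X1 = np.column_stack([np.ones(len(y)), X])      # add an intercept column
    betas = []
    for c in coords:
        d = np.linalg.norm(coords - c, axis=1)      # distances to this location
        w = np.exp(-0.5 * (d / bandwidth) ** 2)     # Gaussian kernel weights
        W = np.diag(w)
        beta = np.linalg.solve(X1.T @ W @ X1, X1.T @ W @ y)
        betas.append(beta)
    return np.array(betas)                          # one coefficient row per location
```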

  2. Recovering hidden diagonal structures via non-negative matrix factorization with multiple constraints.

    PubMed

    Yang, Xi; Han, Guoqiang; Cai, Hongmin; Song, Yan

    2017-03-31

    Revealing data with intrinsically diagonal block structures is particularly useful for analyzing groups of highly correlated variables. Earlier research based on non-negative matrix factorization (NMF) has been shown to be effective in representing such data by decomposing the observed data into two factors, where one factor is considered to be the feature and the other the expansion loading from a linear algebra perspective. If the data are sampled from multiple independent subspaces, the loading factor would possess a diagonal structure under an ideal matrix decomposition. However, the standard NMF method and its variants have not been reported to exploit this type of data via direct estimation. To address this issue, a non-negative matrix factorization with multiple constraints model is proposed in this paper. The constraints include a sparsity norm on the feature matrix and a total variational norm on each column of the loading matrix. The proposed model is shown to be capable of efficiently recovering diagonal block structures hidden in observed samples. An efficient numerical algorithm based on the alternating direction method of multipliers is proposed for optimizing the new model. Compared with several benchmark models, the proposed method performs robustly and effectively for simulated and real biological data.
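
    A hand-rolled sketch of NMF with an L1 sparsity penalty on one factor via multiplicative updates, to give a flavor of the constrained factorization described above; the total-variation penalty on the loading columns and the ADMM solver from the paper are omitted, and the penalty placement and update scheme are assumptions of this sketch.

```python
import numpy as np

def sparse_nmf(X, k, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative-update NMF, X ~= W @ H, with an L1 penalty on H."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W, H = rng.random((n, k)), rng.random((k, m))
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + lam + eps)   # sparsity-penalised update
        W *= (X @ H.T) / (W @ H @ H.T + eps)
        scale = W.sum(axis=0, keepdims=True) + eps   # resolve scaling ambiguity
        W, H = W / scale, H * scale.T
    return W, H

# Example on a random nonnegative matrix with two components:
X = np.abs(np.random.default_rng(1).normal(size=(40, 30)))
W, H = sparse_nmf(X, k=2)
```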

  3. Modeling colony collapse disorder in honeybees as a contagion.

    PubMed

    Kribs-Zaleta, Christopher M; Mitchell, Christopher

    2014-12-01

    Honeybee pollination accounts annually for over $14 billion in United States agriculture alone. Within the past decade there has been a mysterious mass die-off of honeybees, an estimated 10 million beehives and sometimes as much as 90% of an apiary. There is still no consensus on what causes this phenomenon, called Colony Collapse Disorder, or CCD. Several mathematical models have studied CCD by only focusing on infection dynamics. We created a model to account for both healthy hive dynamics and hive extinction due to CCD, modeling CCD via a transmissible infection brought to the hive by foragers. The system of three ordinary differential equations accounts for multiple hive population behaviors including Allee effects and colony collapse. Numerical analysis leads to critical hive sizes for multiple scenarios and highlights the role of accelerated forager recruitment in emptying hives during colony collapse.
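
    A toy three-equation hive model in the spirit of the paper, with an Allee-type recruitment term and forager-borne infection; the state variables, functional forms, and parameter values are illustrative assumptions, not the authors' equations.

```python
from scipy.integrate import solve_ivp

def hive(t, state, beta, gamma, r, K, A):
    """Toy hive dynamics: healthy bees H, infected bees I, food stores F."""
    H, I, F = state
    recruit = r * H * (H / (H + A)) * (1 - (H + I) / K)   # Allee-type recruitment
    infect = beta * H * I                                  # contagion brought by foragers
    dH = recruit - infect
    dI = infect - gamma * I
    dF = 0.1 * H - 0.05 * (H + I)                          # crude food balance
    return [dH, dI, dF]

# Integrate 200 days from an initial hive of 15,000 healthy bees and 10 infected.
sol = solve_ivp(hive, (0, 200), [1.5e4, 10, 5e3],
                args=(2e-5, 0.1, 0.3, 4e4, 5e3))
```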

  4. RF verification tasks underway at the Harris Corporation for multiple aperture reflector system

    NASA Technical Reports Server (NTRS)

    Gutwein, T. A.

    1982-01-01

    Mesh effects on gain and patterns and adjacent aperture coupling effects for "pie" and circular apertures are discussed. Wire effects for the Harris model, with Langley scale model results included for assessing D/lambda effects, and wire effects with adjacent aperture coupling were determined. Reflector surface distortion effects (pillows and manufacturing roughness) were studied.

  5. A Multi-Domain Model of Risk Factors for ODD Symptoms in a Community Sample of 4-Year-Olds

    ERIC Educational Resources Information Center

    Lavigne, John V.; Gouze, Karen R.; Hopkins, Joyce; Bryant, Fred B.; LeBailly, Susan A.

    2012-01-01

    Few studies have been designed to assess the pathways by which risk factors are associated with symptoms of psychopathology across multiple domains, including contextual factors, parental depression, parenting, and child characteristics. The present study examines a cross-sectional model of risk factors for symptoms of Oppositional Defiant…

  6. Using Modeling and Rehearsal to Teach Fire Safety to Children with Autism

    ERIC Educational Resources Information Center

    Garcia, David; Dukes, Charles; Brady, Michael P.; Scott, Jack; Wilson, Cynthia L.

    2016-01-01

    We evaluated the efficacy of an instructional procedure to teach young children with autism to evacuate settings and notify an adult during a fire alarm. A multiple baseline design across children showed that an intervention that included modeling, rehearsal, and praise was effective in teaching fire safety skills. Safety skills generalized to…

  7. SLA Negotiation for VO Formation

    NASA Astrophysics Data System (ADS)

    Paurobally, Shamimabi

    Resource management systems are changing from localized resources and services towards virtual organizations (VOs) sharing millions of heterogeneous resources across multiple organizations and domains. The virtual organizations and usage models include a variety of owners and consumers with different usage, access policies, cost models, varying loads, requirements and availability. The stakeholders have private utility functions that must be satisfied and possibly maximized.

  8. Model Drawing Strategy for Fraction Word Problem Solving of Fourth-Grade Students with Learning Disabilities

    ERIC Educational Resources Information Center

    Sharp, Emily; Shih Dennis, Minyi

    2017-01-01

    This study used a multiple probe across participants design to examine the effects of a model drawing strategy (MDS) intervention package on fraction comparing and ordering word problem-solving performance of three Grade 4 students. MDS is a form of cognitive strategy instruction for teaching word problem solving that includes explicit instruction…

  9. Bellows flow-induced vibrations

    NASA Technical Reports Server (NTRS)

    Tygielski, P. J.; Smyly, H. M.; Gerlach, C. R.

    1983-01-01

    The bellows flow excitation mechanism and results of a comprehensive test program are summarized. The analytical model for predicting bellows flow-induced stress is refined. The model includes the effects of an upstream elbow, arbitrary geometry, and multiple plies. A refined computer code for predicting flow-induced stress is described which allows life prediction if a material S-N diagram is available.

  10. Time dependent emission line profiles in the radially streaming particle model of Seyfert galaxy nuclei and quasi-stellar objects

    NASA Technical Reports Server (NTRS)

    Hubbard, R.

    1974-01-01

    The radially-streaming particle model for broad quasar and Seyfert galaxy emission features is modified to include sources of time dependence. The results are suggestive of reported observations of multiple components, variability, and transient features in the wings of Seyfert and quasi-stellar emission lines.

  11. Parallel and Preemptable Dynamically Dimensioned Search Algorithms for Single and Multi-objective Optimization in Water Resources

    NASA Astrophysics Data System (ADS)

    Tolson, B.; Matott, L. S.; Gaffoor, T. A.; Asadzadeh, M.; Shafii, M.; Pomorski, P.; Xu, X.; Jahanpour, M.; Razavi, S.; Haghnegahdar, A.; Craig, J. R.

    2015-12-01

    We introduce asynchronous parallel implementations of the Dynamically Dimensioned Search (DDS) family of algorithms including DDS, discrete DDS, PA-DDS and DDS-AU. These parallel algorithms are unique from most existing parallel optimization algorithms in the water resources field in that parallel DDS is asynchronous and does not require an entire population (set of candidate solutions) to be evaluated before generating and then sending a new candidate solution for evaluation. One key advance in this study is developing the first parallel PA-DDS multi-objective optimization algorithm. The other key advance is enhancing the computational efficiency of solving optimization problems (such as model calibration) by combining a parallel optimization algorithm with the deterministic model pre-emption concept. These two efficiency techniques can only be combined because of the asynchronous nature of parallel DDS. Model pre-emption functions to terminate simulation model runs early, prior to completely simulating the model calibration period for example, when intermediate results indicate the candidate solution is so poor that it will definitely have no influence on the generation of further candidate solutions. The computational savings of deterministic model pre-emption available in serial implementations of population-based algorithms (e.g., PSO) disappear in synchronous parallel implementations of these algorithms. In addition to the key advances above, we implement the algorithms across a range of computation platforms (Windows and Unix-based operating systems from multi-core desktops to a supercomputer system) and package these for future modellers within a model-independent calibration software package called Ostrich as well as MATLAB versions. Results across multiple platforms and multiple case studies (from 4 to 64 processors) demonstrate the vast improvement over serial DDS-based algorithms and highlight the important role model pre-emption plays in the performance of parallel, pre-emptable DDS algorithms. Case studies include single- and multiple-objective optimization problems in water resources model calibration and in many cases linear or near linear speedups are observed.
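
    Two building blocks of the approach, sketched under stated assumptions: a single DDS perturbation step (each decision variable is perturbed with probability 1 - ln(i)/ln(max_iter)), and a pre-emptable objective that abandons a simulation once its running error already exceeds the best value found so far. Bound handling by clipping and the iterable-of-partial-errors interface are simplifications of this sketch, not the Ostrich implementation.

```python
import numpy as np

def dds_candidate(x_best, lo, hi, i, max_iter, r=0.2, rng=None):
    """One DDS perturbation of the current best solution. lo and hi are bound
    arrays shaped like x_best; steps are normal with scale r*(hi - lo)."""
    rng = np.random.default_rng() if rng is None else rng
    p = 1.0 - np.log(max(i, 1)) / np.log(max_iter)    # perturbation probability
    perturb = rng.random(x_best.size) < p
    if not perturb.any():
        perturb[rng.integers(x_best.size)] = True     # always perturb at least one variable
    step = rng.normal(0.0, r * (hi - lo))
    x = x_best.copy()
    x[perturb] += step[perturb]
    return np.clip(x, lo, hi)

def preemptable_objective(partial_errors, best_so_far):
    """Deterministic model pre-emption: accumulate error as the simulation
    progresses and abandon the run once it exceeds the best objective so far."""
    total = 0.0
    for e in partial_errors:
        total += e
        if total > best_so_far:
            return np.inf                             # run pre-empted early
    return total
```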

  12. Analytical model for investigation of interior noise characteristics in aircraft with multiple propellers including synchrophasing

    NASA Technical Reports Server (NTRS)

    Fuller, C. R.

    1986-01-01

    A simplified analytical model of transmission of noise into the interior of propeller-driven aircraft has been developed. The analysis includes directivity and relative phase effects of the propeller noise sources, and leads to a closed form solution for the coupled motion between the interior and exterior fields via the shell (fuselage) vibrational response. Various situations commonly encountered in considering sound transmission into aircraft fuselages are investigated analytically and the results obtained are compared to measurements in real aircraft. In general the model has proved successful in identifying basic mechanisms behind noise transmission phenomena.

  13. Systematic Review of Programs Treating High-Need and High-Cost People With Multiple Chronic Diseases or Disabilities in the United States, 2008–2014

    PubMed Central

    Bleich, Sara N.; Sherrod, Cheryl; Chiang, Anne; Boyd, Cynthia; Wolff, Jennifer; DuGoff, Eva; Salzberg, Claudia; Anderson, Keely; Leff, Bruce

    2015-01-01

    Introduction Finding ways to provide better and less expensive health care for people with multiple chronic conditions or disability is a pressing concern. The purpose of this systematic review was to evaluate different approaches for caring for this high-need and high-cost population. Methods We searched Medline for articles published from May 31, 2008, through June 10, 2014, for relevant studies. Articles were considered eligible for this review if they met the following criteria: included people with multiple chronic conditions (behavioral or mental health) or disabilities (2 or more); addressed 1 or more of clinical outcomes, health care use and spending, or patient satisfaction; and compared results from an intervention group with a comparison group or baseline measurements. We extracted information on program characteristics, participant characteristics, and significant (positive and negative) clinical findings, patient satisfaction, and health care use outcomes. For each outcome, the number of significant and positive results was tabulated. Results Twenty-seven studies were included across 5 models of care. Of the 3 studies reporting patient satisfaction outcomes, 2 reported significant improvements; both were randomized controlled trials (RCTs). Of the 14 studies reporting clinical outcomes, 12 reported improvements (8 were RCTs). Of the 13 studies reporting health care use and spending outcomes, 12 reported significant improvements (2 were RCTs). Two models of care — care and case management and disease management — reported improvements in all 3 outcomes. For care and case management models, most improvements were related to health care use. For the disease management models, most improvements were related to clinical outcomes. Conclusions Care and case management as well as disease management may be promising models of care for people with multiple chronic conditions or disabilities. More research and consistent methods are needed to understand the most appropriate care for these high-need and high-cost patients. PMID:26564013

  14. Systematic Review of Programs Treating High-Need and High-Cost People With Multiple Chronic Diseases or Disabilities in the United States, 2008-2014.

    PubMed

    Bleich, Sara N; Sherrod, Cheryl; Chiang, Anne; Boyd, Cynthia; Wolff, Jennifer; DuGoff, Eva; Chang, Eva; Salzberg, Claudia; Anderson, Keely; Leff, Bruce; Anderson, Gerard

    2015-11-12

    Finding ways to provide better and less expensive health care for people with multiple chronic conditions or disability is a pressing concern. The purpose of this systematic review was to evaluate different approaches for caring for this high-need and high-cost population. We searched Medline for articles published from May 31, 2008, through June 10, 2014, for relevant studies. Articles were considered eligible for this review if they met the following criteria: included people with multiple chronic conditions (behavioral or mental health) or disabilities (2 or more); addressed 1 or more of clinical outcomes, health care use and spending, or patient satisfaction; and compared results from an intervention group with a comparison group or baseline measurements. We extracted information on program characteristics, participant characteristics, and significant (positive and negative) clinical findings, patient satisfaction, and health care use outcomes. For each outcome, the number of significant and positive results was tabulated. Twenty-seven studies were included across 5 models of care. Of the 3 studies reporting patient satisfaction outcomes, 2 reported significant improvements; both were randomized controlled trials (RCTs). Of the 14 studies reporting clinical outcomes, 12 reported improvements (8 were RCTs). Of the 13 studies reporting health care use and spending outcomes, 12 reported significant improvements (2 were RCTs). Two models of care - care and case management and disease management - reported improvements in all 3 outcomes. For care and case management models, most improvements were related to health care use. For the disease management models, most improvements were related to clinical outcomes. Care and case management as well as disease management may be promising models of care for people with multiple chronic conditions or disabilities. More research and consistent methods are needed to understand the most appropriate care for these high-need and high-cost patients.

  15. I. Excluded volume effects in Ising cluster distributions and nuclear multifragmentation. II. Multiple-chance effects in alpha-particle evaporation

    NASA Astrophysics Data System (ADS)

    Breus, Dimitry Eugene

    In Part I, geometric clusters of the Ising model are studied as possible model clusters for nuclear multifragmentation. These clusters may not be considered as non-interacting (ideal gas) due to the excluded volume effect, which predominantly is an artifact of the cluster's finite size. Interaction significantly complicates the use of clusters in the analysis of thermodynamic systems. Stillinger's theory is used as a basis for the analysis, which within the RFL (Reiss, Frisch, Lebowitz) fluid-of-spheres approximation produces a prediction for cluster concentrations well obeyed by geometric clusters of the Ising model. If the thermodynamic condition of phase coexistence is met, these concentrations can be incorporated into a differential equation procedure of moderate complexity to elucidate the liquid-vapor phase diagram of the system with cluster interaction included. The drawback of increased complexity is outweighed by the reward of greater accuracy of the phase diagram, as demonstrated by the Ising model. A novel nuclear-cluster analysis procedure is developed by modifying Fisher's model to contain cluster interaction and employing the differential equation procedure to obtain thermodynamic variables. With this procedure applied to geometric clusters, guidelines are developed to look for the excluded volume effect in nuclear multifragmentation. In Part II, an explanation is offered for the recently observed oscillations in the energy spectra of alpha-particles emitted from hot compound nuclei. Contrary to what was previously expected, the oscillations are assumed to be caused by the multiple-chance nature of alpha-evaporation. In a semi-empirical fashion this assumption is successfully confirmed by a technique of two-spectra decomposition which treats experimental alpha-spectra as having contributions from at least two independent emitters. Building upon the success of the multiple-chance explanation of the oscillations, Moretto's single-chance evaporation theory is augmented to include multiple-chance emission and tested on experimental data to yield positive results.

  16. The Fruit & Vegetable Screener in the 2000 California Health Interview Survey: Validation Results

    Cancer.gov

    In this study, multiple 24-hour recalls in conjunction with a measurement error model were used to assess validity. The screeners used in the EATS included additional foods and reported portion sizes.

  17. Techniques and resources for storm-scale numerical weather prediction

    NASA Technical Reports Server (NTRS)

    Droegemeier, Kelvin; Grell, Georg; Doyle, James; Soong, Su-Tzai; Skamarock, William; Bacon, David; Staniforth, Andrew; Crook, Andrew; Wilhelmson, Robert

    1993-01-01

    The topics discussed include the following: multiscale application of the 5th-generation PSU/NCAR mesoscale model, the coupling of nonhydrostatic atmospheric and hydrostatic ocean models for air-sea interaction studies; a numerical simulation of cloud formation over complex topography; adaptive grid simulations of convection; an unstructured grid, nonhydrostatic meso/cloud scale model; efficient mesoscale modeling for multiple scales using variable resolution; initialization of cloud-scale models with Doppler radar data; and making effective use of future computing architectures, networks, and visualization software.

  18. Modelling the Cost Effectiveness of Disease-Modifying Treatments for Multiple Sclerosis

    PubMed Central

    Thompson, Joel P.; Abdolahi, Amir; Noyes, Katia

    2013-01-01

    Several cost-effectiveness models of disease-modifying treatments (DMTs) for multiple sclerosis (MS) have been developed for different populations and different countries. Vast differences in the approaches and discrepancies in the results give rise to heated discussions and limit the use of these models. Our main objective is to discuss the methodological challenges in modelling the cost effectiveness of treatments for MS. We conducted a review of published models to describe the approaches taken to date, to identify the key parameters that influence the cost effectiveness of DMTs, and to point out major areas of weakness and uncertainty. Thirty-six published models and analyses were identified. The greatest source of uncertainty is the absence of head-to-head randomized clinical trials. Modellers have used various techniques to compensate, including utilizing extension trials. The use of large observational cohorts in recent studies aids in identifying population-based, ‘real-world’ treatment effects. Major drivers of results include the time horizon modelled and DMT acquisition costs. Model endpoints must target either policy makers (using cost-utility analysis) or clinicians (conducting cost-effectiveness analyses). Lastly, the cost effectiveness of DMTs outside North America and Europe is currently unknown, with the lack of country-specific data as the major limiting factor. We suggest that limited data should not preclude analyses, as models may be built and updated in the future as data become available. Disclosure of modelling methods and assumptions could improve the transferability and applicability of models designed to reflect different healthcare systems. PMID:23640103

  19. Understanding and Changing Food Consumption Behavior Among Children: The Comprehensive Child Consumption Patterns Model.

    PubMed

    Jeffries, Jayne K; Noar, Seth M; Thayer, Linden

    2015-01-01

    Current theoretical models attempting to explain diet-related weight status among children center around three individual-level theories. Alone, these theories fail to explain why children are engaging or not engaging in health-promoting eating behaviors. Our Comprehensive Child Consumption Patterns model takes a comprehensive approach and was developed specifically to help explain child food consumption behavior and addresses many of the theoretical gaps found in previous models, including integration of the life course trajectory, key influencers, perceived behavioral control, and self-regulation. The Comprehensive Child Consumption Patterns model highlights multiple levels of the socioecological model to explain child food consumption, illustrating how negative influence at multiple levels can lead to caloric imbalance and contribute to child overweight and obesity. Recognizing the necessity for multi-level and system-based interventions, this model serves as a template for holistic, integrated interventions to improve child eating behavior, ultimately impacting life course health development. © The Author(s) 2015.

  20. Electronic structure of alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ehrenreich, H.; Schwartz, L.M.

    1976-01-01

    The description of electronic properties of binary substitutional alloys within the single particle approximation is reviewed. Emphasis is placed on a didactic exposition of the equilibrium, transport, and magnetic properties of such alloys. Topics covered include: multiple scattering theory; the single band alloy; formal extensions of the theory; the alloy potential; realistic model state densities; the s-d model; and the muffin tin model. 43 figures, 3 tables, 151 references. (GHT)

  1. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical specificity.
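
    A Monte Carlo sketch of the multiple-strain idea: in each iteration the bacterial cells in a serving are allocated to strains by a multinomial draw over an uncertain strain mix, and only the enterotoxin-A-producing strain contributes to exposure. The Poisson cell count, the Dirichlet strain-probability prior, the number of strains, and the choice of which strain produces SEA are all assumptions of this sketch, not the published model.

```python
import numpy as np

rng = np.random.default_rng(1)

n_iter = 10_000
exposure = np.empty(n_iter)
for k in range(n_iter):
    cells = rng.poisson(lam=10_000)              # cells in a serving (assumed)
    strain_probs = rng.dirichlet(np.ones(5))     # uncertain mix of 5 strains (assumed)
    per_strain = rng.multinomial(cells, strain_probs)
    exposure[k] = per_strain[0]                  # only the SEA-producing strain counts

print(np.percentile(exposure, [50, 95]))         # median and 95th-percentile exposure
```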

  2. A High-Rate, Single-Crystal Model for Cyclotrimethylene Trinitramine including Phase Transformations and Plastic Slip

    DOE PAGES

    Addessio, Francis L.; Luscher, Darby Jon; Cawkwell, Marc Jon; ...

    2017-05-14

    A continuum model for the high-rate, thermo-mechanical deformation of single-crystal cyclotrimethylene trinitramine (RDX) is developed. The model includes the effects of anisotropy, large deformations, nonlinear thermo-elasticity, phase transformations, and plastic slip. A multiplicative decomposition of the deformation gradient is used. The volumetric elastic component of the deformation is accounted for through a free-energy based equation of state for the low- (α) and high-pressure (γ) polymorphs of RDX. Crystal plasticity is addressed using a phenomenological thermal activation model. The deformation gradient for the phase transformation is based on an approach that has been applied to martensitic transformations. Simulations were conducted and compared to high-rate, impact loading of oriented RDX single crystals. The simulations considered multiple orientations of the crystal relative to the direction of shock loading and multiple sample thicknesses. Thirteen slip systems, which were inferred from indentation and x-ray topography, were used to model the α-polymorph. It is shown that by increasing the number of slip systems from the previously considered number of six (6) to thirteen (13) in the α-polymorph, better comparisons with data may be obtained. Simulations of impact conditions in the vicinity of the α- to γ-polymorph transformation (3.8 GPa) are considered. Eleven of the simulations, which were at pressures below the transformation value (3.0 GPa), were compared to experimental data. Comparison of the model was also made with available data for one experiment above the transformation pressure (4.4 GPa). Also, simulations are provided for a nominal pressure of 7.5 GPa to demonstrate the effect of the transformation kinetics on the deformation of a high-rate plate impact problem.

  3. A High-Rate, Single-Crystal Model for Cyclotrimethylene Trinitramine including Phase Transformations and Plastic Slip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Addessio, Francis L.; Luscher, Darby Jon; Cawkwell, Marc Jon

    A continuum model for the high-rate, thermo-mechanical deformation of single-crystal cyclotrimethylene trinitramine (RDX) is developed. The model includes the effects of anisotropy, large deformations, nonlinear thermo-elasticity, phase transformations, and plastic slip. A multiplicative decomposition of the deformation gradient is used. The volumetric elastic component of the deformation is accounted for through a free-energy based equation of state for the low- (α) and high-pressure (γ) polymorphs of RDX. Crystal plasticity is addressed using a phenomenological thermal activation model. The deformation gradient for the phase transformation is based on an approach that has been applied to martensitic transformations. Simulations were conducted and compared to high-rate, impact loading of oriented RDX single crystals. The simulations considered multiple orientations of the crystal relative to the direction of shock loading and multiple sample thicknesses. Thirteen slip systems, which were inferred from indentation and x-ray topography, were used to model the α-polymorph. It is shown that by increasing the number of slip systems from the previously considered number of six (6) to thirteen (13) in the α-polymorph, better comparisons with data may be obtained. Simulations of impact conditions in the vicinity of the α- to γ-polymorph transformation (3.8 GPa) are considered. Eleven of the simulations, which were at pressures below the transformation value (3.0 GPa), were compared to experimental data. Comparison of the model was also made with available data for one experiment above the transformation pressure (4.4 GPa). Also, simulations are provided for a nominal pressure of 7.5 GPa to demonstrate the effect of the transformation kinetics on the deformation of a high-rate plate impact problem.

  4. [Stature estimation for Sichuan Han nationality female based on X-ray technology with measurement of lumbar vertebrae].

    PubMed

    Qing, Si-han; Chang, Yun-feng; Dong, Xiao-ai; Li, Yuan; Chen, Xiao-gang; Shu, Yong-kang; Deng, Zhen-hua

    2013-10-01

    To establish mathematical models of stature estimation for Sichuan Han females based on lumbar vertebral measurements from X-ray images, and to provide essential data for forensic anthropology research. The 206 Sichuan Han female subjects were divided into three age groups: group A (all ages, 206 samples), group B (116 samples, 20-45 years old), and group C (90 samples, over 45 years old). The lumbar vertebrae of all samples were examined by CR technology, and the anterior, posterior and central heights of the five centrums (L1-L5) (x1-x15) and the total central height of the lumbar spine (x16) were measured, together with the actual stature of each subject. Linear regression analysis of these parameters was used to establish the mathematical models of stature estimation. Sixty-two additional subjects were tested to verify the accuracy of the models. Hypothesis testing showed that the established regression models were statistically significant (P<0.05). The standard errors of the equations were 2.982-5.004 cm, the correlation coefficients were 0.370-0.779, and the multiple correlation coefficients were 0.533-0.834. Back-substitution tests of the equations with the highest correlation and multiple correlation coefficients in each group showed that the most accurate multiple regression equation, y = 100.33 + 1.489x3 - 0.548x6 + 0.772x9 + 0.058x12 + 0.645x15 (group A), achieved accuracies of 80.6% (±1 SE) and 100% (±2 SE). The established mathematical models in this study could be applied to stature estimation for Sichuan Han females.
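
    The best-performing group A equation quoted above can be applied directly; the short sketch below implements it. The coefficients are taken from the abstract, while the sample centrum-height inputs are hypothetical (unit conventions follow the paper).

```python
# Worked example of the group A multiple regression equation quoted in the
# abstract. Coefficients are from the abstract; the measurements are hypothetical.
def estimate_stature_cm(x3, x6, x9, x12, x15):
    return (100.33 + 1.489 * x3 - 0.548 * x6 + 0.772 * x9
            + 0.058 * x12 + 0.645 * x15)

# The authors report standard errors of roughly 3.0-5.0 cm, with 80.6% of
# verification cases within +/-1 SE and 100% within +/-2 SE for this equation.
y = estimate_stature_cm(x3=26.0, x6=25.5, x9=26.5, x12=27.0, x15=27.5)
print(f"estimated stature: {y:.1f} cm")
```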

  5. Modeling the lake eutrophication stochastic ecosystem and the research of its stability.

    PubMed

    Wang, Bo; Qi, Qianqian

    2018-06-01

    In reality, a lake system is disturbed by both external and internal stochastic factors. By adding additive and multiplicative noise terms to the right-hand side of the model equation, an additive stochastic model and a multiplicative stochastic model are established, respectively, in order to reduce the model errors induced by the absence of some physical processes. For both kinds of stochastic ecosystem, the authors studied the bifurcation characteristics with the FPK equation and the Lyapunov exponent method, based on the Stratonovich-Khasminskii stochastic averaging principle. Results show that, for the additive stochastic model, when the control parameter (i.e., the nutrient loading rate) falls into the interval [0.388644, 0.66003825], the ecosystem is bistable and the additive noise intensity cannot make the bifurcation point drift. In the bistable region, the external stochastic disturbance, which is one of the main triggers of lake eutrophication, may make the ecosystem unstable and induce a transition. When the control parameter falls into the intervals (0, 0.388644) and (0.66003825, 1.0), only a single stable equilibrium state exists and the additive noise intensity cannot change it. For the multiplicative stochastic model, the bifurcation behaviour is more complex and the stability structure of the ecosystem is disrupted by the multiplicative noise. The multiplicative noise also reduces the extent of the bistable region; ultimately, the bistable region vanishes for sufficiently large noise. Moreover, both the nutrient loading rate and the multiplicative noise can induce a regime shift in the ecosystem. On the other hand, for the two kinds of stochastic ecosystem, the authors also discussed the evolution of the ecological variable in detail by using a four-stage Runge-Kutta method of strong order γ = 1.5. The numerical method was found to effectively illustrate regime-shift theory and agreed with the theoretical analysis. These conclusions also confirm the two paths, proposed by Beisner et al. [3], by which the system can move from one stable state to another, which may help in understanding the mechanism of lake eutrophication from the viewpoint of stochastic modelling and mathematical analysis. Copyright © 2018 Elsevier Inc. All rights reserved.
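
    The additive-versus-multiplicative noise contrast described above can be illustrated with a generic bistable nutrient-recycling model. The sketch below is not the authors' system and uses a plain Euler-Maruyama scheme rather than their order-1.5 stochastic Runge-Kutta method; all parameter values are chosen purely for illustration.

```python
# Generic bistable lake-nutrient model dx/dt = a - b*x + r*x^q/(m^q + x^q)
# integrated with Euler-Maruyama under additive or multiplicative noise.
# Illustrative stand-in only, not the authors' equations or integrator.
import numpy as np

def simulate(a, sigma, multiplicative=False, b=1.0, r=1.0, m=1.0, q=8.0,
             x0=0.5, dt=0.02, n_steps=50_000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    x[0] = x0
    for i in range(1, n_steps):
        drift = a - b * x[i - 1] + r * x[i - 1] ** q / (m ** q + x[i - 1] ** q)
        noise = sigma * (x[i - 1] if multiplicative else 1.0)
        x[i] = x[i - 1] + drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        x[i] = max(x[i], 0.0)   # nutrient level cannot go negative
    return x

# With the loading rate a inside the bistable window, the additive run can jump
# between the oligotrophic and eutrophic branches, while a strong multiplicative
# noise shrinks and eventually removes the bistable region, as described above.
low_noise = simulate(a=0.5, sigma=0.05)
high_mult = simulate(a=0.5, sigma=0.3, multiplicative=True)
print(low_noise.mean(), high_mult.mean())
```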

  6. Oscillations and Multiple Equilibria in Microvascular Blood Flow.

    PubMed

    Karst, Nathaniel J; Storey, Brian D; Geddes, John B

    2015-07-01

    We investigate the existence of oscillatory dynamics and multiple steady-state flow rates in a network with a simple topology and in vivo microvascular blood flow constitutive laws. Unlike many previous analytic studies, we employ the most biologically relevant models of the physical properties of whole blood. Through a combination of analytic and numeric techniques, we predict in a series of two-parameter bifurcation diagrams a range of dynamical behaviors, including multiple equilibria flow configurations, simple oscillations in volumetric flow rate, and multiple coexistent limit cycles at physically realizable parameters. We show that complexity in network topology is not necessary for complex behaviors to arise and that nonlinear rheology, in particular the plasma skimming effect, is sufficient to support oscillatory dynamics similar to those observed in vivo.

  7. Digital micromirror device as amplitude diffuser for multiple-plane phase retrieval

    NASA Astrophysics Data System (ADS)

    Abregana, Timothy Joseph T.; Hermosa, Nathaniel P.; Almoro, Percival F.

    2017-06-01

    Previous implementations of the phase diffuser used in the multiple-plane phase retrieval method included a diffuser glass plate with fixed optical properties or a programmable yet expensive spatial light modulator. Here, a model for phase retrieval based on a digital micromirror device as an amplitude diffuser is presented. The technique offers a programmable, convenient and low-cost amplitude diffuser for non-stagnating iterative phase retrieval. The technique is demonstrated in the reconstructions of smooth object wavefronts.

  8. Simulation Platform: a cloud-based online simulation environment.

    PubMed

    Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro

    2011-09-01

    Multi-scale and multi-modal neural modeling requires the seamless handling of multiple neural models described at different levels. Database technology will become more important for these studies, specifically for downloading and handling the neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have solely been designed to archive model files, but the databases should provide a chance for users to validate the models before downloading them. In this paper, we report our on-going project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave, are pre-installed. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Reprint of: Simulation Platform: a cloud-based online simulation environment.

    PubMed

    Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro

    2011-11-01

    Multi-scale and multi-modal neural modeling requires the seamless handling of multiple neural models described at different levels. Database technology will become more important for these studies, specifically for downloading and handling the neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have solely been designed to archive model files, but the databases should provide a chance for users to validate the models before downloading them. In this paper, we report our on-going project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave, are pre-installed. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Drag reduction of a car model by linear genetic programming control

    NASA Astrophysics Data System (ADS)

    Li, Ruiying; Noack, Bernd R.; Cordier, Laurent; Borée, Jacques; Harambat, Fabien

    2017-08-01

    We investigate open- and closed-loop active control for aerodynamic drag reduction of a car model. Turbulent flow around a blunt-edged Ahmed body is examined at Re_H ≈ 3 × 10^5 based on body height. The actuation is performed with pulsed jets at all trailing edges (multiple inputs) combined with a Coanda deflection surface. The flow is monitored with 16 pressure sensors distributed at the rear side (multiple outputs). We apply a recently developed model-free control strategy building on genetic programming in Dracopoulos and Kent (Neural Comput Appl 6:214-228, 1997) and Gautier et al. (J Fluid Mech 770:424-441, 2015). The optimized control laws comprise periodic forcing, multi-frequency forcing and sensor-based feedback, including time-history information feedback, and combinations thereof. A key enabler is linear genetic programming (LGP) as a powerful regression technique for optimizing the multiple-input multiple-output control laws. The proposed LGP control can select the best open- or closed-loop control in an unsupervised manner. Approximately 33% base pressure recovery associated with 22% drag reduction is achieved in all considered classes of control laws. Intriguingly, the feedback actuation emulates periodic high-frequency forcing. In addition, the control automatically identified the only sensor that listens to high-frequency flow components with a good signal-to-noise ratio. Our control strategy is, in principle, applicable to all experiments with multiple actuators and sensors.
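
    To make the regression machinery concrete, the sketch below implements a toy linear genetic programming loop: each individual is a short register program mapping sensor readings to one actuation command, evaluated against a stand-in cost function. The plant surrogate, cost function and all hyperparameters are invented for illustration and bear no relation to the wind-tunnel experiment.

```python
# Toy linear genetic programming (LGP) control-law search. Each individual is a
# list of register instructions mapping sensors s_i to an actuation command b.
import random
import operator

OPS = [operator.add, operator.sub, operator.mul]
N_REG, N_SENS, PROG_LEN, POP, GEN = 4, 3, 8, 30, 20
random.seed(0)

def random_instr():
    return (random.randrange(N_REG), random.choice(OPS),
            random.randrange(N_REG + N_SENS), random.randrange(N_REG + N_SENS))

def run(program, sensors):
    reg = [0.0] * N_REG + list(sensors)
    for dst, op, a, b in program:
        reg[dst] = op(reg[a], reg[b])
    return max(-1.0, min(1.0, reg[0]))          # actuation clipped to [-1, 1]

def cost(program):
    # Toy surrogate: reward actuation that tracks -s0 (a stand-in target law).
    c = 0.0
    for _ in range(50):
        s = [random.uniform(-1, 1) for _ in range(N_SENS)]
        c += (run(program, s) + s[0]) ** 2
    return c

pop = [[random_instr() for _ in range(PROG_LEN)] for _ in range(POP)]
for _ in range(GEN):
    pop.sort(key=cost)
    parents = pop[:POP // 2]
    children = [[random_instr() if random.random() < 0.2 else instr
                 for instr in p] for p in parents]     # simple mutation-only variation
    pop = parents + children
print("best cost:", round(cost(min(pop, key=cost)), 3))
```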

  11. An integrated modelling framework for neural circuits with multiple neuromodulators.

    PubMed

    Joshi, Alok; Youssofzadeh, Vahab; Vemana, Vinith; McGinnity, T M; Prasad, Girijesh; Wong-Lin, KongFatt

    2017-01-01

    Neuromodulators are endogenous neurochemicals that regulate biophysical and biochemical processes, which control brain function and behaviour, and are often the targets of neuropharmacological drugs. Neuromodulator effects are generally complex, partly owing to the involvement of broad innervation, co-release of neuromodulators, complex intra- and extrasynaptic mechanisms, the existence of multiple receptor subtypes and high interconnectivity within the brain. In this work, we propose an efficient yet sufficiently realistic computational neural modelling framework to study some of these complex behaviours. Specifically, we propose a novel dynamical neural circuit model that integrates the effective neuromodulator-induced currents based on various experimental data (e.g. electrophysiology, neuropharmacology and voltammetry). The model can incorporate multiple interacting brain regions, including neuromodulator sources, simulates efficiently and is easily extendable to large-scale brain models, e.g. for neuroimaging purposes. As an example, we model a network of mutually interacting neural populations in the lateral hypothalamus, dorsal raphe nucleus and locus coeruleus, which are major sources of the neuromodulators orexin/hypocretin, serotonin and norepinephrine/noradrenaline, respectively, and which play significant roles in regulating many physiological functions. We demonstrate that such a model can provide predictions of systemic drug effects of the popular antidepressants (e.g. reuptake inhibitors), neuromodulator antagonists or their combinations. Finally, we developed user-friendly graphical user interface software for model simulation and visualization for both fundamental sciences and pharmacological studies. © 2017 The Authors.

  12. An integrated modelling framework for neural circuits with multiple neuromodulators

    PubMed Central

    Vemana, Vinith

    2017-01-01

    Neuromodulators are endogenous neurochemicals that regulate biophysical and biochemical processes, which control brain function and behaviour, and are often the targets of neuropharmacological drugs. Neuromodulator effects are generally complex, partly owing to the involvement of broad innervation, co-release of neuromodulators, complex intra- and extrasynaptic mechanisms, the existence of multiple receptor subtypes and high interconnectivity within the brain. In this work, we propose an efficient yet sufficiently realistic computational neural modelling framework to study some of these complex behaviours. Specifically, we propose a novel dynamical neural circuit model that integrates the effective neuromodulator-induced currents based on various experimental data (e.g. electrophysiology, neuropharmacology and voltammetry). The model can incorporate multiple interacting brain regions, including neuromodulator sources, simulates efficiently and is easily extendable to large-scale brain models, e.g. for neuroimaging purposes. As an example, we model a network of mutually interacting neural populations in the lateral hypothalamus, dorsal raphe nucleus and locus coeruleus, which are major sources of the neuromodulators orexin/hypocretin, serotonin and norepinephrine/noradrenaline, respectively, and which play significant roles in regulating many physiological functions. We demonstrate that such a model can provide predictions of systemic drug effects of the popular antidepressants (e.g. reuptake inhibitors), neuromodulator antagonists or their combinations. Finally, we developed user-friendly graphical user interface software for model simulation and visualization for both fundamental sciences and pharmacological studies. PMID:28100828

  13. On Latent Growth Models for Composites and Their Constituents.

    PubMed

    Hancock, Gregory R; Mao, Xiulin; Kher, Hemant

    2013-09-01

    Over the last decade and a half, latent growth modeling has become an extremely popular and versatile technique for evaluating longitudinal change and its determinants. Most common among the models applied are those for a single measured variable over time. This model has been extended in a variety of ways, most relevant for the current work being the multidomain and the second-order latent growth models. Whereas the former allows for growth function characteristics to be modeled for multiple outcomes simultaneously, with the degree of growth characteristics' relations assessed within the model (e.g., cross-domain slope factor correlations), the latter models growth in latent outcomes, each of which has effect indicators repeated over time. But what if one has an outcome that is believed to be formative relative to its indicator variables rather than latent? In this case, where the outcome is a composite of multiple constituents, modeling change over time is less straightforward. This article provides analytical and applied details for simultaneously modeling growth in composites and their constituent elements, including a real data example using a general computer self-efficacy questionnaire.

  14. Dynamics of multiple infection and within-host competition by the anther-smut pathogen.

    PubMed

    Hood, M E

    2003-07-01

    Infection of one host by multiple pathogen genotypes represents an important area of pathogen ecology and evolution that lacks a broad empirical foundation. Multiple infection of Silene latifolia by Microbotryum violaceum was studied under field and greenhouse conditions using the natural polymorphism for mating-type bias as a marker. Field transmission resulted in frequent multiple infection, and each stem of the host was infected independently. Within-host diversity of infections equaled that of nearby inoculum sources by the end of the growing season. The number of diseased stems per plant was positively correlated with multiple infection and with overwintering mortality. As a result, multiply infected plants were largely purged from the population, and there was lower within-host pathogen diversity in the second season. However, among plants with a given number of diseased stems, multiply infected plants had a lower risk of overwintering mortality. Following simultaneous and sequential inoculation, strong competitive exclusion was demonstrated, and the first infection had a significant advantage. Dynamics of multiple infection initially included components of coinfection models for virulence evolution and then components of superinfection models after systemic colonization. Furthermore, there was evidence for an advantage of genotypes with mating-type bias, which may contribute to maintenance of this polymorphism in natural populations.

  15. Generalized Effective Medium Theory for Particulate Nanocomposite Materials

    PubMed Central

    Siddiqui, Muhammad Usama; Arif, Abul Fazal M.

    2016-01-01

    The thermal conductivity of particulate nanocomposites is strongly dependent on the size, shape, orientation and dispersion uniformity of the inclusions. To correctly estimate the effective thermal conductivity of the nanocomposite, all these factors should be included in the prediction model. In this paper, the formulation of a generalized effective medium theory for the determination of the effective thermal conductivity of particulate nanocomposites with multiple inclusions is presented. The formulated methodology takes into account all the factors mentioned above and can be used to model nanocomposites with multiple inclusions that are randomly oriented or aligned in a particular direction. The effect of inclusion dispersion non-uniformity is modeled using a two-scale approach. The applications of the formulated effective medium theory are demonstrated using previously published experimental and numerical results for several particulate nanocomposites. PMID:28773817
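
    For reference, the simplest limit such an effective medium theory reduces to is the classical Maxwell result for dilute, well-dispersed spherical inclusions; the sketch below implements only that textbook formula, not the paper's generalized formulation with shape, orientation and dispersion-uniformity effects, and the material values are illustrative.

```python
# Classical Maxwell (Maxwell-Garnett type) effective thermal conductivity for
# dilute spherical inclusions. Simplest limit only; material values illustrative.
def maxwell_effective_k(k_matrix, k_particle, volume_fraction):
    """Effective thermal conductivity for well-dispersed spherical inclusions."""
    km, kp, f = k_matrix, k_particle, volume_fraction
    return km * (kp + 2 * km + 2 * f * (kp - km)) / (kp + 2 * km - f * (kp - km))

# Example: alumina-like particles (k ~ 30 W/m-K) in an epoxy-like matrix
# (k ~ 0.2 W/m-K) at 10% volume fraction.
print(maxwell_effective_k(0.2, 30.0, 0.10))
```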

  16. Interfacing a General Purpose Fluid Network Flow Program with the SINDA/G Thermal Analysis Program

    NASA Technical Reports Server (NTRS)

    Schallhorn, Paul; Popok, Daniel

    1999-01-01

    A general purpose, one dimensional fluid flow code is currently being interfaced with the thermal analysis program Systems Improved Numerical Differencing Analyzer/Gaski (SINDA/G). The flow code, Generalized Fluid System Simulation Program (GFSSP), is capable of analyzing steady state and transient flow in a complex network. The flow code is capable of modeling several physical phenomena including compressibility effects, phase changes, body forces (such as gravity and centrifugal) and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development is conducted in multiple phases. This paper describes the first phase of the interface which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling.

  17. Genomics and the making of yeast biodiversity

    USDA-ARS?s Scientific Manuscript database

    Yeasts are unicellular fungi that do not form fruiting bodies. Although the yeast lifestyle has evolved multiple times, most known species belong to the subphylum Saccharomycotina (syn. Hemiascomycota, hereafter yeasts). This diverse group includes the premier eukaryotic model system, Saccharomyces ...

  18. Evaluating Training.

    ERIC Educational Resources Information Center

    Brethower, Karen S.; Rummler, Geary A.

    1979-01-01

    Presents general systems models (ballistic system, guided system, and adaptive system) and an evaluation matrix to help in examining training evaluation alternatives and in deciding what evaluation is appropriate. Includes some guidelines for conducting evaluation studies using four designs (control group, reversal, multiple baseline, and…

  19. Modelling multi-pulse population dynamics from ultrafast spectroscopy.

    PubMed

    van Wilderen, Luuk J G W; Lincoln, Craig N; van Thor, Jasper J

    2011-03-21

    Current advanced laser, optics and electronics technology allows sensitive recording of molecular dynamics, from single resonance to multi-colour and multi-pulse experiments. Extracting the relevant (bio)physical pathways via global analysis of experimental data requires a systematic investigation of connectivity schemes. Here we present a Matlab-based toolbox for this purpose. The toolbox has a graphical user interface which facilitates the application of different reaction models to the data to generate the coupled differential equations. Any time-dependent dataset can be analysed to extract time-independent correlations of the observables by using gradient or direct search methods. Specific capabilities (i.e. chirp and instrument response function) for the analysis of ultrafast pump-probe spectroscopic data are included. The inclusion of an extra pulse that interacts with a transient phase can help to disentangle complex interdependent pathways. The modelling of pathways is therefore extended by new theory (which is included in the toolbox) that describes the finite bleach (orientation) effect of single and multiple intense polarised femtosecond pulses on an ensemble of randomly oriented particles in the presence of population decay. For instance, the generally assumed flat-top multimode beam profile is adapted to a more realistic Gaussian shape, exposing the need for several corrections for accurate anisotropy measurements. In addition, the (selective) excitation (photoselection) and anisotropy of populations that interact with single or multiple intense polarised laser pulses are demonstrated as a function of power density and beam profile. Using example values from real-world experiments, it is calculated to what extent this effectively orients the ensemble of particles. Finally, the implementation includes the interaction with multiple pulses in addition to depth averaging in optically dense samples. In summary, we show that mathematical modelling is essential to model and resolve the details of the physical behaviour of populations in ultrafast spectroscopy, such as pump-probe, pump-dump-probe and pump-repump-probe experiments.
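
    A minimal sketch of the kind of connectivity-scheme kinetics such a toolbox generates is shown below: a sequential excited-state scheme integrated with SciPy, with a second (dump) pulse treated as an instantaneous depletion of the transient state. All rate constants and pulse parameters are hypothetical, and the orientational/photoselection theory described above is not reproduced.

```python
# Sequential A -> B -> ground kinetic scheme with a dump pulse modelled as an
# instantaneous depletion of B. Rate constants and pulse parameters hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

k_AB, k_B0 = 1.0 / 0.5, 1.0 / 20.0     # 1/ps, hypothetical lifetimes
t_dump, dump_fraction = 5.0, 0.4       # dump pulse at 5 ps removing 40% of B

def kinetics(t, y):
    A, B = y
    return [-k_AB * A, k_AB * A - k_B0 * B]

# Integrate up to the dump pulse, apply the instantaneous depletion, continue.
seg1 = solve_ivp(kinetics, (0.0, t_dump), [1.0, 0.0], dense_output=True)
A1, B1 = seg1.y[:, -1]
seg2 = solve_ivp(kinetics, (t_dump, 50.0), [A1, (1 - dump_fraction) * B1],
                 dense_output=True)

t = np.linspace(0, 50, 501)
pop = np.where(t[None, :] < t_dump, seg1.sol(np.clip(t, 0, t_dump)),
               seg2.sol(np.clip(t, t_dump, 50.0)))
print(pop.shape)   # (2 states, 501 time points)
```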

  20. Modelling Multi-Pulse Population Dynamics from Ultrafast Spectroscopy

    PubMed Central

    van Wilderen, Luuk J. G. W.; Lincoln, Craig N.; van Thor, Jasper J.

    2011-01-01

    Current advanced laser, optics and electronics technology allows sensitive recording of molecular dynamics, from single resonance to multi-colour and multi-pulse experiments. Extracting the relevant (bio)physical pathways via global analysis of experimental data requires a systematic investigation of connectivity schemes. Here we present a Matlab-based toolbox for this purpose. The toolbox has a graphical user interface which facilitates the application of different reaction models to the data to generate the coupled differential equations. Any time-dependent dataset can be analysed to extract time-independent correlations of the observables by using gradient or direct search methods. Specific capabilities (i.e. chirp and instrument response function) for the analysis of ultrafast pump-probe spectroscopic data are included. The inclusion of an extra pulse that interacts with a transient phase can help to disentangle complex interdependent pathways. The modelling of pathways is therefore extended by new theory (which is included in the toolbox) that describes the finite bleach (orientation) effect of single and multiple intense polarised femtosecond pulses on an ensemble of randomly oriented particles in the presence of population decay. For instance, the generally assumed flat-top multimode beam profile is adapted to a more realistic Gaussian shape, exposing the need for several corrections for accurate anisotropy measurements. In addition, the (selective) excitation (photoselection) and anisotropy of populations that interact with single or multiple intense polarised laser pulses are demonstrated as a function of power density and beam profile. Using example values from real-world experiments, it is calculated to what extent this effectively orients the ensemble of particles. Finally, the implementation includes the interaction with multiple pulses in addition to depth averaging in optically dense samples. In summary, we show that mathematical modelling is essential to model and resolve the details of the physical behaviour of populations in ultrafast spectroscopy, such as pump-probe, pump-dump-probe and pump-repump-probe experiments. PMID:21445294

  1. Multiple soil nutrient competition between plants, microbes, and mineral surfaces: model development, parameterization, and example applications in several tropical forests

    DOE PAGES

    Zhu, Q.; Riley, W. J.; Tang, J.; ...

    2016-01-18

    Soil is a complex system where biotic (e.g., plant roots, micro-organisms) and abiotic (e.g., mineral surfaces) consumers compete for resources necessary for life (e.g., nitrogen, phosphorus). This competition is ecologically significant, since it regulates the dynamics of soil nutrients and controls aboveground plant productivity. Here we develop, calibrate and test a nutrient competition model that accounts for multiple soil nutrients interacting with multiple biotic and abiotic consumers. As applied here for tropical forests, the Nutrient COMpetition model (N-COM) includes three primary soil nutrients (NH4+, NO3− and POx, representing the sum of PO43−, HPO42− and H2PO4−) and five potential competitors (plant roots, decomposing microbes, nitrifiers, denitrifiers and mineral surfaces). The competition is formulated with a quasi-steady-state chemical equilibrium approximation to account for substrate (multiple substrates share one consumer) and consumer (multiple consumers compete for one substrate) effects. N-COM successfully reproduced observed soil heterotrophic respiration, N2O emissions, free phosphorus, sorbed phosphorus and NH4+ pools at a tropical forest site (Tapajos). The overall model uncertainty was moderately well constrained. Our sensitivity analysis revealed that soil nutrient competition was primarily regulated by consumer–substrate affinity rather than environmental factors such as soil temperature or soil moisture. Our results also imply that under strong nutrient limitation, relative competitiveness depends strongly on the competitor functional traits (affinity and nutrient carrier enzyme abundance). We then applied the N-COM model to analyze field nitrogen and phosphorus perturbation experiments in two tropical forest sites (in Hawaii and Puerto Rico) not used in model development or calibration. Under soil inorganic nitrogen and phosphorus elevated conditions, the model accurately replicated the experimentally observed competition among nutrient consumers. Although we used as many observations as we could obtain, more nutrient addition experiments in tropical systems would greatly benefit model testing and calibration. In summary, the N-COM model provides an ecologically consistent representation of nutrient competition appropriate for land BGC models integrated in Earth System Models.
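
    One common quasi-steady-state formulation of this kind of multi-substrate, multi-consumer competition is the equilibrium chemistry approximation (ECA) flux, sketched below; whether N-COM uses exactly this expression is an assumption here, and the pool sizes, affinities and maximum uptake rates are illustrative placeholders.

```python
# ECA-style competition sketch: every consumer's uptake of every substrate is
# discounted by all competing consumer-substrate pairs. Values illustrative.
import numpy as np

def eca_uptake(S, C, Vmax, K):
    """S: substrate pools (n_s,); C: consumer biomasses (n_c,);
    Vmax, K: (n_s, n_c) arrays of maximum uptake rates and affinities.
    Returns an (n_s, n_c) array of uptake fluxes."""
    S = np.asarray(S, float)
    C = np.asarray(C, float)
    sub_competition = (S[:, None] / K).sum(axis=0)   # per consumer: sum_k S_k / K_kj
    con_competition = (C[None, :] / K).sum(axis=1)   # per substrate: sum_l C_l / K_il
    denom = K * (1.0 + sub_competition[None, :] + con_competition[:, None])
    return Vmax * S[:, None] * C[None, :] / denom

# Three substrates (NH4+, NO3-, POx) and five consumers (roots, decomposers,
# nitrifiers, denitrifiers, mineral surfaces); all numbers are placeholders.
S = [1.0, 0.5, 0.1]
C = [2.0, 5.0, 0.3, 0.3, 10.0]
Vmax = np.full((3, 5), 0.1)
K = np.full((3, 5), 1.0)
print(eca_uptake(S, C, Vmax, K))
```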

  2. Multiple soil nutrient competition between plants, microbes, and mineral surfaces: model development, parameterization, and example applications in several tropical forests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Q.; Riley, W. J.; Tang, J.

    Soil is a complex system where biotic (e.g., plant roots, micro-organisms) and abiotic (e.g., mineral surfaces) consumers compete for resources necessary for life (e.g., nitrogen, phosphorus). This competition is ecologically significant, since it regulates the dynamics of soil nutrients and controls aboveground plant productivity. Here we develop, calibrate and test a nutrient competition model that accounts for multiple soil nutrients interacting with multiple biotic and abiotic consumers. As applied here for tropical forests, the Nutrient COMpetition model (N-COM) includes three primary soil nutrients (NH4+, NO3− and POx, representing the sum of PO43−, HPO42− and H2PO4−) and five potential competitors (plant roots, decomposing microbes, nitrifiers, denitrifiers and mineral surfaces). The competition is formulated with a quasi-steady-state chemical equilibrium approximation to account for substrate (multiple substrates share one consumer) and consumer (multiple consumers compete for one substrate) effects. N-COM successfully reproduced observed soil heterotrophic respiration, N2O emissions, free phosphorus, sorbed phosphorus and NH4+ pools at a tropical forest site (Tapajos). The overall model uncertainty was moderately well constrained. Our sensitivity analysis revealed that soil nutrient competition was primarily regulated by consumer–substrate affinity rather than environmental factors such as soil temperature or soil moisture. Our results also imply that under strong nutrient limitation, relative competitiveness depends strongly on the competitor functional traits (affinity and nutrient carrier enzyme abundance). We then applied the N-COM model to analyze field nitrogen and phosphorus perturbation experiments in two tropical forest sites (in Hawaii and Puerto Rico) not used in model development or calibration. Under soil inorganic nitrogen and phosphorus elevated conditions, the model accurately replicated the experimentally observed competition among nutrient consumers. Although we used as many observations as we could obtain, more nutrient addition experiments in tropical systems would greatly benefit model testing and calibration. In summary, the N-COM model provides an ecologically consistent representation of nutrient competition appropriate for land BGC models integrated in Earth System Models.

  3. Multiple soil nutrient competition between plants, microbes, and mineral surfaces: model development, parameterization, and example applications in several tropical forests

    NASA Astrophysics Data System (ADS)

    Zhu, Q.; Riley, W. J.; Tang, J.; Koven, C. D.

    2016-01-01

    Soil is a complex system where biotic (e.g., plant roots, micro-organisms) and abiotic (e.g., mineral surfaces) consumers compete for resources necessary for life (e.g., nitrogen, phosphorus). This competition is ecologically significant, since it regulates the dynamics of soil nutrients and controls aboveground plant productivity. Here we develop, calibrate and test a nutrient competition model that accounts for multiple soil nutrients interacting with multiple biotic and abiotic consumers. As applied here for tropical forests, the Nutrient COMpetition model (N-COM) includes three primary soil nutrients (NH4+, NO3- and POx; representing the sum of PO43-, HPO42- and H2PO4-) and five potential competitors (plant roots, decomposing microbes, nitrifiers, denitrifiers and mineral surfaces). The competition is formulated with a quasi-steady-state chemical equilibrium approximation to account for substrate (multiple substrates share one consumer) and consumer (multiple consumers compete for one substrate) effects. N-COM successfully reproduced observed soil heterotrophic respiration, N2O emissions, free phosphorus, sorbed phosphorus and NH4+ pools at a tropical forest site (Tapajos). The overall model uncertainty was moderately well constrained. Our sensitivity analysis revealed that soil nutrient competition was primarily regulated by consumer-substrate affinity rather than environmental factors such as soil temperature or soil moisture. Our results also imply that under strong nutrient limitation, relative competitiveness depends strongly on the competitor functional traits (affinity and nutrient carrier enzyme abundance). We then applied the N-COM model to analyze field nitrogen and phosphorus perturbation experiments in two tropical forest sites (in Hawaii and Puerto Rico) not used in model development or calibration. Under soil inorganic nitrogen and phosphorus elevated conditions, the model accurately replicated the experimentally observed competition among nutrient consumers. Although we used as many observations as we could obtain, more nutrient addition experiments in tropical systems would greatly benefit model testing and calibration. In summary, the N-COM model provides an ecologically consistent representation of nutrient competition appropriate for land BGC models integrated in Earth System Models.

  4. Health Consequences of Racist and Antigay Discrimination for Multiple Minority Adolescents

    PubMed Central

    Thoma, Brian C.; Huebner, David M.

    2014-01-01

    Individuals who belong to a marginalized group and who perceive discrimination based on that group membership suffer from a variety of poor health outcomes. Many people belong to more than one marginalized group, and much less is known about the influence of multiple forms of discrimination on health outcomes. Drawing on literature describing the influence of multiple stressors, three models of combined forms of discrimination are discussed: additive, prominence, and exacerbation. The current study examined the influence of multiple forms of discrimination in a sample of African American lesbian, gay, or bisexual (LGB) adolescents ages 14–19. Each of the three models of combined stressors were tested to determine which best describes how racist and antigay discrimination combine to predict depressive symptoms, suicidal ideation, and substance use. Participants were included in this analysis if they identified their ethnicity as either African American (n = 156) or African American mixed (n = 120). Mean age was 17.45 years (SD = 1.36). Results revealed both forms of mistreatment were associated with depressive symptoms and suicidal ideation among African American LGB adolescents. Racism was more strongly associated with substance use. Future intervention efforts should be targeted toward reducing discrimination and improving the social context of multiple minority adolescents, and future research with multiple minority individuals should be attuned to the multiple forms of discrimination experienced by these individuals within their environments. PMID:23731232

  5. Prediction of hearing outcomes by multiple regression analysis in patients with idiopathic sudden sensorineural hearing loss.

    PubMed

    Suzuki, Hideaki; Tabata, Takahisa; Koizumi, Hiroki; Hohchi, Nobusuke; Takeuchi, Shoko; Kitamura, Takuro; Fujino, Yoshihisa; Ohbuchi, Toyoaki

    2014-12-01

    This study aimed to create a multiple regression model for predicting hearing outcomes of idiopathic sudden sensorineural hearing loss (ISSNHL). The participants were 205 consecutive patients (205 ears) with ISSNHL (hearing level ≥ 40 dB, interval between onset and treatment ≤ 30 days). They received systemic steroid administration combined with intratympanic steroid injection. Data were examined by simple and multiple regression analyses. Three hearing indices (percentage hearing improvement, hearing gain, and posttreatment hearing level [HLpost]) and 7 prognostic factors (age, days from onset to treatment, initial hearing level, initial hearing level at low frequencies, initial hearing level at high frequencies, presence of vertigo, and contralateral hearing level) were included in the multiple regression analysis as dependent and explanatory variables, respectively. In the simple regression analysis, the percentage hearing improvement, hearing gain, and HLpost showed significant correlation with 2, 5, and 6 of the 7 prognostic factors, respectively. The multiple correlation coefficients were 0.396, 0.503, and 0.714 for the percentage hearing improvement, hearing gain, and HLpost, respectively. Predicted values of HLpost calculated by the multiple regression equation were reliable with 70% probability with a 40-dB-width prediction interval. Prediction of HLpost by the multiple regression model may be useful to estimate the hearing prognosis of ISSNHL. © The Author(s) 2014.
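
    A generic sketch of the prediction step described above is given below: fit an ordinary least squares model of HLpost on a few prognostic factors and report an approximately 70% prediction interval for a new patient using statsmodels. The simulated data, chosen covariates and coefficients are placeholders, not the study's 205 cases or its fitted equation.

```python
# Generic multiple-regression sketch: predict post-treatment hearing level
# (HLpost) with a ~70% prediction interval. Simulated placeholder data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 205
age = rng.uniform(20, 80, n)
days = rng.uniform(1, 30, n)
initial_hl = rng.uniform(40, 110, n)
# Hypothetical data-generating process, for illustration only.
hl_post = 5 + 0.1 * age + 0.4 * days + 0.5 * initial_hl + rng.normal(0, 15, n)

X = sm.add_constant(np.column_stack([age, days, initial_hl]))
fit = sm.OLS(hl_post, X).fit()

new_patient = np.array([[1.0, 55.0, 7.0, 80.0]])                   # const, age, days, initial HL
pred = fit.get_prediction(new_patient).summary_frame(alpha=0.30)   # ~70% prediction interval
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])
```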

  6. Construction of multiple linear regression models using blood biomarkers for selecting against abdominal fat traits in broilers.

    PubMed

    Dong, J Q; Zhang, X Y; Wang, S Z; Jiang, X F; Zhang, K; Ma, G W; Wu, M Q; Li, H; Zhang, H

    2018-01-01

    Plasma very low-density lipoprotein (VLDL) can be used to select for low body fat or abdominal fat (AF) in broilers, but its correlation with AF is limited. We investigated whether any other biochemical indicator can be used in combination with VLDL for a better selective effect. Nineteen plasma biochemical indicators were measured in male chickens from the Northeast Agricultural University broiler lines divergently selected for AF content (NEAUHLF) in the fed state at 46 and 48 d of age. The average concentration of every parameter for the 2 d was used for statistical analysis. Levels of these 19 plasma biochemical parameters were compared between the lean and fat lines. The phenotypic correlations between these plasma biochemical indicators and AF traits were analyzed. Then, multiple linear regression models were constructed to select the best model for selecting against AF content, and the heritabilities of the plasma indicators contained in the best models were estimated. The results showed that 11 plasma biochemical indicators (triglycerides, total bile acid, total protein, globulin, albumin/globulin, aspartate transaminase, alanine transaminase, gamma-glutamyl transpeptidase, uric acid, creatinine, and VLDL) differed significantly between the lean and fat lines (P < 0.01), and correlated significantly with AF traits (P < 0.05). The best multiple linear regression models, based on albumin/globulin, VLDL, triglycerides, globulin, total bile acid, and uric acid, had a higher R2 (0.73) than the model based only on VLDL (0.21). The plasma parameters included in the best models had moderate heritability estimates (0.21 ≤ h2 ≤ 0.43). These results indicate that these multiple linear regression models can be used to select for lean broiler chickens. © 2017 Poultry Science Association Inc.

  7. Tumor growth affects the metabonomic phenotypes of multiple mouse non-involved organs in an A549 lung cancer xenograft model.

    PubMed

    Xu, Shan; Tian, Yuan; Hu, Yili; Zhang, Nijia; Hu, Sheng; Song, Dandan; Wu, Zhengshun; Wang, Yulan; Cui, Yanfang; Tang, Huiru

    2016-06-22

    The effects of tumorigenesis and tumor growth on non-involved organs remain poorly understood, although many research efforts have already been made to understand the metabolic phenotypes of various tumors. To improve this situation, we systematically analyzed the metabolic phenotypes of multiple non-involved mouse organ tissues (heart, liver, spleen, lung and kidney) in an A549 lung cancer xenograft model at two different tumor-growth stages using NMR-based metabonomics approaches. We found that tumor growth caused significant metabonomic changes in multiple non-involved organ tissues involving numerous metabolic pathways, including glycolysis, the TCA cycle and the metabolism of amino acids, fatty acids, choline and nucleic acids. Amongst these, the common effects are enhanced glycolysis and nucleoside/nucleotide metabolism. These findings provide essential biochemical information about the effects of tumor growth on non-involved organs.

  8. Multiple role adaptation among women who have children and re-enter nursing school in Taiwan.

    PubMed

    Lin, Li-Ling

    2005-03-01

    This study assessed multiple role adaptation within maternal and student roles among female RNs who had children and returned to school for baccalaureate degrees in Taiwan. Using Roy's Adaptation Model as the theoretical framework, relationships were explored among demographic (number of children, age of youngest child, employment status), physical (sleep quality, health perception, activity), and psychosocial factors (self-identity, role expectation, role involvement, social support) and multiple role adaptation (role accumulation). The sample included 118 mother-students who had at least one child younger than age 18 and who were studying in nursing programs in Taiwan. The highest correlation was found between activity and role accumulation followed by significant correlations between sleep quality, health perception, maternal role expectation, and age of youngest child and role accumulation. In regression analyses, the complete model explained 46% of the variance in role accumulation. Implications for education and future research are identified.

  9. Search for supersymmetry in the multijet and missing transverse momentum channel in pp collisions at 13 TeV: Z + jets background

    NASA Astrophysics Data System (ADS)

    Mulholland, Troy; CMS Collaboration

    2016-03-01

    We present a search for supersymmetry (SUSY) with data collected from the Compact Muon Solenoid (CMS) detector. The sample corresponds to 2.3 fb^-1 of proton-proton collisions with √s = 13 TeV delivered by the Large Hadron Collider (LHC). The search looks at events with large hadronic activity, missing transverse energy, and without any identified leptons. The data are analyzed in bins of jet multiplicity, bottom-quark tagged jet (b-jet) multiplicity, scalar sum of jet transverse momentum, and vector sum of jet transverse momentum. A standard model (SM) background to this search includes the SM production of multiple jets and a Z boson that decays to two undetectable neutrinos. This talk focuses on the measurement of this particular background and its context in the wider search. Observations are consistent with SM backgrounds and limits are set on gluino mediated simplified SUSY models.

  10. Multiple incidence angle SIR-B experiment over Argentina

    NASA Technical Reports Server (NTRS)

    Cimino, Jobea; Casey, Daren; Wall, Stephen; Brandani, Aldo; Domik, Gitta; Leberl, Franz

    1986-01-01

    The Shuttle Imaging Radar (SIR-B), the second synthetic aperture radar (SAR) to fly aboard a shuttle, was launched on October 5, 1984. One of the primary goals of the SIR-B experiment was to use multiple incidence angle radar images to distinguish different terrain types through the use of their characteristic backscatter curves. This goal was accomplished in several locations including the Chubut Province of southern Argentina. Four descending image acquisitions were collected providing a multiple incidence angle image set. The data were first used to assess stereo-radargrammetric techniques. A digital elevation model was produced using the optimum pair of multiple incidence angle images. This model was then used to determine the local incidence angle of each picture element to generate curves of relative brightness vs. incidence angle. Secondary image products were also generated using the multi-angle data. The results of this work indicate that: (1) various forest species and various structures of a single species may be discriminated using multiple incidence angle radar imagery, and (2) it is essential to consider the variation in backscatter due to a variable incidence angle when analyzing and comparing data collected at varying frequencies and polarizations.

  11. Secular dynamics of hierarchical multiple systems composed of nested binaries, with an arbitrary number of bodies and arbitrary hierarchical structure - II. External perturbations: flybys and supernovae

    NASA Astrophysics Data System (ADS)

    Hamers, Adrian S.

    2018-05-01

    We extend the formalism of a previous paper to include the effects of flybys and instantaneous perturbations such as supernovae on the long-term secular evolution of hierarchical multiple systems with an arbitrary number of bodies and hierarchy, provided that the system is composed of nested binary orbits. To model secular encounters, we expand the Hamiltonian in terms of the ratio of the separation of the perturber with respect to the barycentre of the multiple system, to the separation of the widest orbit. Subsequently, we integrate over the perturber orbit numerically or analytically. We verify our method for secular encounters and illustrate it with an example. Furthermore, we describe a method to compute instantaneous orbital changes to multiple systems, such as asymmetric supernovae and impulsive encounters. The secular code, with implementation of the extensions described in this paper, is publicly available within AMUSE, and we provide a number of simple example scripts to illustrate its usage for secular and impulsive encounters and asymmetric supernovae. The extensions presented in this paper are a next step towards efficiently modelling the evolution of complex multiple systems embedded in star clusters.

  12. Large-scale monitoring of shorebird populations using count data and N-mixture models: Black Oystercatcher (Haematopus bachmani) surveys by land and sea

    USGS Publications Warehouse

    Lyons, James E.; Royle, J. Andrew; Thomas, Susan M.; Elliott-Smith, Elise; Evenson, Joseph R.; Kelly, Elizabeth G.; Milner, Ruth L.; Nysewander, David R.; Andres, Brad A.

    2012-01-01

    Large-scale monitoring of bird populations is often based on count data collected across spatial scales that may include multiple physiographic regions and habitat types. Monitoring at large spatial scales may require multiple survey platforms (e.g., from boats and land when monitoring coastal species) and multiple survey methods. It becomes especially important to explicitly account for detection probability when analyzing count data that have been collected using multiple survey platforms or methods. We evaluated a new analytical framework, N-mixture models, to estimate actual abundance while accounting for multiple detection biases. During May 2006, we made repeated counts of Black Oystercatchers (Haematopus bachmani) from boats in the Puget Sound area of Washington (n = 55 sites) and from land along the coast of Oregon (n = 56 sites). We used a Bayesian analysis of N-mixture models to (1) assess detection probability as a function of environmental and survey covariates and (2) estimate total Black Oystercatcher abundance during the breeding season in the two regions. Probability of detecting individuals during boat-based surveys was 0.75 (95% credible interval: 0.42–0.91) and was not influenced by tidal stage. Detection probability from surveys conducted on foot was 0.68 (0.39–0.90); the latter was not influenced by fog, wind, or number of observers but was ~35% lower during rain. The estimated population size was 321 birds (262–511) in Washington and 311 (276–382) in Oregon. N-mixture models provide a flexible framework for modeling count data and covariates in large-scale bird monitoring programs designed to understand population change.
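
    The core likelihood behind such an analysis, the binomial-Poisson N-mixture, is compact enough to sketch for a single site; the covariate and Bayesian structure of the actual study is omitted, and the counts, abundance rate and detection probability below are illustrative.

```python
# Binomial-Poisson N-mixture likelihood for one site with repeated counts y_t:
#   P(y | lambda, p) = sum_N Poisson(N; lambda) * prod_t Binomial(y_t; N, p)
# Covariate/Bayesian structure of the real analysis omitted; values illustrative.
import numpy as np
from scipy.stats import poisson, binom

def nmixture_loglik(counts, lam, p, n_max=200):
    counts = np.asarray(counts)
    N = np.arange(counts.max(), n_max + 1)            # latent abundance support
    prior = poisson.pmf(N, lam)                       # P(N | lambda)
    lik = np.prod(binom.pmf(counts[:, None], N[None, :], p), axis=0)
    return np.log(np.sum(prior * lik))

# Repeated counts at one site, with detection probability ~0.7 as reported above.
print(nmixture_loglik(counts=[4, 6, 5], lam=7.0, p=0.7))
```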

  13. Features in visual search combine linearly

    PubMed Central

    Pramod, R. T.; Arun, S. P.

    2014-01-01

    Single features such as line orientation and length are known to guide visual search, but relatively little is known about how multiple features combine in search. To address this question, we investigated how search for targets differing in multiple features (intensity, length, orientation) from the distracters is related to searches for targets differing in each of the individual features. We tested race models (based on reaction times) and co-activation models (based on reciprocal of reaction times) for their ability to predict multiple feature searches. Multiple feature searches were best accounted for by a co-activation model in which feature information combined linearly (r = 0.95). This result agrees with the classic finding that these features are separable i.e., subjective dissimilarity ratings sum linearly. We then replicated the classical finding that the length and width of a rectangle are integral features—in other words, they combine nonlinearly in visual search. However, to our surprise, upon including aspect ratio as an additional feature, length and width combined linearly and this model outperformed all other models. Thus, length and width of a rectangle became separable when considered together with aspect ratio. This finding predicts that searches involving shapes with identical aspect ratio should be more difficult than searches where shapes differ in aspect ratio. We confirmed this prediction on a variety of shapes. We conclude that features in visual search co-activate linearly and demonstrate for the first time that aspect ratio is a novel feature that guides visual search. PMID:24715328
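
    The winning co-activation rule, in which reciprocal reaction times for multi-feature searches combine linearly from the single-feature searches, can be sketched as a small least-squares fit; the reaction-time values below are invented purely to show the mechanics of the fit.

```python
# Co-activation sketch: model 1/RT of multi-feature searches as a linear
# combination of 1/RT for the single-feature searches. Values are invented.
import numpy as np

# columns: 1/RT (1/s) for intensity-only, length-only, orientation-only searches
single = np.array([[1.2, 0.8, 1.0],
                   [1.5, 0.9, 1.1],
                   [1.1, 1.3, 0.7],
                   [1.4, 1.0, 1.2],
                   [0.9, 1.1, 0.8],
                   [1.3, 0.7, 1.3]])
multi = np.array([2.9, 3.4, 3.0, 3.5, 2.7, 3.2])    # 1/RT for combined-feature searches

X = np.column_stack([single, np.ones(len(multi))])  # linear model with intercept
weights, *_ = np.linalg.lstsq(X, multi, rcond=None)
pred = X @ weights
print("feature weights:", np.round(weights[:-1], 2),
      " r =", round(np.corrcoef(pred, multi)[0, 1], 3))
```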

  14. Application of multiple modelling to hyperthermia estimation: reducing the effects of model mismatch.

    PubMed

    Potocki, J K; Tharp, H S

    1993-01-01

    Multiple model estimation is a viable technique for dealing with the spatial perfusion model mismatch associated with hyperthermia dosimetry. Using multiple models, spatial discrimination can be obtained without increasing the number of unknown perfusion zones. Two multiple model estimators based on the extended Kalman filter (EKF) are designed and compared with two EKFs based on single models having greater perfusion zone segmentation. Results given here indicate that multiple modelling is advantageous when the number of thermal sensors is insufficient for convergence of single model estimators having greater perfusion zone segmentation. In situations where sufficient measured outputs exist for greater unknown perfusion parameter estimation, the multiple model estimators and the single model estimators yield equivalent results.

  15. DIANA: A multi-phase, multi-component hydrodynamic model for the analysis of severe accidents in heavy water reactors with multiple-tube assemblies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tentner, A.M.

    1994-03-01

    A detailed hydrodynamic fuel relocation model has been developed for the analysis of severe accidents in Heavy Water Reactors with multiple-tube Assemblies. This model describes the Fuel Disruption and Relocation inside a nuclear fuel assembly and is designated by the acronym DIANA. DIANA solves the transient hydrodynamic equations for all the moving materials in the core and treats all the relevant flow regimes. The numerical solution techniques and some of the physical models included in DIANA have been developed taking advantage of the extensive experience accumulated in the development and validation of the LEVITATE [1] fuel relocation model of SAS4A [2, 3]. The model is designed to handle the fuel and cladding relocation in both voided and partially voided channels. It is able to treat a wide range of thermal/hydraulic/neutronic conditions and the presence of various flow regimes at different axial locations within the same hydrodynamic channel.

  16. Modeling the long-term effect of winter cover crops on nitrate transport in artificially drained fields across the Midwest U.S.

    USDA-ARS?s Scientific Manuscript database

    A fall-planted cover crop is a management practice with multiple benefits including reducing nitrate losses from artificially drained fields. We used the Root Zone Water Quality Model (RZWQM) to simulate the impact of a cereal rye cover crop on reducing nitrate losses from drained fields across five...

  17. Measuring the Performance and Intelligence of Systems: Proceedings of the 2002 PerMIS Workshop

    NASA Technical Reports Server (NTRS)

    Messina, E. R.; Meystel, A. M.

    2002-01-01

    Contents include the following: Performance Metrics; Performance of Multiple Agents; Performance of Mobility Systems; Performance of Planning Systems; General Discussion Panel 1; Uncertainty of Representation I; Performance of Robots in Hazardous Domains; Modeling Intelligence; Modeling of Mind; Measuring Intelligence; Grouping: A Core Procedure of Intelligence; Uncertainty in Representation II; Towards Universal Planning/Control Systems.

  18. A Meta-Meta-Analysis: Empirical Review of Statistical Power, Type I Error Rates, Effect Sizes, and Model Selection of Meta-Analyses Published in Psychology

    ERIC Educational Resources Information Center

    Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.

    2010-01-01

    This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

  19. Promoting Mental Health and Preventing Substance Abuse and Violence in Elementary Students: A Randomized Control Study of the Michigan Model for Health

    ERIC Educational Resources Information Center

    O'Neill, James M.; Clark, Jeffrey K.; Jones, James A.

    2011-01-01

    Background: In elementary grades, comprehensive health education curricula mostly have demonstrated effectiveness in addressing singular health issues. The Michigan Model for Health (MMH) was implemented and evaluated to determine its impact on multiple health issues, including social and emotional skills, prosocial behavior, and drug use and…

  20. Probabilistic Based Modeling and Simulation Assessment

    DTIC Science & Technology

    2010-06-01

    different crash and blast scenarios. With the integration of the high fidelity neck and head model, a methodology to calculate the probability of injury...variability, correlation, and multiple (often competing) failure metrics. Important scenarios include vehicular collisions, blast/fragment impact, and...first area of focus is to develop a methodology to integrate probabilistic analysis into finite element analysis of vehicle collisions and blast. The

  1. Probabilistic-Based Modeling and Simulation Assessment

    DTIC Science & Technology

    2010-06-01

    developed to determine the relative importance of structural components of the vehicle under different crash and blast scenarios. With the integration of...the vehicle under different crash and blast scenarios. With the integration of the high fidelity neck and head model, a methodology to calculate the...parameter variability, correlation, and multiple (often competing) failure metrics. Important scenarios include vehicular collisions, blast/fragment

  2. Assessing Women's Preferences and Preference Modeling for Breast Reconstruction Decision-Making.

    PubMed

    Sun, Clement S; Cantor, Scott B; Reece, Gregory P; Crosby, Melissa A; Fingeret, Michelle C; Markey, Mia K

    2014-03-01

    Women considering breast reconstruction must make challenging trade-offs amongst issues that often conflict. It may be useful to quantify possible outcomes using a single summary measure to aid a breast cancer patient in choosing a form of breast reconstruction. In this study, we used multiattribute utility theory to combine multiple objectives to yield a summary value using nine different preference models. We elicited the preferences of 36 women, aged 32 or older with no history of breast cancer, for the patient-reported outcome measures of breast satisfaction, psychosocial well-being, chest well-being, abdominal well-being, and sexual well-being as measured by the BREAST-Q in addition to time lost to reconstruction and out-of-pocket cost. Participants ranked hypothetical breast reconstruction outcomes. We examined each multiattribute utility preference model and assessed how often each model agreed with participants' rankings. The median amount of time required to assess preferences was 34 minutes. Agreement among the nine preference models with the participants ranged from 75.9% to 78.9%. None of the preference models performed significantly worse than the best performing risk averse multiplicative model. We hypothesize an average theoretical agreement of 94.6% for this model if participant error is included. There was a statistically significant positive correlation with more unequal distribution of weight given to the seven attributes. We recommend the risk averse multiplicative model for modeling the preferences of patients considering different forms of breast reconstruction because it agreed most often with the participants in this study.

  3. Mapping habitat for multiple species in the Desert Southwest

    USGS Publications Warehouse

    Inman, Richard D.; Nussear, Kenneth E.; Esque, Todd C.; Vandergast, Amy G.; Hathaway, Stacie A.; Wood, Dustin A.; Barr, Kelly R.; Fisher, Robert N.

    2014-01-01

    Many utility scale renewable energy projects are currently proposed across the Mojave Ecoregion. Agencies that manage biological resources throughout this region need to understand the potential impacts of these renewable energy projects and their associated infrastructure (for example, transmission corridors, substations, access roads, etc.) on species movement, genetic exchange among populations, and species’ abilities to adapt to changing environmental conditions. Understanding these factors will help managers select appropriate project sites and possibly mitigate anticipated effects of management activities. We used species distribution models to map habitat for 15 species across the Mojave Ecoregion to aid regional land-use management planning. Models were developed using a common 1 × 1 kilometer resolution with maximum entropy and generalized additive models. Occurrence data were compiled from multiple sources, including VertNet (http://vertnet.org/), HerpNET (http://www.herpnet.org), and MaNIS (http://manisnet.org), as well as from internal U.S. Geological Survey databases and other biologists. Background data included 20 environmental covariates representing terrain, vegetation, and climate covariates. This report summarizes these environmental covariates and species distribution models used to predict habitat for the 15 species across the Mojave Ecoregion.

  4. Performance Analysis and Scaling Behavior of the Terrestrial Systems Modeling Platform TerrSysMP in Large-Scale Supercomputing Environments

    NASA Astrophysics Data System (ADS)

    Kollet, S. J.; Goergen, K.; Gasper, F.; Shresta, P.; Sulis, M.; Rihani, J.; Simmer, C.; Vereecken, H.

    2013-12-01

    In studies of the terrestrial hydrologic, energy and biogeochemical cycles, integrated multi-physics simulation platforms take a central role in characterizing non-linear interactions, variances and uncertainties of system states and fluxes in reciprocity with observations. Recently developed integrated simulation platforms attempt to honor the complexity of the terrestrial system across multiple time and space scales from the deeper subsurface including groundwater dynamics into the atmosphere. Technically, this requires the coupling of atmospheric, land surface, and subsurface-surface flow models in supercomputing environments, while ensuring a high degree of efficiency in the utilization of, e.g., standard Linux clusters and massively parallel resources. A systematic performance analysis including profiling and tracing in such an application is crucial in the understanding of the runtime behavior, to identify optimum model settings, and is an efficient way to distinguish potential parallel deficiencies. On sophisticated leadership-class supercomputers, such as the 28-rack 5.9 petaFLOP IBM Blue Gene/Q 'JUQUEEN' of the Jülich Supercomputing Centre (JSC), this is a challenging task, but all the more important when complex coupled component models are to be analysed. Here we want to present our experience from coupling, application tuning (e.g. 5-times speedup through compiler optimizations), parallel scaling and performance monitoring of the parallel Terrestrial Systems Modeling Platform TerrSysMP. The modeling platform consists of the weather prediction system COSMO of the German Weather Service; the Community Land Model, CLM of NCAR; and the variably saturated surface-subsurface flow code ParFlow. The model system relies on the Multiple Program Multiple Data (MPMD) execution model where the external Ocean-Atmosphere-Sea-Ice-Soil coupler (OASIS3) links the component models. TerrSysMP has been instrumented with the performance analysis tool Scalasca and analyzed on JUQUEEN with processor counts on the order of 10,000. The instrumentation is used in weak and strong scaling studies with real data cases and hypothetical idealized numerical experiments for detailed profiling and tracing analysis. The profiling is not only useful in identifying wait states that are due to the MPMD execution model, but also in fine-tuning resource allocation to the component models in search of the most suitable load balancing. This is especially necessary, as with numerical experiments that cover multiple (high resolution) spatial scales, the time stepping, coupling frequencies, and communication overheads are constantly shifting, which makes it necessary to re-determine the model setup with each new experimental design.

  5. Software engineering the mixed model for genome-wide association studies on large samples.

    PubMed

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
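
    To make the mixed-model association test concrete, the sketch below scores a single marker under the model y = Xb + g + e, with Var(g) proportional to a kinship matrix K, by profiling the likelihood over heritability on a grid. This is a minimal numpy illustration of the general method reviewed here, not code from any of the packages evaluated; the function name, grid search, and Wald test are assumptions of this sketch.

        import numpy as np

        def mixed_model_scan(y, X_cov, x_snp, K, h2_grid=np.linspace(0.01, 0.99, 99)):
            """Single-marker mixed-model test: y = [X_cov, x_snp] b + g + e,
            Var(g) = h2*K and Var(e) = (1 - h2)*I, up to a common scale."""
            n = len(y)
            X = np.column_stack([X_cov, x_snp])
            best = None
            for h2 in h2_grid:
                V = h2 * K + (1.0 - h2) * np.eye(n)
                Vi = np.linalg.inv(V)
                XtVi = X.T @ Vi
                beta = np.linalg.solve(XtVi @ X, XtVi @ y)   # GLS fixed-effect estimates
                r = y - X @ beta
                sigma2 = float(r @ Vi @ r) / n               # profiled residual scale
                _, logdetV = np.linalg.slogdet(V)
                loglik = -0.5 * (n * np.log(sigma2) + logdetV + n)
                if best is None or loglik > best[0]:
                    best = (loglik, h2, beta, sigma2, XtVi)
            _, h2, beta, sigma2, XtVi = best
            cov_beta = sigma2 * np.linalg.inv(XtVi @ X)      # covariance of fixed effects
            wald = beta[-1] ** 2 / cov_beta[-1, -1]          # test statistic for the marker
            return {"h2": h2, "beta_snp": beta[-1], "wald": wald}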

  6. Adaptation to Climate Change: A Comparative Analysis of Modeling Methods for Heat-Related Mortality.

    PubMed

    Gosling, Simon N; Hondula, David M; Bunker, Aditi; Ibarreta, Dolores; Liu, Junguo; Zhang, Xinxin; Sauerborn, Rainer

    2017-08-16

    Multiple methods are employed for modeling adaptation when projecting the impact of climate change on heat-related mortality. The sensitivity of impacts to each is unknown because they have never been systematically compared. In addition, little is known about the relative sensitivity of impacts to "adaptation uncertainty" (i.e., the inclusion/exclusion of adaptation modeling) relative to using multiple climate models and emissions scenarios. This study had three aims: (a) compare the range in projected impacts that arises from using different adaptation modeling methods; (b) compare the range in impacts that arises from adaptation uncertainty with ranges from using multiple climate models and emissions scenarios; (c) recommend modeling method(s) to use in future impact assessments. We estimated impacts for 2070-2099 for 14 European cities, applying six different methods for modeling adaptation; we also estimated impacts with five climate models run under two emissions scenarios to explore the relative effects of climate modeling and emissions uncertainty. The range of the difference (percent) in impacts between including and excluding adaptation, irrespective of climate modeling and emissions uncertainty, can be as low as 28% with one method and up to 103% with another (mean across 14 cities). In 13 of 14 cities, the ranges in projected impacts due to adaptation uncertainty are larger than those associated with climate modeling and emissions uncertainty. Researchers should carefully consider how to model adaptation because it is a source of uncertainty that can be greater than the uncertainty in emissions and climate modeling. We recommend absolute threshold shifts and reductions in slope. https://doi.org/10.1289/EHP634.
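
    As a rough illustration of the two recommended adaptation assumptions, the sketch below applies an absolute threshold shift and a slope reduction to a generic log-linear heat-mortality exposure-response function and compares the resulting attributable deaths. All parameter values and the synthetic temperature series are hypothetical; this shows only the general form of the two methods, not the study's model.

        import numpy as np

        def heat_attributable_deaths(temps, baseline_deaths, threshold, slope,
                                     threshold_shift=0.0, slope_reduction=0.0):
            """Attributable deaths under a log-linear exposure-response curve above a
            temperature threshold. Adaptation is modeled either as an absolute shift
            of the threshold (deg C) or as a fractional reduction of the slope."""
            thr = threshold + threshold_shift
            b = slope * (1.0 - slope_reduction)
            excess = np.maximum(temps - thr, 0.0)
            rr = np.exp(b * excess)                    # relative risk on each day
            af = (rr - 1.0) / rr                       # attributable fraction
            return float(np.sum(af * baseline_deaths))

        # Illustrative comparison on synthetic daily data (hypothetical values)
        rng = np.random.default_rng(0)
        t = 22.0 + 6.0 * rng.standard_normal(365)      # daily mean temperature, deg C
        deaths = np.full(365, 30.0)                    # daily baseline mortality
        no_adapt = heat_attributable_deaths(t, deaths, threshold=25.0, slope=0.04)
        shifted = heat_attributable_deaths(t, deaths, threshold=25.0, slope=0.04, threshold_shift=2.0)
        reduced = heat_attributable_deaths(t, deaths, threshold=25.0, slope=0.04, slope_reduction=0.3)
        print(no_adapt, shifted, reduced)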

  7. Multiple Sclerosis and Catastrophic Health Expenditure in Iran.

    PubMed

    Juyani, Yaser; Hamedi, Dorsa; Hosseini Jebeli, Seyede Sedighe; Qasham, Maryam

    2016-09-01

    There are many disabling medical conditions that can result in catastrophic health expenditure. Multiple sclerosis is one of the most costly medical conditions in the world and can expose families to catastrophic health expenditure. This study aims to investigate to what extent multiple sclerosis patients face catastrophic costs. This study was carried out in Ahvaz, Iran (2014). The study population included households in which at least one member suffers from MS. To analyze the data, a logit regression model was employed using the software STATA 12. 3.37% of families faced catastrophic costs. Important variables including brand of drug, housing, income and health insurance were significantly correlated with catastrophic expenditure. This study suggests that although a small proportion of MS patients incurred catastrophic health expenditure, mechanisms that pool risk and cost (e.g. health insurance) are required to protect them and improve financial and access equity in health care.

  8. Phased-array-fed antenna configuration study, volume 2

    NASA Technical Reports Server (NTRS)

    Sorbello, R. M.; Zaghloul, A. I.; Lee, B. S.; Siddiqi, S.; Geller, B. D.

    1983-01-01

    Increased capacity in future satellite systems can be achieved through antenna systems which provide a multiplicity of frequency reuses at Ka-band. A number of antenna configurations which can provide multiple fixed spot beams and multiple independent spot scanning beams at 20 GHz are addressed. Each design incorporates a phased array with distributed MMIC amplifiers and phase shifters feeding a two-reflector optical system. The tradeoffs required for the design of these systems and the corresponding performances are presented. Five final designs are studied. In so doing, a type of MMIC/waveguide transition is described, and measured results of the breadboard model are presented. Other hardware components developed are described. This includes a square orthomode transducer, a subarray fed with a beamforming network to measure scanning performance, and another subarray used to study mutual coupling considerations. Discussions of the advantages and disadvantages of the final design are included.

  9. Global Climate Change Adaptation Priorities for Biodiversity and Food Security

    PubMed Central

    Hannah, Lee; Ikegami, Makihiko; Hole, David G.; Seo, Changwan; Butchart, Stuart H. M.; Peterson, A. Townsend; Roehrdanz, Patrick R.

    2013-01-01

    International policy is placing increasing emphasis on adaptation to climate change, including the allocation of new funds to assist adaptation efforts. Climate change adaptation funding may be most effective where it meets integrated goals, but global geographic priorities based on multiple development and ecological criteria are not well characterized. Here we show that human and natural adaptation needs related to maintaining agricultural productivity and ecosystem integrity intersect in ten major areas globally, providing a coherent set of international priorities for adaptation funding. An additional seven regional areas are identified as worthy of additional study. The priority areas are locations where changes in crop suitability affecting impoverished farmers intersect with changes in ranges of restricted-range species. Agreement among multiple climate models and emissions scenarios suggests that these priorities are robust. Adaptation funding directed to these areas could simultaneously address multiple international policy goals, including poverty reduction, protecting agricultural production and safeguarding ecosystem services. PMID:23991125

  10. Global climate change adaptation priorities for biodiversity and food security.

    PubMed

    Hannah, Lee; Ikegami, Makihiko; Hole, David G; Seo, Changwan; Butchart, Stuart H M; Peterson, A Townsend; Roehrdanz, Patrick R

    2013-01-01

    International policy is placing increasing emphasis on adaptation to climate change, including the allocation of new funds to assist adaptation efforts. Climate change adaptation funding may be most effective where it meets integrated goals, but global geographic priorities based on multiple development and ecological criteria are not well characterized. Here we show that human and natural adaptation needs related to maintaining agricultural productivity and ecosystem integrity intersect in ten major areas globally, providing a coherent set of international priorities for adaptation funding. An additional seven regional areas are identified as worthy of additional study. The priority areas are locations where changes in crop suitability affecting impoverished farmers intersect with changes in ranges of restricted-range species. Agreement among multiple climate models and emissions scenarios suggests that these priorities are robust. Adaptation funding directed to these areas could simultaneously address multiple international policy goals, including poverty reduction, protecting agricultural production and safeguarding ecosystem services.

  11. Global Sensitivity Analysis of Environmental Systems via Multiple Indices based on Statistical Moments of Model Outputs

    NASA Astrophysics Data System (ADS)

    Guadagnini, A.; Riva, M.; Dell'Oca, A.

    2017-12-01

    We propose to ground sensitivity of uncertain parameters of environmental models on a set of indices based on the main (statistical) moments, i.e., mean, variance, skewness and kurtosis, of the probability density function (pdf) of a target model output. This enables us to perform Global Sensitivity Analysis (GSA) of a model in terms of multiple statistical moments and yields a quantification of the impact of model parameters on features driving the shape of the pdf of model output. Our GSA approach includes the possibility of being coupled with the construction of a reduced complexity model that allows approximating the full model response at a reduced computational cost. We demonstrate our approach through a variety of test cases. These include a commonly used analytical benchmark, a simplified model representing pumping in a coastal aquifer, a laboratory-scale tracer experiment, and the migration of fracturing fluid through a naturally fractured reservoir (source) to reach an overlying formation (target). Our strategy allows discriminating the relative importance of model parameters to the four statistical moments considered. We also provide an appraisal of the error associated with the evaluation of our sensitivity metrics by replacing the original system model through the selected surrogate model. Our results suggest that one might need to construct a surrogate model with increasing level of accuracy depending on the statistical moment considered in the GSA. The methodological framework we propose can assist the development of analysis techniques targeted to model calibration, design of experiment, uncertainty quantification and risk assessment.
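
    A minimal Monte Carlo sketch of the idea, assuming a simple conditioning-by-binning scheme: for each parameter, compare the conditional mean, variance, skewness and kurtosis of the output (parameter held within a quantile bin) against the unconditional moments, and average the normalized differences. The test function and the binning choice are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy import stats

        def moment_based_indices(params, outputs, n_bins=20):
            """Sensitivity of the output's mean, variance, skewness and kurtosis to each
            parameter: average absolute difference between conditional moments (parameter
            held within a quantile bin) and the unconditional moments, normalized."""
            moments = {"mean": np.mean, "var": np.var, "skew": stats.skew, "kurt": stats.kurtosis}
            uncond = {k: f(outputs) for k, f in moments.items()}
            indices = {}
            for j in range(params.shape[1]):
                edges = np.quantile(params[:, j], np.linspace(0, 1, n_bins + 1))
                bins = np.clip(np.digitize(params[:, j], edges[1:-1]), 0, n_bins - 1)
                idx = {k: 0.0 for k in moments}
                for b in range(n_bins):
                    sub = outputs[bins == b]
                    for k, f in moments.items():
                        idx[k] += abs(f(sub) - uncond[k]) / n_bins
                indices[j] = {k: idx[k] / (abs(uncond[k]) + 1e-12) for k in moments}
            return indices

        # Hypothetical test model: y = x0 + 2*x1**2 + 0.1*x2, inputs uniform on [-1, 1]
        rng = np.random.default_rng(1)
        X = rng.uniform(-1.0, 1.0, size=(20000, 3))
        y = X[:, 0] + 2.0 * X[:, 1] ** 2 + 0.1 * X[:, 2]
        print(moment_based_indices(X, y))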

  12. Inferring Binary and Trinary Stellar Populations in Photometric and Astrometric Surveys

    NASA Astrophysics Data System (ADS)

    Widmark, Axel; Leistedt, Boris; Hogg, David W.

    2018-04-01

    Multiple stellar systems are ubiquitous in the Milky Way but are often unresolved and seen as single objects in spectroscopic, photometric, and astrometric surveys. However, modeling them is essential for developing a full understanding of large surveys such as Gaia and connecting them to stellar and Galactic models. In this paper, we address this problem by jointly fitting the Gaia and Two Micron All Sky Survey photometric and astrometric data using a data-driven Bayesian hierarchical model that includes populations of binary and trinary systems. This allows us to classify observations into singles, binaries, and trinaries, in a robust and efficient manner, without resorting to external models. We are able to identify multiple systems and, in some cases, make strong predictions for the properties of their unresolved stars. We will be able to compare such predictions with Gaia Data Release 4, which will contain astrometric identification and analysis of binary systems.

  13. Meteorological adjustment of yearly mean values for air pollutant concentration comparison

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.; Neustadter, H. E.

    1976-01-01

    Using multiple linear regression analysis, models which estimate mean concentrations of Total Suspended Particulate (TSP), sulfur dioxide, and nitrogen dioxide as a function of several meteorologic variables, two rough economic indicators, and a simple trend in time are studied. The meteorologic data obtained do not include inversion heights. The goodness of fit of the estimated models is partially reflected by the squared coefficient of multiple correlation which indicates that, at the various sampling stations, the models accounted for about 23 to 47 percent of the total variance of the observed TSP concentrations. If the resulting model equations are used in place of simple overall means of the observed concentrations, there is about a 20 percent improvement in either: (1) predicting mean concentrations for specified meteorological conditions; or (2) adjusting successive yearly averages to allow for comparisons devoid of meteorological effects. An application to source identification is presented using regression coefficients of wind velocity predictor variables.
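
    A minimal present-day sketch of such a regression, assuming generic predictor names (the actual meteorological and economic variables are only partially described here): ordinary least squares with an intercept and a linear time trend, reporting the squared coefficient of multiple correlation.

        import numpy as np

        def fit_tsp_model(tsp, wind_speed, humidity, temperature, econ1, econ2, t):
            """OLS fit of TSP concentration on meteorological variables, two rough
            economic indicators and a linear time trend; returns coefficients and R^2."""
            X = np.column_stack([np.ones_like(tsp, dtype=float), wind_speed, humidity,
                                 temperature, econ1, econ2, t])
            beta, *_ = np.linalg.lstsq(X, tsp, rcond=None)
            fitted = X @ beta
            ss_res = np.sum((tsp - fitted) ** 2)
            ss_tot = np.sum((tsp - np.mean(tsp)) ** 2)
            r_squared = 1.0 - ss_res / ss_tot          # squared multiple correlation
            return beta, r_squared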

  14. Water Isotopes in the GISS GCM: History, Applications and Potential

    NASA Astrophysics Data System (ADS)

    Schmidt, G. A.; LeGrande, A. N.; Field, R. D.; Nusbaumer, J. M.

    2017-12-01

    Water isotopes have been incorporated in the GISS GCMs since the pioneering work of Jean Jouzel in the 1980s. Since 2005, this functionality has been maintained within the master branch of the development code and has been usable (and used) in all subsequent versions. This has allowed a wide variety of applications, across multiple time-scales and interests, to be tackled coherently. Water isotope tracers have been used to debug the atmospheric model code, tune parameterisations of moist processes, assess the isotopic fingerprints of multiple climate drivers, produce forward models for remotely sensed isotope products, and validate paleo-climate interpretations from the last millennium to the Eocene. We will present an overview of recent results involving isotope tracers, including improvements in models for the isotopic fractionation processes themselves, and demonstrate the potential for using these tracers and models more systematically in paleo-climate reconstructions and investigations of the modern hydrological cycle.

  15. Maximum Marginal Likelihood Estimation of a Monotonic Polynomial Generalized Partial Credit Model with Applications to Multiple Group Analysis.

    PubMed

    Falk, Carl F; Cai, Li

    2016-06-01

    We present a semi-parametric approach to estimating item response functions (IRF) useful when the true IRF does not strictly follow commonly used functions. Our approach replaces the linear predictor of the generalized partial credit model with a monotonic polynomial. The model includes the regular generalized partial credit model at the lowest order polynomial. Our approach extends Liang's (A semi-parametric approach to estimate IRFs, Unpublished doctoral dissertation, 2007) method for dichotomous item responses to the case of polytomous data. Furthermore, item parameter estimation is implemented with maximum marginal likelihood using the Bock-Aitkin EM algorithm, thereby facilitating multiple group analyses useful in operational settings. Our approach is demonstrated on both educational and psychological data. We present simulation results comparing our approach to more standard IRF estimation approaches and other non-parametric and semi-parametric alternatives.

  16. Galantamine improves olfactory learning in the Ts65Dn mouse model of Down syndrome

    PubMed Central

    Simoes de Souza, Fabio M.; Busquet, Nicolas; Blatner, Megan; Maclean, Kenneth N.; Restrepo, Diego

    2011-01-01

    Down syndrome (DS) is the most common form of congenital intellectual disability. Although DS involves multiple disturbances in various tissues, there is little doubt that in terms of quality of life cognitive impairment is the most serious facet and there is no effective treatment for this aspect of the syndrome. The Ts65Dn mouse model of DS recapitulates multiple aspects of DS including cognitive impairment. Here the Ts65Dn mouse model of DS was evaluated in an associative learning paradigm based on olfactory cues. In contrast to disomic controls, trisomic mice exhibited significant deficits in olfactory learning. Treatment of trisomic mice with the acetylcholinesterase inhibitor galantamine resulted in a significant improvement in olfactory learning. Collectively, our study indicates that olfactory learning can be a sensitive tool for evaluating deficits in associative learning in mouse models of DS and that galantamine has therapeutic potential for improving cognitive abilities. PMID:22355654

  17. Galantamine improves olfactory learning in the Ts65Dn mouse model of Down syndrome.

    PubMed

    de Souza, Fabio M Simoes; Busquet, Nicolas; Blatner, Megan; Maclean, Kenneth N; Restrepo, Diego

    2011-01-01

    Down syndrome (DS) is the most common form of congenital intellectual disability. Although DS involves multiple disturbances in various tissues, there is little doubt that in terms of quality of life cognitive impairment is the most serious facet and there is no effective treatment for this aspect of the syndrome. The Ts65Dn mouse model of DS recapitulates multiple aspects of DS including cognitive impairment. Here the Ts65Dn mouse model of DS was evaluated in an associative learning paradigm based on olfactory cues. In contrast to disomic controls, trisomic mice exhibited significant deficits in olfactory learning. Treatment of trisomic mice with the acetylcholinesterase inhibitor galantamine resulted in a significant improvement in olfactory learning. Collectively, our study indicates that olfactory learning can be a sensitive tool for evaluating deficits in associative learning in mouse models of DS and that galantamine has therapeutic potential for improving cognitive abilities.

  18. Muscular dystrophy summer camp: a case study of a non-traditional level I fieldwork using a collaborative supervision model.

    PubMed

    Provident, Ingrid M; Colmer, Maria A

    2013-01-01

    A shortage of traditional medical fieldwork placements has been reported in the United States. Alternative settings are being sought to meet the Accreditation Standards for Level I fieldwork. This study was designed to examine and report the outcomes of an alternative pediatric camp setting, using a group model of supervision to fulfill the requirements for Level I fieldwork. Thirty-seven students from two Pennsylvania OT schools. Two cohorts of students were studied over a two year period using multiple methods of retrospective review and data collection. Students supervised in a group model experienced positive outcomes, including opportunities to deliver client centered care, and understanding the role of caregiving for children with disabilities. The use of a collaborative model of fieldwork education at a camp setting has resulted in a viable approach for the successful attainment of Level I fieldwork objectives for multiple students under a single supervisor.

  19. Analysis of dense-medium light scattering with applications to corneal tissue: experiments and Monte Carlo simulations.

    PubMed

    Kim, K B; Shanyfelt, L M; Hahn, D W

    2006-01-01

    Dense-medium scattering is explored in the context of providing a quantitative measurement of turbidity, with specific application to corneal haze. A multiple-wavelength scattering technique is proposed to make use of two-color scattering response ratios, thereby providing a means for data normalization. A combination of measurements and simulations is reported to assess this technique, including light-scattering experiments for a range of polystyrene suspensions. Monte Carlo (MC) simulations were performed using a multiple-scattering algorithm based on full Mie scattering theory. The simulations were in excellent agreement with the polystyrene suspension experiments, thereby validating the MC model. The MC model was then used to simulate multiwavelength scattering in a corneal tissue model. Overall, the proposed multiwavelength scattering technique appears to be a feasible approach to quantify dense-medium scattering such as the manifestation of corneal haze, although more complex modeling of keratocyte scattering, and animal studies, are necessary.

  20. Argentina soybean yield model

    NASA Technical Reports Server (NTRS)

    Callis, S. L.; Sakamoto, C.

    1984-01-01

    A model based on multiple regression was developed to estimate soybean yields for the country of Argentina. A meteorological data set was obtained for the country by averaging data for stations within the soybean growing area. Predictor variables for the model were derived from monthly total precipitation and monthly average temperature. A trend variable was included for the years 1969 to 1978 since an increasing trend in yields due to technology was observed between these years.
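
    A sketch of the kind of regression described, assuming placeholder arrays for the monthly weather predictors and a technology trend that increases only over 1969-1978; the exact predictor set and functional form used in the study are not reproduced here.

        import numpy as np

        def soybean_yield_model(years, monthly_precip, monthly_temp, yields):
            """OLS yield model: monthly precipitation and temperature predictors plus a
            technology trend that rises over 1969-1978 and is flat outside that span."""
            trend = np.clip(years, 1969, 1978) - 1969
            X = np.column_stack([np.ones_like(yields, dtype=float),
                                 monthly_precip, monthly_temp, trend])
            beta, *_ = np.linalg.lstsq(X, yields, rcond=None)
            return beta  # apply as X_new @ beta to estimate yield for a new season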

  1. Triangular arbitrage as an interaction among foreign exchange rates

    NASA Astrophysics Data System (ADS)

    Aiba, Yukihiro; Hatano, Naomichi; Takayasu, Hideki; Marumo, Kouhei; Shimizu, Tokiko

    2002-07-01

    We first show that there are in fact triangular arbitrage opportunities in the spot foreign exchange markets, analyzing the time dependence of the yen-dollar rate, the dollar-euro rate and the yen-euro rate. Next, we propose a model of foreign exchange rates with an interaction. The model includes effects of triangular arbitrage transactions as an interaction among three rates. The model explains the actual data of the multiple foreign exchange rates well.
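
    The arbitrage condition behind the interaction term can be written as the product of the three exchange rates around the currency loop; a short sketch follows. The rate conventions and the quoted values are assumptions made for illustration only.

        def loop_return(jpy_to_usd, usd_to_eur, eur_to_jpy):
            """Each argument is 'units received per unit sold' on that leg. Converting
            1 yen through JPY -> USD -> EUR -> JPY leaves loop_return yen; a value
            different from 1 (beyond transaction costs) is a triangular arbitrage
            opportunity, and arbitrage trading pushes the product back toward 1."""
            return jpy_to_usd * usd_to_eur * eur_to_jpy

        # Illustrative (made-up) quotes: 1/148 USD per JPY, 1/1.08 EUR per USD, 160 JPY per EUR
        print(loop_return(1.0 / 148.0, 1.0 / 1.08, 160.0) - 1.0)  # > 0: profitable loop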

  2. Extending data worth methods to select multiple observations targeting specific hydrological predictions of interest

    NASA Astrophysics Data System (ADS)

    Vilhelmsen, Troels N.; Ferré, Ty P. A.

    2016-04-01

    Hydrological models are often developed to forecast future behavior in response to natural or human-induced changes in the stresses affecting hydrologic systems. Commonly, these models are conceptualized and calibrated based on existing data/information about the hydrological conditions. However, most hydrologic systems lack sufficient data to constrain models with adequate certainty to support robust decision making. Therefore, a key element of a hydrologic study is the selection of additional data to improve model performance. Given the nature of hydrologic investigations, it is not practical to select data sequentially, i.e. to choose the next observation, collect it, refine the model, and then repeat the process. Rather, for timing and financial reasons, measurement campaigns include multiple wells or sampling points. There is a growing body of literature aimed at defining the expected data worth based on existing models. However, these studies are almost all limited to identifying single additional observations. In this study, we present a methodology for simultaneously selecting multiple potential new observations based on their expected ability to reduce the uncertainty of the forecasts of interest. This methodology is based on linear estimates of the predictive uncertainty, and it can be used to determine the optimal combinations of measurements (location and number) needed to reduce the uncertainty of multiple predictions. The outcome of the analysis is an estimate of the optimal sampling locations; the optimal number of samples; as well as a probability map showing the locations within the investigated area that are most likely to provide useful information about the forecasts of interest.
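
    A linearized sketch of the selection idea, assuming a prior parameter covariance, a Jacobian row per candidate observation, observation noise variances, and a prediction sensitivity vector; combinations are built greedily by expected variance reduction. This is a generic first-order data-worth calculation, not the authors' code.

        import numpy as np

        def greedy_data_worth(C_prior, J_obs, obs_var, s_pred, n_select):
            """Greedily pick n_select candidate observations (rows of J_obs) that most
            reduce the linearized variance of a prediction with sensitivity s_pred.
            Posterior covariance after adding rows J: C - C J^T (J C J^T + R)^-1 J C."""
            chosen = []
            remaining = list(range(J_obs.shape[0]))
            for _ in range(n_select):
                best_var, best_i = None, None
                for i in remaining:
                    rows = chosen + [i]
                    J = J_obs[rows, :]
                    R = np.diag(np.asarray(obs_var)[rows])
                    S = J @ C_prior @ J.T + R
                    C_post = C_prior - C_prior @ J.T @ np.linalg.solve(S, J @ C_prior)
                    var = float(s_pred @ C_post @ s_pred)
                    if best_var is None or var < best_var:
                        best_var, best_i = var, i
                chosen.append(best_i)
                remaining.remove(best_i)
            return chosen, best_var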

  3. Double-observer line transect surveys with Markov-modulated Poisson process models for animal availability.

    PubMed

    Borchers, D L; Langrock, R

    2015-12-01

    We develop maximum likelihood methods for line transect surveys in which animals go undetected at distance zero, either because they are stochastically unavailable while within view or because they are missed when they are available. These incorporate a Markov-modulated Poisson process model for animal availability, allowing more clustered availability events than is possible with Poisson availability models. They include a mark-recapture component arising from the independent-observer survey, leading to more accurate estimation of detection probability given availability. We develop models for situations in which (a) multiple detections of the same individual are possible and (b) some or all of the availability process parameters are estimated from the line transect survey itself, rather than from independent data. We investigate estimator performance by simulation, and compare the multiple-detection estimators with estimators that use only initial detections of individuals, and with a single-observer estimator. Simultaneous estimation of detection function parameters and availability model parameters is shown to be feasible from the line transect survey alone with multiple detections and double-observer data but not with single-observer data. Recording multiple detections of individuals improves estimator precision substantially when estimating the availability model parameters from survey data, and we recommend that these data be gathered. We apply the methods to estimate detection probability from a double-observer survey of North Atlantic minke whales, and find that double-observer data greatly improve estimator precision here too. © 2015 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.

  4. Methods, computer readable media, and graphical user interfaces for analysis of frequency selective surfaces

    DOEpatents

    Kotter, Dale K [Shelley, ID; Rohrbaugh, David T [Idaho Falls, ID

    2010-09-07

    A frequency selective surface (FSS) and associated methods for modeling, analyzing and designing the FSS are disclosed. The FSS includes a pattern of conductive material formed on a substrate to form an array of resonance elements. At least one aspect of the frequency selective surface is determined by defining a frequency range including multiple frequency values, determining a frequency dependent permittivity across the frequency range for the substrate, determining a frequency dependent conductivity across the frequency range for the conductive material, and analyzing the frequency selective surface using a method of moments analysis at each of the multiple frequency values for an incident electromagnetic energy impinging on the frequency selective surface. The frequency dependent permittivity and the frequency dependent conductivity are included in the method of moments analysis.

  5. Integrated Main Propulsion System Performance Reconstruction Process/Models

    NASA Technical Reports Server (NTRS)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for post-flight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  6. ALTERNATIVES TO HELIUM-3 FOR NEUTRON MULTIPLICITY DETECTORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ely, James H.; Siciliano, Edward R.; Swinhoe, Martyn T.

    Collaboration between the Pacific Northwest National Laboratory (PNNL) and the Los Alamos National Laboratory (LANL) is underway to evaluate neutron detection technologies that might replace the high-pressure helium (3He) tubes currently used in neutron multiplicity counters for safeguards applications. The current stockpile of 3He is diminishing and alternatives are needed for a variety of neutron detection applications including multiplicity counters. The first phase of this investigation uses a series of Monte Carlo calculations to simulate the performance of an existing neutron multiplicity counter configuration by replacing the 3He tubes in a model for that counter with candidate alternative technologies. These alternative technologies are initially placed in approximately the same configuration as the 3He tubes to establish a reference level of performance against the 3He-based system. After these reference-level results are established, the configurations of the alternative models will be further modified for performance optimization. The 3He model for these simulations is the one used by LANL to develop and benchmark the Epithermal Neutron Multiplicity Counter (ENMC) detector, as documented by H.O. Menlove, et al. in the 2004 LANL report LA-14088. The alternative technologies being evaluated are the boron-trifluoride-filled proportional tubes, boron-lined tubes, and lithium-coated materials previously tested as possible replacements in portal monitor screening applications, as documented by R.T. Kouzes, et al. in the 2010 PNNL report PNNL-72544 and NIM A 623 (2010) 1035–1045. The models and methods used for these comparative calculations will be described and preliminary results shown.

  7. Exciton effects in the index of refraction of multiple quantum wells and superlattices

    NASA Technical Reports Server (NTRS)

    Kahen, K. B.; Leburton, J. P.

    1986-01-01

    Theoretical calculations of the index of refraction of multiple quantum wells and superlattices are presented. The model incorporates both the bound and continuum exciton contributions for the gamma region transitions. In addition, the electronic band structure model has both superlattice and bulk alloy properties. The results indicate that large light-hole masses, i.e., of about 0.23, produced by band mixing effects, are required to account for the experimental data. Furthermore, it is shown that superlattice effects rapidly decrease for energies greater than the confining potential barriers. Overall, the theoretical results are in very good agreement with the experimental data and show the importance of including exciton effects in the index of refraction.

  8. Multiagent Work Practice Simulation: Progress and Challenges

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten; Shaffe, Michael G. (Technical Monitor)

    2001-01-01

    Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and a computer system. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3D space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).

  9. Multiagent Work Practice Simulation: Progress and Challenges

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten

    2002-01-01

    Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and computer systems. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3d space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).

  10. Understanding and quantifying foliar temperature acclimation for Earth System Models

    NASA Astrophysics Data System (ADS)

    Smith, N. G.; Dukes, J.

    2015-12-01

    Photosynthesis and respiration on land are the two largest carbon fluxes between the atmosphere and Earth's surface. The parameterization of these processes represent major uncertainties in the terrestrial component of the Earth System Models used to project future climate change. Research has shown that much of this uncertainty is due to the parameterization of the temperature responses of leaf photosynthesis and autotrophic respiration, which are typically based on short-term empirical responses. Here, we show that including longer-term responses to temperature, such as temperature acclimation, can help to reduce this uncertainty and improve model performance, leading to drastic changes in future land-atmosphere carbon feedbacks across multiple models. However, these acclimation formulations have many flaws, including an underrepresentation of many important global flora. In addition, these parameterizations were done using multiple studies that employed differing methodology. As such, we used a consistent methodology to quantify the short- and long-term temperature responses of maximum Rubisco carboxylation (Vcmax), maximum rate of ribulose-1,5-bisphosphate regeneration (Jmax), and dark respiration (Rd) in multiple species representing each of the plant functional types used in global-scale land surface models. Short-term temperature responses of each process were measured in individuals acclimated for 7 days at one of 5 temperatures (15-35°C). The comparison of short-term curves in plants acclimated to different temperatures was used to evaluate long-term responses. Our analyses indicated that the instantaneous response of each parameter was highly sensitive to the temperature at which they were acclimated. However, we found that this sensitivity was larger in species whose leaves typically experience a greater range of temperatures over the course of their lifespan. These data indicate that models using previous acclimation formulations are likely incorrectly simulating leaf carbon exchange responses to future warming. Therefore, our data, if used to parameterize large-scale models, are likely to provide an even greater improvement in model performance, resulting in more reliable projections of future carbon-climate feedbacks.

  11. Multiple piezo-patch energy harvesters integrated to a thin plate with AC-DC conversion: analytical modeling and numerical validation

    NASA Astrophysics Data System (ADS)

    Aghakhani, Amirreza; Basdogan, Ipek; Erturk, Alper

    2016-04-01

    Plate-like components are widely used in numerous automotive, marine, and aerospace applications where they can be employed as host structures for vibration based energy harvesting. Piezoelectric patch harvesters can be easily attached to these structures to convert the vibrational energy to the electrical energy. Power output investigations of these harvesters require accurate models for energy harvesting performance evaluation and optimization. Equivalent circuit modeling of the cantilever-based vibration energy harvesters for estimation of electrical response has been proposed in recent years. However, equivalent circuit formulation and analytical modeling of multiple piezo-patch energy harvesters integrated to thin plates including nonlinear circuits has not been studied. In this study, equivalent circuit model of multiple parallel piezoelectric patch harvesters together with a resistive load is built in electronic circuit simulation software SPICE and voltage frequency response functions (FRFs) are validated using the analytical distributed-parameter model. Analytical formulation of the piezoelectric patches in parallel configuration for the DC voltage output is derived while the patches are connected to a standard AC-DC circuit. The analytic model is based on the equivalent load impedance approach for piezoelectric capacitance and AC-DC circuit elements. The analytic results are validated numerically via SPICE simulations. Finally, DC power outputs of the harvesters are computed and compared with the peak power amplitudes in the AC output case.

  12. Iterative Usage of Fixed and Random Effect Models for Powerful and Efficient Genome-Wide Association Studies

    PubMed Central

    Liu, Xiaolei; Huang, Meng; Fan, Bin; Buckler, Edward S.; Zhang, Zhiwu

    2016-01-01

    False positives in a Genome-Wide Association Study (GWAS) can be effectively controlled by a fixed effect and random effect Mixed Linear Model (MLM) that incorporates population structure and kinship among individuals to adjust association tests on markers; however, the adjustment also compromises true positives. The modified MLM method, Multiple Loci Linear Mixed Model (MLMM), incorporates multiple markers simultaneously as covariates in a stepwise MLM to partially remove the confounding between testing markers and kinship. To completely eliminate the confounding, we divided MLMM into two parts, a Fixed Effect Model (FEM) and a Random Effect Model (REM), and use them iteratively. FEM contains testing markers, one at a time, and multiple associated markers as covariates to control false positives. To avoid the model over-fitting problem in FEM, the associated markers are estimated in REM by using them to define kinship. The P values of testing markers and the associated markers are unified at each iteration. We named the new method Fixed and random model Circulating Probability Unification (FarmCPU). Both real and simulated data analyses demonstrated that FarmCPU improves statistical power compared to current methods. Additional benefits include computing time that is linear in both the number of individuals and the number of markers. Now, a dataset with half a million individuals and half a million markers can be analyzed within three days. PMID:26828793

  13. Enhanced project management tool

    NASA Technical Reports Server (NTRS)

    Hsu, Chen-Jung (Inventor); Patel, Hemil N. (Inventor); Maluf, David A. (Inventor); Moh Hashim, Jairon C. (Inventor); Tran, Khai Peter B. (Inventor)

    2012-01-01

    A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as one or more of a monthly report, a task plan report, a schedule report, a budget report and a risk management report, are generated and made available for display or further analysis or collection into a customized report template. An extensible database allows searching for information based upon context and upon content. Seven different types of project risks are addressed, including non-availability of required skill mix of workers. The system can be configured to exchange data and results with corresponding portions of similar project analyses, and to provide user-specific access to specified information.

  14. Geospatial interface and model for predicting potential seagrass habitat

    EPA Science Inventory

    Restoration of ecosystem services provided by seagrass habitats in estuaries requires a clear understanding of the modes of action of multiple interacting stressors including nutrients, climate change, coastal land-use change, and habitat modification. We have developed a geos...

  15. STABLE ISOTOPES IN ECOLOGICAL STUDIES: NEW DEVELOPMENTS IN MIXING MODELS

    EPA Science Inventory

    Stable isotopes are increasingly being used as tracers in ecological studies. One application uses isotopic ratios to quantify the proportional contributions of multiple sources to a mixture. Examples include food sources for animals, water sources for plants, pollution sources...
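
    For a single isotope and two sources, the proportional contribution follows from a linear mixing balance; a minimal sketch with made-up values is given below.

        def two_source_mixing_fraction(delta_mixture, delta_source1, delta_source2):
            """Proportional contribution of source 1 to a mixture from a single
            isotope ratio, assuming a linear two-source mixing model:
            delta_mix = f1*delta_1 + (1 - f1)*delta_2."""
            return (delta_mixture - delta_source2) / (delta_source1 - delta_source2)

        # Example: delta15N of a consumer relative to two food sources (made-up values)
        print(two_source_mixing_fraction(8.0, 10.0, 4.0))  # -> about 0.67 of the diet from source 1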

  16. A generalized model on the effects of nanoparticles on fluorophore fluorescence in solution

    USDA-ARS?s Scientific Manuscript database

    Nanoparticles (NP) can modify fluorophore fluorescence in solution through multiple pathways that include fluorescence inner filter effect (IFE), dynamic and static quenching, surface enhancement, and fluorophore quantum yield variation associated with structural and conformational modifications ind...

  17. Quantitative Models for the Narragansett Bay Estuary, Rhode Island/Massachusetts, USA

    EPA Science Inventory

    Multiple drivers, including nutrient loading and climate change, affect the Narragansett Bay ecosystem in Rhode Island/Massachusetts, USA. Managers are interested in understanding the timing and magnitude of these effects, and ecosystem responses to restoration actions. To provid...

  18. Using Gaming To Help Nursing Students Understand Ethics.

    ERIC Educational Resources Information Center

    Metcalf, Barbara L.; Yankou, Dawn

    2003-01-01

    An ethics game involves nursing students in defending actions in ethics-based scenarios. Benefits include increased confidence, ability to see multiple perspectives, values clarification, and exposure to decision-making models, professional responsibilities, ethical principles, social expectations, and legal requirements. Difficulties include…

  19. Generation of the Dimensional Embryology Application (App) for Visualization of Early Chick and Frog Embryonic Development

    ERIC Educational Resources Information Center

    Webb, Rebecca L.; Bilitski, James; Zerbee, Alyssa; Symans, Alexandra; Chop, Alexandra; Seitz, Brianne; Tran, Cindy

    2015-01-01

    The study of embryonic development of multiple organisms, including model organisms such as frogs and chicks, is included in many undergraduate biology programs, as well as in a variety of graduate programs. As our knowledge of biological systems increases and the amount of material to be taught expands, the time spent instructing students about…

  20. Recent results from PHOBOS at RHIC

    NASA Astrophysics Data System (ADS)

    Back, B. B.; Baker, M. D.; Barton, D. S.; Betts, R. R.; Ballintijn, M.; Bickley, A. A.; Bindel, R.; Budzanowski, A.; Busza, W.; Carroll, A.; Decowski, M. P.; García, E.; George, N.; Gulbrandsen, K.; Gushue, S.; Halliwell, C.; Hamblen, J.; Heintzelman, G. A.; Henderson, C.; Hofman, D. J.; Hollis, R. S.; Hołyński, R.; Holzman, B.; Iordanova, A.; Johnson, E.; Kane, J. L.; Katzy, J.; Khan, N.; Kucewicz, W.; Kulinich, P.; Kuo, C. M.; Lin, W. T.; Manly, S.; McLeod, D.; Michałowski, J.; Mignerey, A. C.; Nouicer, R.; Olszewski, A.; Pak, R.; Park, I. C.; Pernegger, H.; Reed, C.; Remsberg, L. P.; Reuter, M.; Roland, C.; Roland, G.; Rosenberg, L.; Sagerer, J.; Sarin, P.; Sawicki, P.; Skulski, W.; Steadman, S. G.; Steinberg, P.; Stephans, G. S. F.; Stodulski, M.; Sukhanov, A.; Tang, J.-L.; Teng, R.; Trzupek, A.; Vale, C.; van Niewwenhuizen, G. J.; Verdier, R.; Wadsworth, B.; Wolfs, F. L. H.; Wosiek, B.; Woźniak, K.; Wuosmaa, A. H.; Wysłouch, B.; The PHOBOS Collaboration

    2003-06-01

    The PHOBOS experiment at RHIC has recorded measurements for Au+Au collisions spanning nucleon-nucleon center-of-mass energies from √s_NN = 19.6 GeV to 200 GeV. Global observables such as elliptic flow and charged particle multiplicity provide important constraints on model predictions that characterize the state of matter produced in these collisions. The nearly 4π acceptance of the PHOBOS experiment provides excellent coverage for complete flow and multiplicity measurements. Results including beam energy and centrality dependencies are presented and compared to elementary systems.

  1. Multiple collision effects on the antiproton production by high energy proton (100 GeV - 1000 GeV)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, Hiroshi; Powell, J.

    Antiproton production rates which take into account multiple collisions are calculated using a simple model. Methods to reduce capture of the produced antiprotons by the target are discussed, including the geometry of the target and the use of a high intensity laser. Antiproton production increases substantially above 150 GeV proton incident energy. The yield increases almost linearly with incident energy, alleviating space charge problems in the high current accelerator that produces large amounts of antiprotons.

  2. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the interdependency among multiple geographically related areas. These approaches are applied to cross-correlated load time series as well as their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
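
    A simplified sketch of the general idea, assuming one ARIMA model per area and forecast-error scenarios drawn with the cross-area residual covariance (via a Cholesky factor) so the inter-area dependency is retained; the study's actual combination of sequential Gaussian simulation and principal component analysis is not reproduced here.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        def correlated_load_scenarios(load_by_area, order=(2, 0, 1), steps=24, n_scen=100, seed=0):
            """Fit an ARIMA model per area, then generate forecast scenarios whose
            errors keep the cross-area correlation estimated from in-sample residuals."""
            areas = list(load_by_area)
            fits = {a: ARIMA(load_by_area[a], order=order).fit() for a in areas}
            point = np.column_stack([fits[a].forecast(steps) for a in areas])
            resid = np.column_stack([fits[a].resid for a in areas])
            cov = np.cov(resid, rowvar=False)                 # cross-area error covariance
            L = np.linalg.cholesky(cov + 1e-9 * np.eye(len(areas)))
            rng = np.random.default_rng(seed)
            noise = rng.standard_normal((n_scen, steps, len(areas))) @ L.T
            return point[None, :, :] + noise                  # (scenario, step, area)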

  3. Using a Mechanistic Reactive Transport Model to Represent Soil Organic Matter Dynamics and Climate Sensitivity

    NASA Astrophysics Data System (ADS)

    Guerry, N.; Riley, W. J.; Maggi, F.; Torn, M. S.; Kleber, M.

    2011-12-01

    The nature of long term Soil Organic Matter (SOM) dynamics is uncertain and the mechanisms involved are crudely represented in site, regional, and global models. Recent work challenging the paradigm that SOM is stabilized because of its sequential transformations to more intrinsically recalcitrant compounds motivated us to develop a mechanistic modeling framework that can be used to test hypotheses of SOM dynamics. We developed our C cycling model in TOUGHREACT, an established 3-dimensional reactive transport solver that accounts for multiple phases (aqueous, gaseous, sorbed), multiple species, advection and diffusion, and multiple microbial populations. Energy and mass exchange through the soil boundaries are accounted for via ground heat flux, rainfall, C sources (e.g., exudation, woody, leaf, root litter) and C losses (e.g., CO2 emissions and DOC deep percolation). SOM is categorized according to the various types of compounds commonly found in the above mentioned C sources and microbial byproducts, including poly- and monosaccharides, lignin, amino compounds, organic acids, nucleic acids, lipids, and phenols. Each of these compounds is accounted for by one or more representative species in the model. A reaction network was developed to describe the microbially-mediated processes and chemical interactions of these species, including depolymerization, microbial assimilation, respiration and deposition of byproducts, and incorporation of dead biomass into SOM stocks. Enzymatic reactions are characterized by Michaelis-Menten kinetics, with maximum reaction rates determined by the species' O/C ratio. Microbial activity is further regulated by soil moisture content, O2 availability, pH, and temperature. For the initial set of simulations, literature values were used to constrain microbial Monod parameters, Michaelis-Menten parameters, sorption parameters, physical protection, partitioning of microbial byproducts, and partitioning of litter inputs, although there is substantial uncertainty in how these relationships should be represented. We also developed several other model formulations, including one that represents SOM in pools of varying decomposability, but lacking explicit protection mechanisms. We tested the model against several observational and experimental datasets. An important conclusion of our analysis is that although several of the model structural formulations were able to represent the bulk SOM observations, including 14C vertical profiles, the temperature, moisture, and soil chemistry sensitivity of decomposition varied strongly between each formulation. Finally, we applied the model to design observations that would be required to better constrain process representation and improve predictions of changes in SOM under changing climate.
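
    A deliberately minimal sketch of one such microbially mediated, Michaelis-Menten-limited reaction, assuming a single substrate pool, a single microbial pool, fixed temperature and moisture modifiers, and hypothetical parameter values; the full model couples many pools, transport, and chemistry that are not represented here.

        import numpy as np
        from scipy.integrate import solve_ivp

        def som_rhs(t, y, vmax, km, cue, kdeath, f_temp, f_moist):
            """One substrate pool S, one microbial pool B, and cumulative CO2:
            uptake = vmax * f_temp * f_moist * B * S / (km + S); a fraction cue is
            assimilated, the rest respired; dead biomass returns to the substrate."""
            S, B, CO2 = y
            uptake = vmax * f_temp * f_moist * B * S / (km + S)
            dS = -uptake + kdeath * B
            dB = cue * uptake - kdeath * B
            dCO2 = (1.0 - cue) * uptake        # cumulative heterotrophic respiration
            return [dS, dB, dCO2]

        # Illustrative one-year run with hypothetical parameter values
        sol = solve_ivp(som_rhs, (0.0, 365.0), y0=[100.0, 2.0, 0.0],
                        args=(0.5, 50.0, 0.4, 0.01, 0.8, 0.9))
        print(sol.y[:, -1])                    # final substrate, biomass, cumulative CO2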

  4. Post-processing method for wind speed ensemble forecast using wind speed and direction

    NASA Astrophysics Data System (ADS)

    Sofie Eide, Siri; Bjørnar Bremnes, John; Steinsland, Ingelin

    2017-04-01

    Statistical methods are widely applied to enhance the quality of both deterministic and ensemble NWP forecasts. In many situations, like wind speed forecasting, most of the predictive information is contained in one variable in the NWP models. However, in statistical calibration of deterministic forecasts it is often seen that including more variables can further improve forecast skill. For ensembles this is rarely taken advantage of, mainly because it is generally not straightforward to include multiple variables. In this study, it is demonstrated how multiple variables can be included in Bayesian model averaging (BMA) by using a flexible regression method for estimating the conditional means. The method is applied to wind speed forecasting at 204 Norwegian stations based on wind speed and direction forecasts from the ECMWF ensemble system. At about 85 % of the sites the ensemble forecasts were improved in terms of CRPS by adding wind direction as a predictor compared to only using wind speed. On average the improvements were about 5 %, mainly for moderate to strong wind situations. For weak wind speeds, adding wind direction had a more or less neutral impact.
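
    A heavily simplified sketch of the idea of feeding a second variable into an ensemble calibration, not the paper's method: each member's conditional mean is a plain linear regression on forecast speed plus sin/cos of forecast direction (in place of the flexible regression used in the study), and the predictive distribution is an equally weighted Gaussian mixture rather than fitted BMA. All data are synthetic.

      # Simplified BMA-style calibration with wind direction as an extra predictor.
      import numpy as np

      rng = np.random.default_rng(1)
      n_train, n_members = 1000, 5

      dir_fc = rng.uniform(0, 360, size=(n_train, n_members))     # forecast direction (deg)
      spd_fc = rng.gamma(2.0, 3.0, size=(n_train, n_members))     # forecast speed (m/s)
      obs = spd_fc.mean(axis=1) * (1 + 0.1 * np.sin(np.radians(dir_fc.mean(axis=1)))) \
            + rng.normal(scale=1.0, size=n_train)                 # synthetic observations

      coefs, sigmas = [], []
      for k in range(n_members):
          X = np.column_stack([np.ones(n_train), spd_fc[:, k],
                               np.sin(np.radians(dir_fc[:, k])),
                               np.cos(np.radians(dir_fc[:, k]))])
          beta, *_ = np.linalg.lstsq(X, obs, rcond=None)
          coefs.append(beta)
          sigmas.append(np.std(obs - X @ beta))

      def predictive_sample(spd_new, dir_new, n_draws=1000):
          """Draw from the equally weighted Gaussian mixture over ensemble members."""
          members = rng.integers(0, n_members, size=n_draws)
          out = np.empty(n_draws)
          for i, k in enumerate(members):
              x = np.array([1.0, spd_new[k],
                            np.sin(np.radians(dir_new[k])), np.cos(np.radians(dir_new[k]))])
              out[i] = rng.normal(coefs[k] @ x, sigmas[k])
          return np.clip(out, 0, None)   # wind speed is non-negative

      print(predictive_sample(spd_fc[0], dir_fc[0])[:5])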

  5. Action control and situational risks in the prevention of HIV and STIs: individual, dyadic, and social influences on consistent condom use in a university population.

    PubMed

    Svenson, Gary R; Ostergren, Per-Olof; Merlo, Juan; Råstam, Lennart

    2002-12-01

    The aim of this study was to gain an understanding of consistent condom use. We took the perspective that condom use involves the ability to handle situational risks influenced at multiple levels, including the individual, dyadic, and social. The hypothesis was that action control, as measured by self-regulation, implementation intentions, and self-efficacy, was the primary determinant. The study was conducted as part of a community-based intervention at a major university (36,000 students). Data were collected using a validated questionnaire mailed to a random sample of students (n = 493, response rate = 71.5%). Statistical analysis included logistic regression models that successively included background, individual, dyadic, and social variables. In the final model, consistent condom use was higher among students with strong implementation intentions, high self-regulation, and positive peer norms. The results contribute new knowledge on action control in predicting sexual risk behaviors and lend support to the conceptualization and analysis of HIV/sexually transmitted infection prevention at multiple levels of influence.
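
    An illustrative sketch of the successive (blockwise) logistic regression strategy described above, using statsmodels on synthetic data; the variable names are placeholders for the study's background, individual, dyadic, and social blocks, and the simulated effects are arbitrary.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 493
      df = pd.DataFrame({
          "age": rng.normal(23, 3, n),                       # background
          "implementation_intentions": rng.normal(0, 1, n),  # individual (action control)
          "self_regulation": rng.normal(0, 1, n),
          "steady_partner": rng.integers(0, 2, n),           # dyadic
          "peer_norms": rng.normal(0, 1, n),                 # social
      })
      logit_p = -1 + 0.8 * df["implementation_intentions"] + 0.6 * df["peer_norms"]
      df["consistent_condom_use"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

      blocks = [["age"],
                ["age", "implementation_intentions", "self_regulation"],
                ["age", "implementation_intentions", "self_regulation", "steady_partner"],
                ["age", "implementation_intentions", "self_regulation", "steady_partner",
                 "peer_norms"]]

      # Fit one model per successively larger block of predictors.
      for i, cols in enumerate(blocks, start=1):
          X = sm.add_constant(df[cols])
          fit = sm.Logit(df["consistent_condom_use"], X).fit(disp=False)
          print(f"Model {i}: pseudo R2 = {fit.prsquared:.3f}")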

  6. Unleashing the power of human genetic variation knowledge: New Zealand stakeholder perspectives.

    PubMed

    Gu, Yulong; Warren, James Roy; Day, Karen Jean

    2011-01-01

    This study aimed to characterize the challenges in using genetic information in health care and to identify opportunities for improvement. Taking a grounded theory approach, semistructured interviews were conducted with 48 participants to collect multiple stakeholder perspectives on genetic services in New Zealand. Three themes emerged from the data: (1) four service delivery models were identified in operation, including both those expected models involving genetic counselors and variations that do not route through the formal genetic service program; (2) multiple barriers to sharing and using genetic information were perceived, including technological, organizational, institutional, legal, ethical, and social issues; and (3) impediments to wider use of genetic testing technology, including variable understanding of genetic test utilities among clinicians and the limited capacity of clinical genetic services. Targeting these problems, information technologies and knowledge management tools have the potential to support key tasks in genetic services delivery, improve knowledge processes, and enhance knowledge networks. Because of the effect of issues in genetic information and knowledge management, the potential of human genetic variation knowledge to enhance health care delivery has been put on a "leash."

  7. Multiple transmitter performance with appropriate amplitude modulation for free-space optical communication.

    PubMed

    Tellez, Jason A; Schmidt, Jason D

    2011-08-20

    The propagation of a free-space optical communications signal through atmospheric turbulence experiences random fluctuations in intensity, including signal fades, which negatively impact the performance of the communications link. The gamma-gamma probability density function is commonly used to model the scintillation of a single beam. One proposed method to reduce the occurrence of scintillation-induced fades at the receiver plane involves the use of multiple beams propagating through independent paths, resulting in a sum of independent gamma-gamma random variables. Recently an analytical model for the probability distribution of irradiance from the sum of multiple independent beams was developed. Because truly independent beams are practically impossible to create, we present here a more general but approximate model for the distribution of beams traveling through partially correlated paths. This model compares favorably with wave-optics simulations and highlights the reduced scintillation as the number of transmitted beams is increased. Additionally, a pulse-position modulation scheme is used to reduce the impact of signal fades when they occur. Analytical and simulated results showed significantly improved performance when compared to fixed threshold on/off keying. © 2011 Optical Society of America

  8. A minimalist model protein with multiple folding funnels

    PubMed Central

    Locker, C. Rebecca; Hernandez, Rigoberto

    2001-01-01

    Kinetic and structural studies of wild-type proteins such as prions and amyloidogenic proteins provide suggestive evidence that proteins may adopt multiple long-lived states in addition to the native state. All of these states differ structurally because they lie far apart in configuration space, but their stability is not necessarily caused by cooperative (nucleation) effects. In this study, a minimalist model protein is designed to exhibit multiple long-lived states to explore the dynamics of the corresponding wild-type proteins. The minimalist protein is modeled as a 27-monomer sequence confined to a cubic lattice with three different monomer types. An order parameter—the winding index—is introduced to characterize the extent of folding. The winding index has several advantages over other commonly used order parameters like the number of native contacts. It can distinguish between enantiomers, its calculation requires less computational time than the number of native contacts, and reduced-dimensional landscapes can be developed when the native state structure is not known a priori. The results for the designed model protein prove by existence that the rugged energy landscape picture of protein folding can be generalized to include protein “misfolding” into long-lived states. PMID:11470921

  9. Measuring University students' understanding of the greenhouse effect - a comparison of multiple-choice, short answer and concept sketch assessment tools with respect to students' mental models

    NASA Astrophysics Data System (ADS)

    Gold, A. U.; Harris, S. E.

    2013-12-01

    The greenhouse effect comes up in most discussions about climate and is a key concept related to climate change. Existing studies have shown that students and adults alike lack a detailed understanding of this important concept or might hold misconceptions. We studied the effectiveness of different interventions on university-level students' understanding of the greenhouse effect. Introductory-level science students were tested for their pre-knowledge of the greenhouse effect using validated multiple-choice questions, short answers and concept sketches. All students participated in a common lesson about the greenhouse effect and were then randomly assigned to one of two lab groups. One group explored an existing simulation about the greenhouse effect (PhET-lesson) and the other group worked with absorption spectra of different greenhouse gases (Data-lesson) to deepen the understanding of the greenhouse effect. All students completed the same assessment including multiple choice, short answers and concept sketches after participation in their lab lesson. 164 students completed all the assessments, 76 completed the PhET lesson and 77 completed the data lesson. 11 students missed the contrasting lesson. In this presentation we show the comparison between the multiple-choice questions, short answer questions and the concept sketches of students. We explore how well each of these assessment types represents students' knowledge. We also identify items that are indicators of the level of understanding of the greenhouse effect, as measured by the correspondence of student answers to an expert mental model and expert responses. Preliminary data analysis shows that students who produce concept sketch drawings that come close to expert drawings also choose correct multiple-choice answers. However, correct multiple-choice answers do not necessarily indicate that a student produces the corresponding expert-like concept sketch items. Multiple-choice questions that require detailed knowledge of the greenhouse effect (e.g. direction of re-emission of infrared energy from greenhouse gases) are significantly more likely to be answered correctly by students who also produce expert-like concept sketch items than by students who don't include this aspect in their sketch and don't answer the multiple choice questions correctly. This difference is not as apparent for less technical multiple-choice questions (e.g. type of radiation emitted by the Sun). Our findings explore the formation of students' mental models throughout different interventions and how well the different assessment techniques used in this study represent student understanding of the overall concept.

  10. Trainable Gene Regulation Networks with Applications to Drosophila Pattern Formation

    NASA Technical Reports Server (NTRS)

    Mjolsness, Eric

    2000-01-01

    This chapter will very briefly introduce and review some computational experiments in using trainable gene regulation network models to simulate and understand selected episodes in the development of the fruit fly, Drosophila melanogaster. For details the reader is referred to the papers introduced below. It will then introduce a new gene regulation network model which can describe promoter-level substructure in gene regulation. As described in chapter 2, gene regulation may be thought of as a combination of cis-acting regulation by the extended promoter of a gene (including all regulatory sequences) by way of the transcription complex, and of trans-acting regulation by the transcription factor products of other genes. If we simplify the cis-action by using a phenomenological model which can be tuned to data, such as a unit or other small portion of an artificial neural network, then the full transacting interaction between multiple genes during development can be modelled as a larger network which can again be tuned or trained to data. The larger network will in general need to have recurrent (feedback) connections since at least some real gene regulation networks do. This is the basic modeling approach taken, which describes how a set of recurrent neural networks can be used as a modeling language for multiple developmental processes including gene regulation within a single cell, cell-cell communication, and cell division. Such network models have been called "gene circuits", "gene regulation networks", or "genetic regulatory networks", sometimes without distinguishing the models from the actual modeled systems.

  11. Dataset for petroleum based stock markets and GAUSS codes for SAMEM.

    PubMed

    Khalifa, Ahmed A A; Bertuccelli, Pietro; Otranto, Edoardo

    2017-02-01

    This article includes a unique balanced daily data set (Monday, Tuesday, and Wednesday observations) of oil and natural gas volatility and of the stock markets of the oil-rich economies of Saudi Arabia, Qatar, Kuwait, Abu Dhabi, Dubai, Bahrain, and Oman, covering the period spanning Oct. 18, 2006-July 30, 2015. Additionally, we have included unique GAUSS codes for estimating the spillover asymmetric multiplicative error model (SAMEM) with an application to petroleum-based stock markets. The data, the model, and the codes have many applications in business and social science.

  12. Can Drosophila melanogaster represent a model system for the detection of reproductive adverse drug reactions?

    PubMed

    Avanesian, Agnesa; Semnani, Sahar; Jafari, Mahtab

    2009-08-01

    Once a molecule is identified as a potential drug, the detection of adverse drug reactions is one of the key components of its development and the FDA approval process. We propose using Drosophila melanogaster to screen for reproductive adverse drug reactions in the early stages of drug development. Compared with other non-mammalian models, D. melanogaster has many similarities to the mammalian reproductive system, including putative sex hormones and conserved proteins involved in genitourinary development. Furthermore, the D. melanogaster model would present significant advantages in time efficiency and cost-effectiveness compared with mammalian models. We present data on methotrexate (MTX) reproductive adverse events in multiple animal models, including fruit flies, as proof-of-concept for the use of the D. melanogaster model.

  13. Learning the organization: a model for health system analysis for new nurse administrators.

    PubMed

    Clark, Mary Jo

    2004-01-01

    Health systems are large and complex organizations in which multiple components and processes influence system outcomes. In order to effectively position themselves in such organizations, nurse administrators new to a system must gain a rapid understanding of overall system operation. Such understanding is facilitated by use of a model for system analysis. The model presented here examines the dynamic interrelationships between and among internal and external elements as they affect system performance. External elements to be analyzed include environmental factors and characteristics of system clientele. Internal elements flow from the mission and goals of the system and include system culture, services, resources, and outcomes.

  14. Assessing uncertainty and sensitivity of model parameterizations and parameters in WRF affecting simulated surface fluxes and land-atmosphere coupling over the Amazon region

    NASA Astrophysics Data System (ADS)

    Qian, Y.; Wang, C.; Huang, M.; Berg, L. K.; Duan, Q.; Feng, Z.; Shrivastava, M. B.; Shin, H. H.; Hong, S. Y.

    2016-12-01

    This study aims to quantify the relative importance and uncertainties of different physical processes and parameters in affecting simulated surface fluxes and land-atmosphere coupling strength over the Amazon region. We used two-legged coupling metrics, which include both terrestrial (soil moisture to surface fluxes) and atmospheric (surface fluxes to atmospheric state or precipitation) legs, to diagnose the land-atmosphere interaction and coupling strength. Observations made using the Department of Energy's Atmospheric Radiation Measurement (ARM) Mobile Facility during the GoAmazon field campaign together with satellite and reanalysis data are used to evaluate model performance. To quantify the uncertainty in physical parameterizations, we performed a 120-member ensemble of simulations with the WRF model using a stratified experimental design including 6 cloud microphysics, 3 convection, 6 PBL and surface layer, and 3 land surface schemes. A multiple-way analysis of variance approach is used to quantitatively analyze the inter- and intra-group (scheme) means and variances. To quantify parameter sensitivity, we conducted an additional 256 WRF simulations in which an efficient sampling algorithm is used to explore the multi-dimensional parameter space. Three uncertainty quantification approaches are applied for sensitivity analysis (SA) of multiple variables of interest to 20 selected parameters in the YSU PBL and MM5 surface layer schemes. Results show consistent parameter sensitivity across different SA methods. We found that 5 out of 20 parameters contribute more than 90% of the total variance, and that first-order effects dominate compared to the interaction effects. Results of this uncertainty quantification study serve as guidance for better understanding the roles of different physical processes in land-atmosphere interactions, quantifying model uncertainties from various sources such as physical processes, parameters and structural errors, and providing insights for improving the model physics parameterizations.
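
    A much-simplified sketch of attributing ensemble spread to a physics-scheme group, not the study's multi-way ANOVA: a one-way variance decomposition of a scalar diagnostic by microphysics scheme on synthetic ensemble output. The factorial layout, effect sizes, and diagnostic are illustrative only.

      import numpy as np

      rng = np.random.default_rng(3)
      n_micro, n_conv, n_pbl, n_land = 6, 3, 6, 3   # illustrative full factorial, not the paper's stratified design

      # Synthetic diagnostic (e.g. mean latent heat flux) with a microphysics signal plus noise.
      micro_effect = rng.normal(0, 2.0, n_micro)
      diag = np.array([[[[micro_effect[m] + rng.normal(0, 1.0)
                          for _ in range(n_land)]
                         for _ in range(n_pbl)]
                        for _ in range(n_conv)]
                       for m in range(n_micro)])

      grand_mean = diag.mean()
      group_means = diag.mean(axis=(1, 2, 3))        # mean per microphysics scheme
      n_per_group = n_conv * n_pbl * n_land

      ss_between = n_per_group * np.sum((group_means - grand_mean) ** 2)
      ss_total = np.sum((diag - grand_mean) ** 2)
      print(f"Variance share explained by microphysics choice: {ss_between / ss_total:.2%}")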

  15. Screening Models of Aquifer Heterogeneity Using the Flow Dimension

    NASA Astrophysics Data System (ADS)

    Walker, D. D.; Cello, P. A.; Roberts, R. M.; Valocchi, A. J.

    2007-12-01

    Despite advances in test interpretation and modeling, typical groundwater modeling studies only indirectly use the parameters and information inferred from hydraulic tests. In particular, the Generalized Radial Flow approach to test interpretation infers the flow dimension, a parameter describing the geometry of the flow field during a hydraulic test. Noninteger values of the flow dimension often are inferred for tests in highly heterogeneous aquifers, yet subsequent modeling studies typically ignore the flow dimension. Monte Carlo analyses of detailed numerical models of aquifer tests examine the flow dimension for several stochastic models of heterogeneous transmissivity, T(x). These include multivariate lognormal, fractional Brownian motion, a site percolation network, and discrete linear features with lengths distributed as a power law. The behavior of the simulated flow dimensions is compared to the flow dimensions observed for multiple aquifer tests in a fractured dolomite aquifer in the Great Lakes region of North America. The combination of multiple hydraulic tests, observed fracture patterns, and the Monte Carlo results is used to screen models of heterogeneity and their parameters for subsequent groundwater flow modeling.

  16. Quantitative assessment of cervical vertebral maturation using cone beam computed tomography in Korean girls.

    PubMed

    Byun, Bo-Ram; Kim, Yong-Il; Yamaguchi, Tetsutaro; Maki, Koutaro; Son, Woo-Sung

    2015-01-01

    This study aimed to examine the correlation between skeletal maturation status and parameters from the odontoid process/body of the second cervical vertebra and the bodies of the third and fourth cervical vertebrae, and to build multiple regression models for estimating skeletal maturation status in Korean girls. Hand-wrist radiographs and cone beam computed tomography (CBCT) images were obtained from 74 Korean girls (6-18 years of age). CBCT-generated cervical vertebral maturation (CVM) was used to demarcate the odontoid process and the body of the second cervical vertebra, based on the dentocentral synchondrosis. Correlation coefficient analysis and multiple linear regression analysis were used for each parameter of the cervical vertebrae (P < 0.05). Forty-seven of 64 parameters from CBCT-generated CVM (independent variables) exhibited statistically significant correlations (P < 0.05). The multiple regression model with the greatest R² had six parameters (PH2/W2, UW2/W2, (OH+AH2)/LW2, UW3/LW3, D3, and H4/W4) as independent variables, each with a variance inflation factor (VIF) of <2. CBCT-generated CVM allowed parameters from both the odontoid process and the body of the second cervical vertebra to be included in the multiple regression models. This suggests that quantitative analysis might be used to estimate skeletal maturation status.
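
    A hedged sketch of the general recipe described above, fitting a multiple regression and screening predictors by variance inflation factor with statsmodels; the data are synthetic and the column names merely stand in for ratios such as PH2/W2, so this is not a reproduction of the published models.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.stats.outliers_influence import variance_inflation_factor

      rng = np.random.default_rng(4)
      n = 74
      X = pd.DataFrame({
          "PH2_W2": rng.normal(0.6, 0.1, n),    # hypothetical stand-ins for the vertebral ratios
          "UW2_W2": rng.normal(0.8, 0.1, n),
          "UW3_LW3": rng.normal(0.9, 0.1, n),
          "H4_W4": rng.normal(0.7, 0.1, n),
      })
      maturation = 2 + 3 * X["PH2_W2"] + 2 * X["H4_W4"] + rng.normal(0, 0.3, n)

      Xc = sm.add_constant(X)
      fit = sm.OLS(maturation, Xc).fit()
      print(f"R2 = {fit.rsquared:.3f}")

      # VIF < 2 was the multicollinearity criterion cited in the abstract.
      vifs = {col: variance_inflation_factor(Xc.values, i)
              for i, col in enumerate(Xc.columns) if col != "const"}
      print(vifs)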

  17. Modeling the Potential Effects of New Tobacco Products and Policies. A Dynamic Population Model for Multiple Product Use and Harm

    DOE PAGES

    Vugrin, Eric D.; Rostron, Brian L.; Verzi, Stephen J.; ...

    2015-03-27

    Background Recent declines in US cigarette smoking prevalence have coincided with increases in use of other tobacco products. Multiple product tobacco models can help assess the population health impacts associated with use of a wide range of tobacco products. Methods and Findings We present a multi-state, dynamical systems population structure model that can be used to assess the effects of tobacco product use behaviors on population health. The model incorporates transition behaviors, such as initiation, cessation, switching, and dual use, related to the use of multiple products. The model tracks product use prevalence and mortality attributable to tobacco use for the overall population and by sex and age group. The model can also be used to estimate differences in these outcomes between scenarios by varying input parameter values. We demonstrate model capabilities by projecting future cigarette smoking prevalence and smoking-attributable mortality and then simulating the effects of introduction of a hypothetical new lower-risk tobacco product under a variety of assumptions about product use. Sensitivity analyses were conducted to examine the range of population impacts that could occur due to differences in input values for product use and risk. We demonstrate that potential benefits from cigarette smokers switching to the lower-risk product can be offset over time through increased initiation of this product. Model results show that population health benefits are particularly sensitive to product risks and initiation, switching, and dual use behaviors. Conclusion Our model incorporates the variety of tobacco use behaviors and risks that occur with multiple products. As such, it can evaluate the population health impacts associated with the introduction of new tobacco products or policies that may result in product switching or dual use. Further model development will include refinement of data inputs for non-cigarette tobacco products and inclusion of health outcomes such as morbidity and disability.

  18. Modeling the Potential Effects of New Tobacco Products and Policies: A Dynamic Population Model for Multiple Product Use and Harm

    PubMed Central

    Vugrin, Eric D.; Rostron, Brian L.; Verzi, Stephen J.; Brodsky, Nancy S.; Brown, Theresa J.; Choiniere, Conrad J.; Coleman, Blair N.; Paredes, Antonio; Apelberg, Benjamin J.

    2015-01-01

    Background Recent declines in US cigarette smoking prevalence have coincided with increases in use of other tobacco products. Multiple product tobacco models can help assess the population health impacts associated with use of a wide range of tobacco products. Methods and Findings We present a multi-state, dynamical systems population structure model that can be used to assess the effects of tobacco product use behaviors on population health. The model incorporates transition behaviors, such as initiation, cessation, switching, and dual use, related to the use of multiple products. The model tracks product use prevalence and mortality attributable to tobacco use for the overall population and by sex and age group. The model can also be used to estimate differences in these outcomes between scenarios by varying input parameter values. We demonstrate model capabilities by projecting future cigarette smoking prevalence and smoking-attributable mortality and then simulating the effects of introduction of a hypothetical new lower-risk tobacco product under a variety of assumptions about product use. Sensitivity analyses were conducted to examine the range of population impacts that could occur due to differences in input values for product use and risk. We demonstrate that potential benefits from cigarette smokers switching to the lower-risk product can be offset over time through increased initiation of this product. Model results show that population health benefits are particularly sensitive to product risks and initiation, switching, and dual use behaviors. Conclusion Our model incorporates the variety of tobacco use behaviors and risks that occur with multiple products. As such, it can evaluate the population health impacts associated with the introduction of new tobacco products or policies that may result in product switching or dual use. Further model development will include refinement of data inputs for non-cigarette tobacco products and inclusion of health outcomes such as morbidity and disability. PMID:25815840

  19. Modeling the potential effects of new tobacco products and policies: a dynamic population model for multiple product use and harm.

    PubMed

    Vugrin, Eric D; Rostron, Brian L; Verzi, Stephen J; Brodsky, Nancy S; Brown, Theresa J; Choiniere, Conrad J; Coleman, Blair N; Paredes, Antonio; Apelberg, Benjamin J

    2015-01-01

    Recent declines in US cigarette smoking prevalence have coincided with increases in use of other tobacco products. Multiple product tobacco models can help assess the population health impacts associated with use of a wide range of tobacco products. We present a multi-state, dynamical systems population structure model that can be used to assess the effects of tobacco product use behaviors on population health. The model incorporates transition behaviors, such as initiation, cessation, switching, and dual use, related to the use of multiple products. The model tracks product use prevalence and mortality attributable to tobacco use for the overall population and by sex and age group. The model can also be used to estimate differences in these outcomes between scenarios by varying input parameter values. We demonstrate model capabilities by projecting future cigarette smoking prevalence and smoking-attributable mortality and then simulating the effects of introduction of a hypothetical new lower-risk tobacco product under a variety of assumptions about product use. Sensitivity analyses were conducted to examine the range of population impacts that could occur due to differences in input values for product use and risk. We demonstrate that potential benefits from cigarette smokers switching to the lower-risk product can be offset over time through increased initiation of this product. Model results show that population health benefits are particularly sensitive to product risks and initiation, switching, and dual use behaviors. Our model incorporates the variety of tobacco use behaviors and risks that occur with multiple products. As such, it can evaluate the population health impacts associated with the introduction of new tobacco products or policies that may result in product switching or dual use. Further model development will include refinement of data inputs for non-cigarette tobacco products and inclusion of health outcomes such as morbidity and disability.

  20. Modeling the Potential Effects of New Tobacco Products and Policies. A Dynamic Population Model for Multiple Product Use and Harm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vugrin, Eric D.; Rostron, Brian L.; Verzi, Stephen J.

    Background Recent declines in US cigarette smoking prevalence have coincided with increases in use of other tobacco products. Multiple product tobacco models can help assess the population health impacts associated with use of a wide range of tobacco products. Methods and Findings We present a multi-state, dynamical systems population structure model that can be used to assess the effects of tobacco product use behaviors on population health. The model incorporates transition behaviors, such as initiation, cessation, switching, and dual use, related to the use of multiple products. The model tracks product use prevalence and mortality attributable to tobacco use for the overall population and by sex and age group. The model can also be used to estimate differences in these outcomes between scenarios by varying input parameter values. We demonstrate model capabilities by projecting future cigarette smoking prevalence and smoking-attributable mortality and then simulating the effects of introduction of a hypothetical new lower-risk tobacco product under a variety of assumptions about product use. Sensitivity analyses were conducted to examine the range of population impacts that could occur due to differences in input values for product use and risk. We demonstrate that potential benefits from cigarette smokers switching to the lower-risk product can be offset over time through increased initiation of this product. Model results show that population health benefits are particularly sensitive to product risks and initiation, switching, and dual use behaviors. Conclusion Our model incorporates the variety of tobacco use behaviors and risks that occur with multiple products. As such, it can evaluate the population health impacts associated with the introduction of new tobacco products or policies that may result in product switching or dual use. Further model development will include refinement of data inputs for non-cigarette tobacco products and inclusion of health outcomes such as morbidity and disability.
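
    A minimal discrete-time sketch in the spirit of the multi-state population model described in the preceding abstracts. The states and annual transition probabilities below are made-up illustrations, and the age/sex structure, mortality tracking, and scenario comparisons of the published model are omitted.

      import numpy as np

      states = ["never", "cigarette", "new_product", "dual", "former"]
      # Hypothetical annual transition probabilities; each row sums to 1.
      P = np.array([
          [0.95, 0.03, 0.02, 0.00, 0.00],   # never user
          [0.00, 0.85, 0.05, 0.05, 0.05],   # cigarette only
          [0.00, 0.02, 0.90, 0.04, 0.04],   # new product only
          [0.00, 0.05, 0.10, 0.80, 0.05],   # dual use
          [0.00, 0.03, 0.02, 0.00, 0.95],   # former user
      ])

      prev = np.array([0.60, 0.20, 0.05, 0.05, 0.10])   # illustrative initial prevalence
      for year in range(20):                            # project 20 years ahead
          prev = prev @ P
      print(dict(zip(states, np.round(prev, 3))))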

  1. Multiple re-encounter approach to radical pair reactions and the role of nonlinear master equations.

    PubMed

    Clausen, Jens; Guerreschi, Gian Giacomo; Tiersch, Markus; Briegel, Hans J

    2014-08-07

    We formulate a multiple-encounter model of the radical pair mechanism that is based on a random coupling of the radical pair to a minimal model environment. These occasional pulse-like couplings correspond to the radical encounters and give rise to both dephasing and recombination. While this is in agreement with the original model of Haberkorn and its extensions that assume additional dephasing, we show how a nonlinear master equation may be constructed to describe the conditional evolution of the radical pairs prior to the detection of their recombination. We propose a nonlinear master equation for the evolution of an ensemble of independently evolving radical pairs whose nonlinearity depends on the record of the fluorescence signal. We also reformulate Haberkorn's original argument on the physicality of reaction operators using the terminology of quantum optics/open quantum systems. Our model allows one to describe multiple encounters within the exponential model and connects this with the master equation approach. We include hitherto neglected effects of the encounters, such as a separate dephasing in the triplet subspace, and predict potential new effects, such as Grover reflections of radical spins, that may be observed if the strength and time of the encounters can be experimentally controlled.
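
    For orientation only (not quoted from the paper), the conventional linear Haberkorn master equation that the nonlinear extensions discussed above build on is usually written as

      \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho] \;-\; \frac{k_S}{2}\{P_S,\rho\} \;-\; \frac{k_T}{2}\{P_T,\rho\},

    where P_S and P_T project onto the singlet and triplet subspaces of the radical pair and k_S, k_T are the corresponding recombination rates; the nonlinear master equation of the abstract conditions this evolution on recombination not yet having been detected.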

  2. Spatio-Temporal Data Model for Integrating Evolving Nation-Level Datasets

    NASA Astrophysics Data System (ADS)

    Sorokine, A.; Stewart, R. N.

    2017-10-01

    Ability to easily combine data from diverse sources in a single analytical workflow is one of the greatest promises of Big Data technologies. However, such integration is often challenging because datasets originate from different vendors, governments, and research communities, which results in multiple incompatibilities including data representations, formats, and semantics. Semantic differences are the hardest to handle: different communities often use different attribute definitions and associate the records with different sets of evolving geographic entities. Analysis of global socioeconomic variables across multiple datasets over prolonged time is often complicated by differences in how the boundaries and histories of countries or other geographic entities are represented. Here we propose an event-based data model for depicting and tracking the histories of evolving geographic units (countries, provinces, etc.) and their representations in disparate data. The model addresses the semantic challenge of preserving the identity of geographic entities over time by defining criteria for an entity's existence, a set of events that may affect its existence, and rules for mapping between different representations (datasets). The proposed model is used for maintaining an evolving compound database of global socioeconomic and environmental data harvested from multiple sources. A practical implementation of our model is demonstrated using a PostgreSQL object-relational database with temporal, geospatial, and NoSQL database extensions.
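
    A conceptual in-memory sketch of the event-based idea, not the published PostgreSQL implementation: a geographic entity carries an event history (creation, split, merge, dissolution), and a record's label is resolved to the entities that existed on a given date. Class and field names are hypothetical.

      from dataclasses import dataclass, field
      from datetime import date

      @dataclass
      class Event:
          kind: str            # "created", "renamed", "split", "merged", "dissolved"
          when: date
          details: str = ""

      @dataclass
      class GeoEntity:
          name: str
          events: list[Event] = field(default_factory=list)

          def exists_on(self, day: date) -> bool:
              created = min(e.when for e in self.events if e.kind == "created")
              ended = [e.when for e in self.events if e.kind in ("dissolved", "merged")]
              return created <= day and (not ended or day < min(ended))

      sudan = GeoEntity("Sudan", [Event("created", date(1956, 1, 1))])
      south_sudan = GeoEntity("South Sudan", [Event("created", date(2011, 7, 9),
                                                    "split from Sudan")])
      sudan.events.append(Event("split", date(2011, 7, 9), "South Sudan separated"))

      # Resolve which entities exist for a record dated 2005 versus 2015.
      for year in (2005, 2015):
          candidates = [e.name for e in (sudan, south_sudan) if e.exists_on(date(year, 6, 1))]
          print(year, candidates)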

  3. Coordinated learning of grid cell and place cell spatial and temporal properties: multiple scales, attention and oscillations.

    PubMed

    Grossberg, Stephen; Pilly, Praveen K

    2014-02-05

    A neural model proposes how entorhinal grid cells and hippocampal place cells may develop as spatial categories in a hierarchy of self-organizing maps (SOMs). The model responds to realistic rat navigational trajectories by learning both grid cells with hexagonal grid firing fields of multiple spatial scales, and place cells with one or more firing fields, that match neurophysiological data about their development in juvenile rats. Both grid and place cells can develop by detecting, learning and remembering the most frequent and energetic co-occurrences of their inputs. The model's parsimonious properties include: similar ring attractor mechanisms process linear and angular path integration inputs that drive map learning; the same SOM mechanisms can learn grid cell and place cell receptive fields; and the learning of the dorsoventral organization of multiple spatial scale modules through medial entorhinal cortex to hippocampus (HC) may use mechanisms homologous to those for temporal learning through lateral entorhinal cortex to HC ('neural relativity'). The model clarifies how top-down HC-to-entorhinal attentional mechanisms may stabilize map learning, simulates how hippocampal inactivation may disrupt grid cells, and explains data about theta, beta and gamma oscillations. The article also compares the three main types of grid cell models in the light of recent data.

  4. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
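
    A simplified Python sketch of the ROS idea for a single detection limit, intended only to illustrate the mechanics; the software described above is S/R-based and handles multiple detection limits with more careful plotting-position formulas.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      true = rng.lognormal(mean=0.0, sigma=1.0, size=60)
      dl = 0.5                                        # single detection limit
      detected = true[true >= dl]
      n_cens = np.sum(true < dl)
      n = true.size

      # Hazen plotting positions; censored values occupy the lowest ranks.
      ranks = np.arange(1, n + 1)
      pp = (ranks - 0.5) / n
      pp_detected = pp[n_cens:]
      pp_censored = pp[:n_cens]

      # Regress log(detected) on normal quantiles of the plotting positions.
      z_det = stats.norm.ppf(pp_detected)
      slope, intercept = np.polyfit(z_det, np.log(np.sort(detected)), 1)

      # Impute the censored observations from the fitted line.
      imputed = np.exp(intercept + slope * stats.norm.ppf(pp_censored))

      combined = np.concatenate([imputed, np.sort(detected)])
      print(f"ROS mean estimate: {combined.mean():.3f} "
            f"(naive substitution at DL/2: "
            f"{np.concatenate([np.full(n_cens, dl / 2), detected]).mean():.3f})")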

  5. Integrated presentation of ecological risk from multiple stressors

    NASA Astrophysics Data System (ADS)

    Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman

    2016-10-01

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.

  6. Integrated presentation of ecological risk from multiple stressors.

    PubMed

    Goussen, Benoit; Price, Oliver R; Rendal, Cecilie; Ashauer, Roman

    2016-10-26

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.

  7. Diverse ways of perturbing the human arachidonic acid metabolic network to control inflammation.

    PubMed

    Meng, Hu; Liu, Ying; Lai, Luhua

    2015-08-18

    Inflammation and other common disorders including diabetes, cardiovascular disease, and cancer are often the result of several molecular abnormalities and are not likely to be resolved by a traditional single-target drug discovery approach. Though inflammation is a normal bodily reaction, uncontrolled and misdirected inflammation can cause inflammatory diseases such as rheumatoid arthritis and asthma. Nonsteroidal anti-inflammatory drugs including aspirin, ibuprofen, naproxen, or celecoxib are commonly used to relieve aches and pains, but often these drugs have undesirable and sometimes even fatal side effects. To facilitate safer and more effective anti-inflammatory drug discovery, a balanced treatment strategy should be developed at the biological network level. In this Account, we focus on our recent progress in modeling the inflammation-related arachidonic acid (AA) metabolic network and subsequent multiple drug design. We first constructed a mathematical model of inflammation based on experimental data and then applied the model to simulate the effects of commonly used anti-inflammatory drugs. Our results indicated that the model correctly reproduced the established bleeding and cardiovascular side effects. Multitarget optimal intervention (MTOI), a Monte Carlo simulated annealing based computational scheme, was then developed to identify key targets and optimal solutions for controlling inflammation. A number of optimal multitarget strategies were discovered that were both effective and safe and had minimal associated side effects. Experimental studies were performed to evaluate these multitarget control solutions further using different combinations of inhibitors to perturb the network. Consequently, simultaneous control of cyclooxygenase-1 and -2 and leukotriene A4 hydrolase, as well as 5-lipoxygenase and prostaglandin E2 synthase were found to be among the best solutions. A single compound that can bind multiple targets presents advantages including low risk of drug-drug interactions and robustness regarding concentration fluctuations. Thus, we developed strategies for multiple-target drug design and successfully discovered several series of multiple-target inhibitors. Optimal solutions for a disease network often involve mild but simultaneous interventions of multiple targets, which is in accord with the philosophy of traditional Chinese medicine (TCM). To this end, our AA network model can aptly explain TCM anti-inflammatory herbs and formulas at the molecular level. We also aimed to identify activators for several enzymes that appeared to have increased activity based on MTOI outcomes. Strategies were then developed to predict potential allosteric sites and to discover enzyme activators based on our hypothesis that combined treatment with the projected activators and inhibitors could balance different AA network pathways, control inflammation, and reduce associated adverse effects. Our work demonstrates that the integration of network modeling and drug discovery can provide novel solutions for disease control, which also calls for new developments in drug design concepts and methodologies. With the rapid accumulation of quantitative data and knowledge of the molecular networks of disease, we can expect an increase in the development and use of quantitative disease models to facilitate efficient and safe drug discovery.

  8. Group-oriented coordination models for distributed client-server computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Hughes, Craig S.

    1994-01-01

    This paper describes group-oriented control models for distributed client-server interactions. These models transparently coordinate requests for services that involve multiple servers, such as queries across distributed databases. Specific capabilities include: decomposing and replicating client requests; dispatching request subtasks or copies to independent, networked servers; and combining server results into a single response for the client. The control models were implemented by combining request broker and process group technologies with an object-oriented communication middleware tool. The models are illustrated in the context of a distributed operations support application for space-based systems.
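
    A hedged sketch of the coordination pattern described above, written with Python's standard concurrent.futures rather than the paper's request-broker and process-group middleware: a client request is replicated to several servers, dispatched in parallel, and the partial results are combined into one response. The server names and query function are hypothetical stand-ins.

      from concurrent.futures import ThreadPoolExecutor, as_completed

      SERVERS = ["db-east", "db-west", "db-europe"]      # hypothetical server names

      def query_server(server: str, request: str) -> list[str]:
          """Placeholder for a real remote call; returns this server's partial results."""
          return [f"{server}:{request}:row{i}" for i in range(2)]

      def coordinated_query(request: str) -> list[str]:
          """Fan the request out to all servers and merge the partial results."""
          results: list[str] = []
          with ThreadPoolExecutor(max_workers=len(SERVERS)) as pool:
              futures = {pool.submit(query_server, s, request): s for s in SERVERS}
              for fut in as_completed(futures):
                  results.extend(fut.result())
          return sorted(results)

      print(coordinated_query("SELECT status"))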

  9. Statistical design and analysis for plant cover studies with multiple sources of observation errors

    USGS Publications Warehouse

    Wright, Wilson; Irvine, Kathryn M.; Warren, Jeffrey M.; Barnett, Jenny K.

    2017-01-01

    Effective wildlife habitat management and conservation requires understanding the factors influencing distribution and abundance of plant species. Field studies, however, have documented observation errors in visually estimated plant cover including measurements which differ from the true value (measurement error) and not observing a species that is present within a plot (detection error). Unlike the rapid expansion of occupancy and N-mixture models for analysing wildlife surveys, development of statistical models accounting for observation error in plants has not progressed quickly. Our work informs development of a monitoring protocol for managed wetlands within the National Wildlife Refuge System. Zero-augmented beta (ZAB) regression is the most suitable method for analysing areal plant cover recorded as a continuous proportion but assumes no observation errors. We present a model extension that explicitly includes the observation process thereby accounting for both measurement and detection errors. Using simulations, we compare our approach to a ZAB regression that ignores observation errors (naïve model) and an “ad hoc” approach using a composite of multiple observations per plot within the naïve model. We explore how sample size and within-season revisit design affect the ability to detect a change in mean plant cover between 2 years using our model. Explicitly modelling the observation process within our framework produced unbiased estimates and nominal coverage of model parameters. The naïve and “ad hoc” approaches resulted in underestimation of occurrence and overestimation of mean cover. The degree of bias was primarily driven by imperfect detection and its relationship with cover within a plot. Conversely, measurement error had minimal impacts on inferences. We found that >30 plots with at least three within-season revisits achieved reasonable posterior probabilities for assessing change in mean plant cover. For rapid adoption and application, code for Bayesian estimation of our single-species ZAB with errors model is included. Practitioners utilizing our R-based simulation code can explore trade-offs among different survey efforts and parameter values, as we did, but tuned to their own investigation. Less abundant plant species of high ecological interest may warrant the additional cost of gathering multiple independent observations in order to guard against erroneous conclusions.

  10. An educational rationale for deaf students with multiple disabilities.

    PubMed

    Ewing, Karen M; Jones, Thomas W

    2003-01-01

    Deaf students with multiple disabilities have a long history of limited opportunity, including limited access to educational opportunities available to their deaf peers. This article places the individual needs of deaf students with multiple disabilities in the context that guides much of deaf education--the importance of language acquisition. That emphasis provides a basis for placement and curriculum options for deaf students with multiple disabilities. The authors review the evolution of placement options, describe assumptions that should guide placement and curriculum decisions, and recommend practices for optimizing these students' education. Descriptions of three service delivery models--multidisciplinary, interdisciplinary, and transdisciplinary--are provided, as well as an overview of the effectiveness of person-centered planning for deaf students with multiple disabilities. Disability-specific resources are highlighted that relate to mental retardation, autism, visual impairments, learning disabilities, attention deficit hyperactivity disorder, emotional disorders, medical issues, and general resources.

  11. Robust fuzzy control subject to state variance and passivity constraints for perturbed nonlinear systems with multiplicative noises.

    PubMed

    Chang, Wen-Jer; Huang, Bo-Jyun

    2014-11-01

    The multi-constrained robust fuzzy control problem is investigated in this paper for perturbed continuous-time nonlinear stochastic systems. The nonlinear system considered in this paper is represented by a Takagi-Sugeno fuzzy model with perturbations and state multiplicative noises. The multiple performance constraints considered in this paper include stability, passivity and individual state variance constraints. The Lyapunov stability theory is employed to derive sufficient conditions to achieve the above performance constraints. By solving these sufficient conditions, the contribution of this paper is to develop a parallel distributed compensation based robust fuzzy control approach to satisfy multiple performance constraints for perturbed nonlinear systems with multiplicative noises. Finally, a numerical example for the control of a perturbed inverted pendulum system is provided to illustrate the applicability and effectiveness of the proposed multi-constrained robust fuzzy control method. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  12. A Bayesian Poisson-lognormal Model for Count Data for Multiple-Trait Multiple-Environment Genomic-Enabled Prediction

    PubMed Central

    Montesinos-López, Osval A.; Montesinos-López, Abelardo; Crossa, José; Toledo, Fernando H.; Montesinos-López, José C.; Singh, Pawan; Juliana, Philomin; Salinas-Ruiz, Josafhat

    2017-01-01

    When a plant scientist wishes to make genomic-enabled predictions of multiple traits measured in multiple individuals in multiple environments, the most common strategy for performing the analysis is to use a single trait at a time taking into account genotype × environment interaction (G × E), because there is a lack of comprehensive models that simultaneously take into account the correlated counting traits and G × E. For this reason, in this study we propose a multiple-trait and multiple-environment model for count data. The proposed model was developed under the Bayesian paradigm for which we developed a Markov Chain Monte Carlo (MCMC) with noninformative priors. This allows obtaining all required full conditional distributions of the parameters leading to an exact Gibbs sampler for the posterior distribution. Our model was tested with simulated data and a real data set. Results show that the proposed multi-trait, multi-environment model is an attractive alternative for modeling multiple count traits measured in multiple environments. PMID:28364037

  13. A computer program to calculate the longitudinal aerodynamic characteristics of wing-flap configurations with externally blown flaps

    NASA Technical Reports Server (NTRS)

    Mendenhall, M. R.; Goodwin, F. K.; Spangler, S. B.

    1976-01-01

    A vortex lattice lifting-surface method is used to model the wing and multiple flaps. Each lifting surface may be of arbitrary planform having camber and twist, and the multiple-slotted trailing-edge flap system may consist of up to ten flaps with different spans and deflection angles. The engine wakes model consists of a series of closely spaced vortex rings with circular or elliptic cross sections. The rings are normal to a wake centerline which is free to move vertically and laterally to accommodate the local flow field beneath the wing and flaps. The two potential flow models are used in an iterative fashion to calculate the wing-flap loading distribution including the influence of the wakes from up to two turbofan engines on the semispan. The method is limited to the condition where the flow and geometry of the configurations are symmetric about the vertical plane containing the wing root chord. The calculation procedure starts with arbitrarily positioned wake centerlines and the iterative calculation continues until the total configuration loading converges within a prescribed tolerance. Program results include total configuration forces and moments, individual lifting-surface load distributions, including pressure distributions, individual flap hinge moments, and flow field calculation at arbitrary field points.

  14. Assessing NARCCAP climate model effects using spatial confidence regions.

    PubMed

    French, Joshua P; McGinnis, Seth; Schwartzman, Armin

    2017-01-01

    We assess similarities and differences between model effects for the North American Regional Climate Change Assessment Program (NARCCAP) climate models using varying classes of linear regression models. Specifically, we consider how the average temperature effect differs for the various global and regional climate model combinations, including assessment of possible interaction between the effects of global and regional climate models. We use both pointwise and simultaneous inference procedures to identify regions where global and regional climate model effects differ. We also show conclusively that results from pointwise inference are misleading, and that accounting for multiple comparisons is important for making proper inference.
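
    A small sketch echoing the point above that pointwise inference overstates where effects differ: it contrasts pointwise confidence bounds with Bonferroni-adjusted simultaneous bounds on a synthetic grid with no true effect. The data and the Bonferroni adjustment are illustrative; the paper's regions and inference procedures are not reproduced.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      n_grid, n_rep = 500, 30
      true_effect = np.zeros(n_grid)                 # no real difference anywhere
      samples = rng.normal(true_effect, 1.0, size=(n_rep, n_grid))

      mean = samples.mean(axis=0)
      se = samples.std(axis=0, ddof=1) / np.sqrt(n_rep)

      alpha = 0.05
      z_point = stats.norm.ppf(1 - alpha / 2)                 # pointwise threshold
      z_simul = stats.norm.ppf(1 - alpha / (2 * n_grid))      # Bonferroni, simultaneous

      flag_point = np.abs(mean / se) > z_point
      flag_simul = np.abs(mean / se) > z_simul
      print(f"grid cells flagged pointwise: {flag_point.sum()}, "
            f"simultaneous: {flag_simul.sum()}")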

  15. Multiple-Parameter, Low-False-Alarm Fire-Detection Systems

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.; Greensburg, Paul; McKnight, Robert; Xu, Jennifer C.; Liu, C. C.; Dutta, Prabir; Makel, Darby; Blake, D.; Sue-Antillio, Jill

    2007-01-01

    Fire-detection systems incorporating multiple sensors that measure multiple parameters are being developed for use in storage depots, cargo bays of ships and aircraft, and other locations not amenable to frequent, direct visual inspection. These systems are intended to improve upon conventional smoke detectors, now used in such locations, that reliably detect fires but also frequently generate false alarms: for example, conventional smoke detectors based on the blockage of light by smoke particles are also affected by dust particles and water droplets and, thus, are often susceptible to false alarms. In contrast, by utilizing multiple parameters associated with fires, i.e. not only obscuration by smoke particles but also concentrations of multiple chemical species that are commonly generated in combustion, false alarms can be significantly decreased while still detecting fires as reliably as older smoke-detector systems do. The present development includes fabrication of sensors that have, variously, micrometer- or nanometer-sized features so that such multiple sensors can be integrated into arrays that have sizes, weights, and power demands smaller than those of older macroscopic sensors. The sensors include resistors, electrochemical cells, and Schottky diodes that exhibit different sensitivities to the various airborne chemicals of interest. In a system of this type, the sensor readings are digitized and processed by advanced signal-processing hardware and software to extract such chemical indications of fires as abnormally high concentrations of CO and CO2, possibly in combination with H2 and/or hydrocarbons. The system also includes a microelectromechanical systems (MEMS)-based particle detector and classifier device to increase the reliability of measurements of chemical species and particulates. In parallel research, software for modeling the evolution of a fire within an aircraft cargo bay has been developed. The model implemented in the software can describe the concentrations of chemical species and of particulate matter as functions of time. A system of the present developmental type and a conventional fire detector were tested under both fire and false-alarm conditions in a Federal Aviation Administration cargo-compartment- testing facility. Both systems consistently detected fires. However, the conventional fire detector consistently generated false alarms, whereas the developmental system did not generate any false alarms.

  16. Individual and social determinants of multiple chronic disease behavioral risk factors among youth.

    PubMed

    Alamian, Arsham; Paradis, Gilles

    2012-03-22

    Behavioral risk factors are known to co-occur among youth, and to increase the risk of chronic disease morbidity and mortality later in life. However, little is known about determinants of multiple chronic disease behavioral risk factors, particularly among youth. Previous studies have been cross-sectional and carried out without a sound theoretical framework. Using longitudinal data (n = 1135) from Cycle 4 (2000-2001), Cycle 5 (2002-2003) and Cycle 6 (2004-2005) of the National Longitudinal Survey of Children and Youth, a nationally representative sample of Canadian children who are followed biennially, the present study examines the influence of a set of conceptually-related individual/social distal variables (variables situated at an intermediate distance from behaviors), and individual/social ultimate variables (variables situated at an utmost distance from behaviors) on the rate of occurrence of multiple behavioral risk factors (physical inactivity, sedentary behavior, tobacco smoking, alcohol drinking, and high body mass index) in a sample of children aged 10-11 years at baseline. Multiple behavioral risk factors were assessed using a multiple risk factor score. All statistical analyses were performed using SAS, version 9.1, and SUDAAN, version 9.01. Multivariate longitudinal Poisson models showed that social distal variables including parental/peer smoking and peer drinking (Log-likelihood ratio (LLR) = 187.86, degrees of freedom (DF) = 8, p < .001), as well as individual distal variables including low self-esteem (LLR = 76.94, DF = 4, p < .001) increased the rate of occurrence of multiple behavioral risk factors. Individual ultimate variables including age, sex, and anxiety (LLR = 9.34, DF = 3, p < .05), as well as social ultimate variables including family socioeconomic status, and family structure (LLR = 10.93, DF = 5, p = .05) contributed minimally to the rate of co-occurrence of behavioral risk factors. The results suggest targeting individual/social distal variables in prevention programs of multiple chronic disease behavioral risk factors among youth.
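
    An illustrative cross-sectional simplification of the analysis above: the count of behavioral risk factors (0-5) is regressed on distal and ultimate predictors with a Poisson GLM in statsmodels. The data are synthetic, the predictor names are placeholders, and the repeated-measures structure and survey weighting of the original study are not modeled.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n = 1135
      df = pd.DataFrame({
          "peer_smoking": rng.integers(0, 2, n),     # social distal
          "low_self_esteem": rng.integers(0, 2, n),  # individual distal
          "age": rng.normal(11, 0.5, n),             # individual ultimate
          "low_ses": rng.integers(0, 2, n),          # social ultimate
      })
      rate = np.exp(-0.5 + 0.6 * df["peer_smoking"] + 0.3 * df["low_self_esteem"])
      df["n_risk_factors"] = np.minimum(rng.poisson(rate), 5)   # cap at five risk factors

      X = sm.add_constant(df[["peer_smoking", "low_self_esteem", "age", "low_ses"]])
      fit = sm.GLM(df["n_risk_factors"], X, family=sm.families.Poisson()).fit()
      print(np.exp(fit.params))    # rate ratios per predictor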

  17. Organotypic three-dimensional culture model of mesenchymal and epithelial cells to examine tissue fusion events.

    EPA Science Inventory

    Tissue fusion during early mammalian development requires coordination of multiple cell types, the extracellular matrix, and complex signaling pathways. Fusion events during processes including heart development, neural tube closure, and palatal fusion are dependent on signaling ...

  18. STABLE ISOTOPES IN ECOLOGICAL STUDIES: NEW DEVELOPMENTS IN MIXING MODELS (URUGUAY)

    EPA Science Inventory

    Stable isotopes are increasingly being used as tracers in ecological studies. One application uses isotopic ratios to quantify the proportional contributions of multiple sources to a mixture. Examples include pollution sources for air or water bodies, food sources for animals, ...

  19. STABLE ISOTOPES IN ECOLOGICAL STUDIES: NEW DEVELOPMENTS IN MIXING MODELS (BRAZIL)

    EPA Science Inventory

    Stable isotopes are increasingly being used as tracers in ecological studies. One application uses isotopic ratios to quantify the proportional contributions of multiple sources to a mixture. Examples include pollution sources for air or water bodies, food sources for animals, ...

  20. 76 FR 55071 - Government-Owned Inventions; Availability for Licensing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-06

    ..., multiple sclerosis, rheumatoid arthritis, and lupus. Treatments generally include immunosuppressants or... anti-TL1A antibodies that prevent disease in mouse models of rheumatoid arthritis and inflammatory... Arthritis and Musculoskeletal and Skin Diseases is seeking statements of capability or interest from parties...
