Sample records for "models provide additional"

  1. Mathematical Model of Armed Helicopter vs Tank Duel

    DTIC Science & Technology

    The purpose of this thesis is to mathematically model a duel between the armed helicopter and the tank. In addition to providing a parametric...analysis of B. O. Koopman’s classical Detection-Destruction Duel, two additional models were constructed and analyzed. All three models stem from stochastic

  2. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited for space physics post-processing. Building on the work from the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The tool-kit plugin currently provides tools for reading LFM and Enlil data sets, and provides automated tools for data comparison with NASA's CDAWeb database. As work progresses, many additional tools will be added; through open-source collaboration, we hope to add readers for additional model types, as well as any additional tools deemed necessary by the scientific public. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  3. Local health department 2009 H1N1 influenza vaccination clinics-CDC staffing model comparison and other best practices.

    PubMed

    Porter, Dayna; Hall, Mark; Hartl, Brian; Raevsky, Cathy; Peacock, Roberta; Kraker, David; Walls, Sandra; Brink, Gail

    2011-01-01

    Mass vaccination clinic staffing models, such as the Centers for Disease Control and Prevention Large-Scale Vaccination Clinic Output and Staff Estimates: An Example, provide guidance on appropriate roles and number of staff for successful mass vaccination clinics within local and state health departments. The Kent County Health Department used this model as a starting point for mass vaccination clinics in response to 2009 H1N1 influenza. In addition to discussion of successful modification of the Centers for Disease Control and Prevention model to maximize local health department mass vaccination clinic efficiency, additional best practices including use of the Incident Command System and a reservation system are provided. Use of the provided modified staffing model and additional best practices will increase the success of health department mass vaccination clinics, and should be considered not only for future public health emergencies, but also for seasonal influenza vaccination campaigns.

  4. Light Z' in heterotic string standardlike models

    NASA Astrophysics Data System (ADS)

    Athanasopoulos, P.; Faraggi, A. E.; Mehta, V. M.

    2014-05-01

    The discovery of the Higgs boson at the LHC supports the hypothesis that the Standard Model provides an effective parametrization of all subatomic experimental data up to the Planck scale. String theory, which provides a viable perturbative approach to quantum gravity, requires for its consistency the existence of additional gauge symmetries beyond the Standard Model. The construction of heterotic string models with a viable light Z' is, however, highly constrained. We outline the construction of standardlike heterotic string models that allow for an additional Abelian gauge symmetry that may remain unbroken down to low scales. We present a string inspired model, consistent with the string constraints.

  5. Urine sampling and collection system optimization and testing

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Geating, J. A.; Koesterer, M. G.

    1975-01-01

    A Urine Sampling and Collection System (USCS) engineering model was developed to provide for the automatic collection, volume sensing and sampling of urine from each micturition. The purpose of the engineering model was to demonstrate verification of the system concept. The objective of the optimization and testing program was to update the engineering model, to provide additional performance features and to conduct system testing to determine operational problems. Optimization tasks were defined as modifications to minimize system fluid residual and addition of thermoelectric cooling.

  6. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

    The additivity model assumed that field-scale reaction properties in a sediment including surface area, reactive site concentration, and reaction rate can be predicted from field-scale grain-size distribution by linearly adding reaction properties estimated in laboratory for individual grain-size fractions. This study evaluated the additivity model in scaling mass transfer-limited, multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of the rate constants for individual grain-size fractions, which were then used to predict rate-limited U(VI) desorption in the composite sediment. The result indicated that the additivity model with respect to the rate of U(VI) desorption provided a good prediction of U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated. The result found that the approximate model provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel-size fraction (2 to 8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to the U(VI) desorption in the sediment.
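
    In the notation of this note (ours, not the authors'), the additivity assumption is a mass-fraction-weighted linear combination: a field-scale reaction property of the composite sediment is predicted from laboratory values for each grain-size fraction.

    ```latex
    % Additivity model: composite (field-scale) property P_c predicted by
    % mass-fraction-weighted laboratory values P_i for grain-size fractions i
    P_c = \sum_i f_i \, P_i, \qquad \sum_i f_i = 1
    ```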

  7. Evaluating models of remember-know judgments: complexity, mimicry, and discriminability.

    PubMed

    Cohen, Andrew L; Rotello, Caren M; Macmillan, Neil A

    2008-10-01

    Remember-know judgments provide additional information in recognition memory tests, but the nature of this information and the attendant decision process are in dispute. Competing models have proposed that remember judgments reflect a sum of familiarity and recollective information (the one-dimensional model), are based on a difference between these strengths (STREAK), or are purely recollective (the dual-process model). A choice among these accounts is sometimes made by comparing the precision of their fits to data, but this strategy may be muddied by differences in model complexity: Some models that appear to provide good fits may simply be better able to mimic the data produced by other models. To evaluate this possibility, we simulated data with each of the models in each of three popular remember-know paradigms, then fit those data to each of the models. We found that the one-dimensional model is generally less complex than the others, but despite this handicap, it dominates the others as the best-fitting model. For both reasons, the one-dimensional model should be preferred. In addition, we found that some empirical paradigms are ill-suited for distinguishing among models. For example, data collected by soliciting remember/know/new judgments--that is, the trinary task--provide a particularly weak ground for distinguishing models. Additional tables and figures may be downloaded from the Psychonomic Society's Archive of Norms, Stimuli, and Data, at www.psychonomic.org/archive.
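
    A schematic way to state the one-dimensional model this record favors (our notation, not the authors' formalism): a single strength axis sums familiarity and recollection, and two ordered criteria partition it into responses.

    ```latex
    % One-dimensional account: a single strength axis s sums familiarity F
    % and recollection R; ordered criteria c_o < c_r partition the responses
    s = F + R, \qquad
    \text{respond } \begin{cases}
      \text{``remember''} & s > c_r \\
      \text{``know''}     & c_o < s \le c_r \\
      \text{``new''}      & s \le c_o
    \end{cases}
    ```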

  8. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

    This study statistically analyzed a grain-size based additivity model that has been proposed to scale reaction rates and parameters from laboratory to field. The additivity model assumed that reaction properties in a sediment including surface area, reactive site concentration, reaction rate, and extent can be predicted from field-scale grain size distribution by linearly adding reaction properties for individual grain size fractions. This study focused on the statistical analysis of the additivity model with respect to reaction rate constants using multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment as an example. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of multi-rate parameters for individual grain size fractions. The statistical properties of the rate constants for the individual grain size fractions were then used to analyze the statistical properties of the additivity model to predict rate-limited U(VI) desorption in the composite sediment, and to evaluate the relative importance of individual grain size fractions to the overall U(VI) desorption. The result indicated that the additivity model provided a good prediction of the U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model, and U(VI) desorption in individual grain size fractions has to be simulated in order to apply the additivity model. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated. The result found that the approximate model provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel size fraction (2 to 8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to the U(VI) desorption in the sediment.

  9. Comprehensive European dietary exposure model (CEDEM) for food additives.

    PubMed

    Tennant, David R

    2016-05-01

    European methods for assessing dietary exposures to nutrients, additives and other substances in food are limited by the availability of detailed food consumption data for all member states. A proposed comprehensive European dietary exposure model (CEDEM) applies summary data published by the European Food Safety Authority (EFSA) in a deterministic model based on an algorithm from the EFSA intake method for food additives. The proposed approach can predict estimates of food additive exposure provided in previous EFSA scientific opinions that were based on the full European food consumption database.
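
    The EFSA-style deterministic algorithm CEDEM builds on has the general additive form below (a generic sketch in our notation; the actual CEDEM algorithm and food categories are specified in the paper).

    ```latex
    % Generic deterministic additive-exposure estimate for one consumer group:
    % C_k = consumption of food category k (g/kg bw/day),
    % L_k = (maximum) use level of the additive in category k (mg/g)
    E~[\mathrm{mg/kg~bw/day}] = \sum_k C_k \, L_k
    ```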

  10. Application of nonlinear adaptive motion washout to transport ground-handling simulation

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.; Martin, D. J., Jr.

    1983-01-01

    The application of a nonlinear coordinated adaptive motion washout to the transport ground-handling environment is documented. Additions to both the aircraft math model and the motion washout system are discussed. The additions to the simulated-aircraft math model provided improved modeling fidelity for braking and reverse-thrust application, and the additions to the motion-base washout system allowed transition from the desired flight parameters to the less restrictive ground parameters of the washout.

  11. Applying Emax model and bivariate thin plate splines to assess drug interactions

    PubMed Central

    Kong, Maiying; Lee, J. Jack

    2014-01-01

    We review the semiparametric approach previously proposed by Kong and Lee and extend it to a case in which the dose-effect curves follow the Emax model instead of the median effect equation. When the maximum effects for the investigated drugs are different, we provide a procedure to obtain the additive effect based on the Loewe additivity model. Then, we apply a bivariate thin plate spline approach to estimate the effect beyond additivity along with its 95% point-wise confidence interval as well as its 95% simultaneous confidence interval for any combination dose. Thus, synergy, additivity, and antagonism can be identified. The advantages of the method are that it provides an overall assessment of the combination effect on the entire two-dimensional dose space spanned by the experimental doses, and it enables us to identify complex patterns of drug interaction in combination studies. In addition, this approach is robust to outliers. To illustrate this procedure, we analyzed data from two case studies. PMID:20036878

  12. Applying Emax model and bivariate thin plate splines to assess drug interactions.

    PubMed

    Kong, Maiying; Lee, J Jack

    2010-01-01

    We review the semiparametric approach previously proposed by Kong and Lee and extend it to a case in which the dose-effect curves follow the Emax model instead of the median effect equation. When the maximum effects for the investigated drugs are different, we provide a procedure to obtain the additive effect based on the Loewe additivity model. Then, we apply a bivariate thin plate spline approach to estimate the effect beyond additivity along with its 95 per cent point-wise confidence interval as well as its 95 per cent simultaneous confidence interval for any combination dose. Thus, synergy, additivity, and antagonism can be identified. The advantages of the method are that it provides an overall assessment of the combination effect on the entire two-dimensional dose space spanned by the experimental doses, and it enables us to identify complex patterns of drug interaction in combination studies. In addition, this approach is robust to outliers. To illustrate this procedure, we analyzed data from two case studies.
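
    Both versions of this record take the Loewe additivity surface as the reference model; its standard form is the isobole equation below, where D_{x,i} is the dose of drug i alone that produces effect level x. The interaction index (the left-hand side) falls below 1 for synergy and above 1 for antagonism, and the thin plate spline estimates the departure from this surface across the dose space.

    ```latex
    % Loewe additivity: doses (d_1, d_2) lie on the additive isobole for
    % effect level x when the interaction index equals 1
    \frac{d_1}{D_{x,1}} + \frac{d_2}{D_{x,2}} = 1
    ```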

  13. Data Association Algorithms for Tracking Satellites

    DTIC Science & Technology

    2013-03-27

    validation of the new tools. The description provided here includes the mathematical background and description of the models implemented, as well as a...simulation development. This work includes the addition of higher-fidelity models in CU-TurboProp and validation of the new tools. The description...ode45(), used in Ananke, and (3) provide the necessary inputs to the bidirectional reflectance distribution function (BRDF) model provided by Pacific

  14. Delay Tolerant Networking - Bundle Protocol Simulation

    NASA Technical Reports Server (NTRS)

    SeGui, John; Jenning, Esther

    2006-01-01

    In this paper, we report on the addition of MACHETE models needed to support DTN, namely: the Bundle Protocol (BP) model. To illustrate the use of MACHETE with the additional DTN model, we provide an example simulation to benchmark its performance. We demonstrate the use of the DTN protocol and discuss statistics gathered concerning the total time needed to simulate numerous bundle transmissions.

  15. Modeling mixtures of thyroid gland function disruptors in a vertebrate alternative model, the zebrafish eleutheroembryo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thienpont, Benedicte; Barata, Carlos; Raldúa, Demetrio, E-mail: drpqam@cid.csic.es

    2013-06-01

    Maternal thyroxine (T4) plays an essential role in fetal brain development, and even mild and transitory deficits in free-T4 in pregnant women can produce irreversible neurological effects in their offspring. Women of childbearing age are daily exposed to mixtures of thyroid gland function disruptors (TGFDs) through the diet, drinking water, air and pharmaceuticals, which has raised the highest concern for the potential additive or synergic effects on the development of mild hypothyroxinemia during early pregnancy. Recently we demonstrated that zebrafish eleutheroembryos provide a suitable alternative model for screening chemicals impairing the thyroid hormone synthesis. The present study used the intrafollicular T4-content (IT4C) of zebrafish eleutheroembryos as integrative endpoint for testing the hypotheses that the effect of mixtures of TGFDs with a similar mode of action [inhibition of thyroid peroxidase (TPO)] was well predicted by a concentration addition concept (CA) model, whereas the response addition concept (RA) model predicted better the effect of dissimilarly acting binary mixtures of TGFDs [TPO-inhibitors and sodium-iodide symporter (NIS)-inhibitors]. However, the CA model provided better prediction of joint effects than RA in five out of the six tested mixtures. The exception was the mixture MMI (TPO-inhibitor) and KClO₄ (NIS-inhibitor) dosed at a fixed ratio of EC₁₀, which yielded similar CA and RA predictions, so no conclusive result could be drawn. These results support the phenomenological similarity criterion stating that the concept of concentration addition could be extended to mixture constituents having common apical endpoints or common adverse outcomes. - Highlights: • Potential synergic or additive effect of mixtures of chemicals on thyroid function. • Zebrafish as alternative model for testing the effect of mixtures of goitrogens. • Concentration addition seems to predict better the effect of mixtures of goitrogens.
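
    The CA and RA reference models tested here have standard textbook forms. The sketch below computes both predictions for a mixture, assuming illustrative Hill concentration-response curves (the function names and all parameter values are ours, not the study's):

    ```python
    import numpy as np

    def hill_effect(c, ec50, slope):
        """Fractional effect (0..1) of one chemical alone at concentration c."""
        return c**slope / (ec50**slope + c**slope)

    def hill_inverse(x, ec50, slope):
        """Concentration of one chemical alone producing fractional effect x."""
        return ec50 * (x / (1.0 - x)) ** (1.0 / slope)

    def ca_effect(concs, ec50s, slopes):
        """Concentration addition: solve sum_i c_i / EC_{x,i} = 1 for x by bisection."""
        lo, hi = 1e-9, 1.0 - 1e-9
        for _ in range(200):
            x = 0.5 * (lo + hi)
            s = sum(c / hill_inverse(x, e, b) for c, e, b in zip(concs, ec50s, slopes))
            lo, hi = (x, hi) if s > 1.0 else (lo, x)   # s decreases as x grows
        return 0.5 * (lo + hi)

    def ra_effect(concs, ec50s, slopes):
        """Response addition (independent action): E = 1 - prod_i (1 - E_i)."""
        return 1.0 - np.prod([1.0 - hill_effect(c, e, b)
                              for c, e, b in zip(concs, ec50s, slopes)])

    # Illustrative binary mixture (all parameters invented):
    concs, ec50s, slopes = [0.5, 2.0], [1.0, 4.0], [1.5, 2.0]
    print(ca_effect(concs, ec50s, slopes), ra_effect(concs, ec50s, slopes))
    ```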

  16. Additive mixed effect model for recurrent gap time data.

    PubMed

    Ding, Jieli; Sun, Liuquan

    2017-04-01

    Gap times between recurrent events are often of primary interest in medical and observational studies. The additive hazards model, focusing on risk differences rather than risk ratios, has been widely used in practice. However, the marginal additive hazards model does not take the dependence among gap times into account. In this paper, we propose an additive mixed effect model to analyze gap time data, and the proposed model includes a subject-specific random effect to account for the dependence among the gap times. Estimating equation approaches are developed for parameter estimation, and the asymptotic properties of the resulting estimators are established. In addition, some graphical and numerical procedures are presented for model checking. The finite sample behavior of the proposed methods is evaluated through simulation studies, and an application to a data set from a clinic study on chronic granulomatous disease is provided.
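
    A schematic statement of the model class this record proposes (our notation): the hazard of a gap time is an additive combination of a baseline, covariate effects, and a subject-specific random effect that induces the within-subject dependence.

    ```latex
    % Hazard of the j-th gap time of subject i: additive baseline,
    % covariate effects, and a subject-specific random effect b_i
    \lambda_{ij}(t \mid Z_{ij}, b_i) = \lambda_0(t) + \beta^{\top} Z_{ij} + b_i
    ```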

  17. Developing a model for agile supply: an empirical study from Iranian pharmaceutical supply chain.

    PubMed

    Rajabzadeh Ghatari, Ali; Mehralian, Gholamhossein; Zarenezhad, Forouzandeh; Rasekh, Hamid Reza

    2013-01-01

    Agility is the fundamental characteristic of a supply chain needed for survival in turbulent markets, where environmental forces create additional uncertainty resulting in higher risk in the supply chain management. In addition, agility helps provide the right product, at the right time, to the consumer. The main goal of this research is therefore to promote supplier selection in the pharmaceutical industry according to the formative basic factors. Moreover, this paper can configure its supply network to achieve the agile supply chain. The present article analyzes the supply part of the supply chain based on the SCOR model, used to assess agile supply chains by highlighting their specific characteristics and applicability in providing the active pharmaceutical ingredient (API). This methodology provides an analytical model; the model enables potential suppliers to be assessed against multiple criteria using both quantitative and qualitative measures. In addition, to prioritize the critical factors, the TOPSIS algorithm has been used as a common multiple-attribute decision-making (MADM) technique. Finally, several factors such as delivery speed, planning and reorder segmentation, trust development and material quantity adjustment are identified and prioritized as critical factors for being agile in the supply of API.

  18. Developing a Model for Agile Supply: an Empirical Study from Iranian Pharmaceutical Supply Chain

    PubMed Central

    Rajabzadeh Ghatari, Ali; Mehralian, Gholamhossein; Zarenezhad, Forouzandeh; Rasekh, Hamid Reza

    2013-01-01

    Agility is the fundamental characteristic of a supply chain needed for survival in turbulent markets, where environmental forces create additional uncertainty resulting in higher risk in the supply chain management. In addition, agility helps provide the right product, at the right time, to the consumer. The main goal of this research is therefore to promote supplier selection in the pharmaceutical industry according to the formative basic factors. Moreover, this paper can configure its supply network to achieve the agile supply chain. The present article analyzes the supply part of the supply chain based on the SCOR model, used to assess agile supply chains by highlighting their specific characteristics and applicability in providing the active pharmaceutical ingredient (API). This methodology provides an analytical model; the model enables potential suppliers to be assessed against multiple criteria using both quantitative and qualitative measures. In addition, to prioritize the critical factors, the TOPSIS algorithm has been used as a common multiple-attribute decision-making (MADM) technique. Finally, several factors such as delivery speed, planning and reorder segmentation, trust development and material quantity adjustment are identified and prioritized as critical factors for being agile in the supply of API. PMID:24250689
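
    Both copies of this record rank critical agility factors with TOPSIS. A minimal sketch of the standard TOPSIS steps follows; the decision matrix, weights, and criteria are invented for illustration, not taken from the study.

    ```python
    import numpy as np

    def topsis(scores, weights, benefit):
        """Rank alternatives with TOPSIS.

        scores  : (n_alternatives, n_criteria) decision matrix
        weights : criterion weights summing to 1
        benefit : True for benefit criteria, False for cost criteria
        """
        scores = np.asarray(scores, dtype=float)
        norm = scores / np.linalg.norm(scores, axis=0)   # vector-normalize columns
        v = norm * np.asarray(weights)                   # weighted normalized matrix
        benefit = np.asarray(benefit)
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - anti, axis=1)
        closeness = d_neg / (d_pos + d_neg)              # 1 = best
        return np.argsort(-closeness), closeness

    # Hypothetical supplier-selection example (criteria and values invented):
    # delivery speed (benefit), quality (benefit), unit cost (cost criterion)
    rank, c = topsis([[8, 7, 120], [6, 9, 100], [9, 6, 150]],
                     [0.4, 0.4, 0.2], [True, True, False])
    print(rank, np.round(c, 3))
    ```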

  19. The value of compressed air energy storage in energy and reserve markets

    DOE PAGES

    Drury, Easan; Denholm, Paul; Sioshansi, Ramteen

    2011-06-28

    Storage devices can provide several grid services; however, it is challenging to quantify the value of providing several services and to optimally allocate storage resources to maximize value. We develop a co-optimized Compressed Air Energy Storage (CAES) dispatch model to characterize the value of providing operating reserves in addition to energy arbitrage in several U.S. markets. We use the model to: (1) quantify the added value of providing operating reserves in addition to energy arbitrage; (2) evaluate the dynamic nature of optimally allocating storage resources into energy and reserve markets; and (3) quantify the sensitivity of CAES net revenues to several design and performance parameters. We find that conventional CAES systems could earn an additional 23 ± 10/kW-yr by providing operating reserves, and adiabatic CAES systems could earn an additional 28 ± 13/kW-yr. We find that arbitrage-only revenues are unlikely to support a CAES investment in most market locations, but the addition of reserve revenues could support a conventional CAES investment in several markets. Adiabatic CAES revenues are not likely to support an investment in most regions studied. As a result, modifying CAES design and performance parameters primarily impacts arbitrage revenues, and optimizing CAES design will be nearly independent of dispatch strategy.

  20. An extended supersonic combustion model for the dynamic analysis of hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Bossard, J. A.; Peck, R. E.; Schmidt, D. K.

    1993-01-01

    The development of an advanced dynamic model for aeroelastic hypersonic vehicles powered by air breathing engines requires an adequate engine model. This report provides a discussion of some of the more important features of supersonic combustion and their relevance to the analysis and design of supersonic ramjet engines. Of particular interest are those aspects of combustion that impact the control of the process. Furthermore, the report summarizes efforts to enhance the aeropropulsive/aeroelastic dynamic model developed at the Aerospace Research Center of Arizona State University by focusing on combustion and improved modeling of this flow. The expanded supersonic combustor model described here has the capability to model the effects of friction, area change, and mass addition, in addition to the heat addition process. A comparison is made of the results from four cases: (1) heat addition only; (2) heat addition plus friction; (3) heat addition, friction, and area reduction; and (4) heat addition, friction, area reduction, and mass addition. The relative impact of these effects on the Mach number, static temperature, and static pressure distributions within the combustor are then shown. Finally, the effects of frozen versus equilibrium flow conditions within the exhaust plume are discussed.
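
    The combined effects the expanded combustor model captures are classically expressed through influence coefficients for generalized quasi-one-dimensional flow (e.g., Shapiro's formulation; shown here in a common textbook form, which may differ in detail from the report's model):

    ```latex
    % Generalized quasi-1-D flow (Shapiro-style influence coefficients):
    % combined effect of area change dA, stagnation-temperature (heat)
    % addition dT_0, wall friction (coefficient f, hydraulic diameter D),
    % and mass addition dm-dot on the Mach number M; gamma = c_p / c_v
    \frac{dM^2}{M^2} =
      \frac{1 + \frac{\gamma - 1}{2} M^2}{1 - M^2}
      \left[ -2\,\frac{dA}{A}
           + \left(1 + \gamma M^2\right) \frac{dT_0}{T_0}
           + \gamma M^2 \, \frac{4 f\, dx}{D}
           + 2 \left(1 + \gamma M^2\right) \frac{d\dot{m}}{\dot{m}} \right]
    ```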

  1. Removal of phosphate from greenhouse wastewater using hydrated lime.

    PubMed

    Dunets, C Siobhan; Zheng, Youbin

    2014-01-01

    Phosphate (P) contamination in nutrient-laden wastewater is currently a major topic of discussion in the North American greenhouse industry. Precipitation of P as calcium phosphate minerals using hydrated lime could provide a simple, inexpensive method for retrieval. A combination of batch experiments and chemical equilibrium modelling was used to confirm the viability of this P removal method and determine lime addition rates and pH requirements for greenhouse wastewater of varying nutrient compositions. The lime:P ratio (molar ratio of CaMg(OH)₄:PO₄-P) provided a consistent parameter for estimating lime addition requirements regardless of initial P concentration, with a ratio of 1.5 providing around 99% removal of dissolved P. Optimal P removal occurred when lime addition increased the pH from 8.6 to 9.0, suggesting that pH monitoring during the P removal process could provide a simple method for ensuring consistent adherence to P removal standards. A Visual MINTEQ model, validated using experimental data, provided a means of predicting lime addition and pH requirements as influenced by changes in other parameters of the lime-wastewater system (e.g. calcium concentration, temperature, and initial wastewater pH). Hydrated lime addition did not contribute to the removal of macronutrient elements such as nitrate and ammonium, but did decrease the concentration of some micronutrients. This study provides basic guidance for greenhouse operators to use hydrated lime for phosphate removal from greenhouse wastewater.
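
    The reported lime:P molar ratio of 1.5 translates directly into a dose estimate. A small worked example follows; the influent P concentration is assumed for illustration, and the molar masses are standard values.

    ```python
    # Illustrative lime-dose estimate at the reported lime:P molar ratio of 1.5.
    M_P    = 30.97                      # g/mol, phosphorus
    M_LIME = 40.08 + 24.31 + 4 * 17.01  # g/mol, CaMg(OH)4 (dolomitic hydrated lime)

    p_mg_per_L = 30.0                   # assumed dissolved P in the wastewater
    p_mmol = p_mg_per_L / M_P           # ~0.97 mmol/L of P
    lime_mmol = 1.5 * p_mmol            # lime:P molar ratio of 1.5
    lime_mg_per_L = lime_mmol * M_LIME  # ~192 mg/L hydrated lime
    print(f"{lime_mg_per_L:.0f} mg/L lime for ~99% removal of dissolved P")
    ```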

  2. General personality and psychopathology in referred and nonreferred children and adolescents: an investigation of continuity, pathoplasty, and complication models.

    PubMed

    De Bolle, Marleen; Beyers, Wim; De Clercq, Barbara; De Fruyt, Filip

    2012-11-01

    This study investigated the continuity, pathoplasty, and complication models as plausible explanations for personality-psychopathology relations in a combined sample of community (n = 571) and referred (n = 146) children and adolescents. Multivariate structural equation modeling was used to examine the structural relations between latent personality and psychopathology change across a 2-year period. Item response theory models were fitted as an additional test of the continuity hypothesis. Even after correcting for item overlap, the results provided strong support for the continuity model, demonstrating that personality and psychopathology displayed dynamic change patterns across time. Item response theory models further supported the continuity conceptualization for understanding the association between internalizing problems and emotional stability and extraversion as well as between externalizing problems and benevolence and conscientiousness. In addition to the continuity model, particular personality and psychopathology combinations provided evidence for the pathoplasty and complication models. The theoretical and practical implications of these results are discussed, and suggestions for future research are provided. (PsycINFO Database Record (c) 2012 APA, all rights reserved).

  3. Multilevel Modeling and School Psychology: A Review and Practical Example

    ERIC Educational Resources Information Center

    Graves, Scott L., Jr.; Frohwerk, April

    2009-01-01

    The purpose of this article is to provide an overview of the state of multilevel modeling in the field of school psychology. The authors provide a systematic assessment of published research of multilevel modeling studies in 5 journals devoted to the research and practice of school psychology. In addition, a practical example from the nationally…

  4. Sensitivity of Value Added School Effect Estimates to Different Model Specifications and Outcome Measures

    ERIC Educational Resources Information Center

    Pride, Bryce L.

    2012-01-01

    The Adequate Yearly Progress (AYP) Model has been used to make many high-stakes decisions concerning schools, though it does not provide a complete assessment of student academic achievement and school effectiveness. To provide a clearer perspective, many states have implemented various Growth and Value Added Models, in addition to AYP. The…

  5. AMEM-ADL Polymer Migration Estimation Model User's Guide

    EPA Pesticide Factsheets

    The user's guide of the Arthur D. Little Polymer Migration Estimation Model (AMEM) provides the information on how the model estimates the fraction of a chemical additive that diffuses through polymeric matrices.
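
    For orientation only: migration-estimation models of this kind are typically built on Fickian diffusion out of the polymer. A generic short-time approximation for the fraction migrated from a slab (a textbook form, not necessarily AMEM's actual expression) is:

    ```latex
    % Fraction of additive migrated from a polymer slab of thickness L
    % after contact time t, diffusion coefficient D (valid for small f)
    f(t) \approx \frac{2}{L} \sqrt{\frac{D\,t}{\pi}}
    ```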

  6. SBML Level 3 package: Hierarchical Model Composition, Version 1 Release 3

    PubMed Central

    Smith, Lucian P.; Hucka, Michael; Hoops, Stefan; Finney, Andrew; Ginkel, Martin; Myers, Chris J.; Moraru, Ion; Liebermeister, Wolfram

    2017-01-01

    Summary: Constructing a model in a hierarchical fashion is a natural approach to managing model complexity, and offers additional opportunities such as the potential to re-use model components. The SBML Level 3 Version 1 Core specification does not directly provide a mechanism for defining hierarchical models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Hierarchical Model Composition package for SBML Level 3 adds the necessary features to SBML to support hierarchical modeling. The package enables a modeler to include submodels within an enclosing SBML model, delete unneeded or redundant elements of that submodel, replace elements of that submodel with elements of the containing model, and replace elements of the containing model with elements of the submodel. In addition, the package defines an optional “port” construct, allowing a model to be defined with suggested interfaces between hierarchical components; modelers can choose to use these interfaces, but they are not required to do so and can still interact directly with model elements if they so choose. Finally, the SBML Hierarchical Model Composition package is defined in such a way that a hierarchical model can be “flattened” to an equivalent, non-hierarchical version that uses only plain SBML constructs, thus enabling software tools that do not yet support hierarchy to nevertheless work with SBML hierarchical models. PMID:26528566

  7. VISUAL PLUMES MIXING ZONE MODELING SOFTWARE

    EPA Science Inventory

    The US Environmental Protection Agency has a history of developing plume models and providing technical assistance. The Visual Plumes model (VP) is a recent addition to the public-domain models available on the EPA Center for Exposure Assessment Modeling (CEAM) web page. The Wind...

  8. VISUAL PLUMES MIXING ZONE MODELING SOFTWARE

    EPA Science Inventory

    The U.S. Environmental Protection Agency has a long history of both supporting plume model development and providing mixing zone modeling software. The Visual Plumes model is the most recent addition to the suite of public-domain models available through the EPA-Athens Center f...

  9. Simplified cost models for prefeasibility mineral evaluations

    USGS Publications Warehouse

    Camm, Thomas W.

    1991-01-01

    This report contains 2 open pit models, 6 underground mine models, 11 mill models, and cost equations for access roads, power lines, and tailings ponds. In addition, adjustment factors for variation in haulage distances are provided for open pit models and variation in mining depths for underground models.

  10. System for the Analysis of Global Energy Markets - Vol. II, Model Documentation

    EIA Publications

    2003-01-01

    The second volume provides a data implementation guide that lists all naming conventions and model constraints. In addition, Volume 1 has two appendixes that provide a schematic of the System for the Analysis of Global Energy Markets (SAGE) structure and a listing of the source code, respectively.

  11. Metal mixture modeling evaluation project: 2. Comparison of four modeling approaches.

    PubMed

    Farley, Kevin J; Meyer, Joseph S; Balistrieri, Laurie S; De Schamphelaere, Karel A C; Iwasaki, Yuichi; Janssen, Colin R; Kamo, Masashi; Lofts, Stephen; Mebane, Christopher A; Naito, Wataru; Ryan, Adam C; Santore, Robert C; Tipping, Edward

    2015-04-01

    As part of the Metal Mixture Modeling Evaluation (MMME) project, models were developed by the National Institute of Advanced Industrial Science and Technology (Japan), the US Geological Survey (USA), HDR|HydroQual (USA), and the Centre for Ecology and Hydrology (United Kingdom) to address the effects of metal mixtures on biological responses of aquatic organisms. A comparison of the 4 models, as they were presented at the MMME workshop in Brussels, Belgium (May 2012), is provided in the present study. Overall, the models were found to be similar in structure (free ion activities computed by the Windermere humic aqueous model [WHAM]; specific or nonspecific binding of metals/cations in or on the organism; specification of metal potency factors or toxicity response functions to relate metal accumulation to biological response). Major differences in modeling approaches are attributed to various modeling assumptions (e.g., single vs multiple types of binding sites on the organism) and specific calibration strategies that affected the selection of model parameters. The models provided a reasonable description of additive (or nearly additive) toxicity for a number of individual toxicity test results. Less-than-additive toxicity was more difficult to describe with the available models. Because of limitations in the available datasets and the strong interrelationships among the model parameters (binding constants, potency factors, toxicity response parameters), further evaluation of specific model assumptions and calibration strategies is needed. © 2014 SETAC.

  12. Electrostatic Levitation for Studies of Additive Manufactured Materials

    NASA Technical Reports Server (NTRS)

    SanSoucie, Michael P.; Rogers, Jan R.; Tramel, Terri

    2014-01-01

    The electrostatic levitation (ESL) laboratory at NASA's Marshall Space Flight Center is a unique facility for investigators studying high temperature materials. The laboratory boasts two levitators in which samples can be levitated, heated, melted, undercooled, and resolidified. Electrostatic levitation minimizes gravitational effects and allows materials to be studied without contact with a container or instrumentation. The lab also has a high temperature emissivity measurement system, which provides normal spectral and normal total emissivity measurements at use temperature. The ESL lab has been instrumental in many pioneering materials investigations of thermophysical properties, e.g., creep measurements, solidification, triggered nucleation, and emissivity at high temperatures. Research in the ESL lab has already led to the development of advanced high temperature materials for aerospace applications, coatings for rocket nozzles, improved medical and industrial optics, metallic glasses, ablatives for reentry vehicles, and materials with memory. Modeling of additive manufacturing materials processing is necessary for the study of their resulting materials properties. In addition, the modeling of the selective laser melting processes and its materials property predictions are also underway. Unfortunately, there is very little data for the properties of these materials, especially of the materials in the liquid state. Some method to measure thermophysical properties of additive manufacturing materials is necessary. The ESL lab is ideal for these studies. The lab can provide surface tension and viscosity of molten materials, density measurements, emissivity measurements, and even creep strength measurements. The ESL lab can also determine melting temperature, surface temperatures, and phase transition temperatures of additive manufactured materials. This presentation will provide background on the ESL lab and its capabilities, provide an approach to using the ESL in supporting the development and modeling of the selective laser melting process for metals, and provide an overview of the results to date.

  13. Multi-allelic haplotype model based on genetic partition for genomic prediction and variance component estimation using SNP markers.

    PubMed

    Da, Yang

    2015-12-18

    The amount of functional genomic information has been growing rapidly but remains largely unused in genomic selection. Genomic prediction and estimation using haplotypes in genome regions with functional elements such as all genes of the genome can be an approach to integrate functional and structural genomic information for genomic selection. Towards this goal, this article develops a new haplotype approach for genomic prediction and estimation. A multi-allelic haplotype model treating each haplotype as an 'allele' was developed for genomic prediction and estimation based on the partition of a multi-allelic genotypic value into additive and dominance values. Each additive value is expressed as a function of h - 1 additive effects, where h = number of alleles or haplotypes, and each dominance value is expressed as a function of h(h - 1)/2 dominance effects. For a sample of q individuals, the limit number of effects is 2q - 1 for additive effects and is the number of heterozygous genotypes for dominance effects. Additive values are factorized as a product between the additive model matrix and the h - 1 additive effects, and dominance values are factorized as a product between the dominance model matrix and the h(h - 1)/2 dominance effects. Genomic additive relationship matrix is defined as a function of the haplotype model matrix for additive effects, and genomic dominance relationship matrix is defined as a function of the haplotype model matrix for dominance effects. Based on these results, a mixed model implementation for genomic prediction and variance component estimation that jointly use haplotypes and single markers is established, including two computing strategies for genomic prediction and variance component estimation with identical results. The multi-allelic genetic partition fills a theoretical gap in genetic partition by providing general formulations for partitioning multi-allelic genotypic values and provides a haplotype method based on the quantitative genetics model towards the utilization of functional and structural genomic information for genomic prediction and estimation.
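
    The factorization described here (additive values = additive model matrix × h−1 additive effects) can be illustrated with a toy coding in which each haplotype is an 'allele' and one haplotype serves as the reference. The haplotypes, individuals, and coding choice below are ours, for illustration only.

    ```python
    import numpy as np

    haplotypes = ["ACG", "ATG", "GCG"]            # h = 3 observed haplotypes
    ref = haplotypes[-1]                          # reference 'allele'
    individuals = [("ACG", "ATG"), ("ATG", "ATG"), ("ACG", "GCG")]

    h = len(haplotypes)
    X_add = np.zeros((len(individuals), h - 1))   # q x (h-1) additive model matrix
    for i, (h1, h2) in enumerate(individuals):
        for hap in (h1, h2):
            if hap != ref:
                X_add[i, haplotypes.index(hap)] += 1  # copies of non-reference haplotypes

    # Additive genotypic values are X_add @ alpha, where alpha holds the
    # h - 1 additive haplotype effects (the reference effect is absorbed).
    ```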

  14. Designing multifocal corneal models to correct presbyopia by laser ablation

    NASA Astrophysics Data System (ADS)

    Alarcón, Aixa; Anera, Rosario G.; Del Barco, Luis Jiménez; Jiménez, José R.

    2012-01-01

    Two multifocal corneal models and an aspheric model designed to correct presbyopia by corneal photoablation were evaluated. The design of each model was optimized to achieve the best visual quality possible for both near and distance vision. In addition, we evaluated the effect of miosis and pupil decentration on visual quality. The corrected model with the central zone for near vision provides better results since it requires less ablated corneal surface area, permits higher addition values, presents more stable visual quality with pupil-size variations and lower high-order aberrations.

  15. Metal Mixture Modeling Evaluation project: 2. Comparison of four modeling approaches

    USGS Publications Warehouse

    Farley, Kevin J.; Meyer, Joe; Balistrieri, Laurie S.; DeSchamphelaere, Karl; Iwasaki, Yuichi; Janssen, Colin; Kamo, Masashi; Lofts, Steve; Mebane, Christopher A.; Naito, Wataru; Ryan, Adam C.; Santore, Robert C.; Tipping, Edward

    2015-01-01

    As part of the Metal Mixture Modeling Evaluation (MMME) project, models were developed by the National Institute of Advanced Industrial Science and Technology (Japan), the U.S. Geological Survey (USA), HDR|HydroQual, Inc. (USA), and the Centre for Ecology and Hydrology (UK) to address the effects of metal mixtures on biological responses of aquatic organisms. A comparison of the 4 models, as they were presented at the MMME Workshop in Brussels, Belgium (May 2012), is provided herein. Overall, the models were found to be similar in structure (free ion activities computed by WHAM; specific or non-specific binding of metals/cations in or on the organism; specification of metal potency factors and/or toxicity response functions to relate metal accumulation to biological response). Major differences in modeling approaches are attributed to various modeling assumptions (e.g., single versus multiple types of binding site on the organism) and specific calibration strategies that affected the selection of model parameters. The models provided a reasonable description of additive (or nearly additive) toxicity for a number of individual toxicity test results. Less-than-additive toxicity was more difficult to describe with the available models. Because of limitations in the available datasets and the strong inter-relationships among the model parameters (log KM values, potency factors, toxicity response parameters), further evaluation of specific model assumptions and calibration strategies is needed.

  16. The Relationship of Repeated Technical Assistance Support Visits to the Delivery of Positive Health, Dignity, and Prevention (PHDP) Messages by Healthcare Providers in Mozambique: A Longitudinal Multilevel Analysis.

    PubMed

    Gutin, Sarah A; Amico, K Rivet; Hunguana, Elsa; Munguambe, António Orlando; Rose, Carol Dawson

    Positive health, dignity, and prevention (PHDP) is Mozambique's strategy to engage clinicians in the delivery of prevention messages to their HIV-positive clients. This national implementation strategy uses provider trainings on offering key messages and focuses on intervening on 9 evidence-based risk reduction areas. We investigated the impact of longitudinal technical assistance (TA) as an addition to this basic training. We followed 153 healthcare providers in 5 Mozambican provinces over 6 months to evaluate the impact of on-site, observation-based TA on PHDP implementation. Longitudinal multilevel models were estimated to model change in PHDP message delivery over time among individual providers. With each additional TA visit, providers delivered about 1 additional PHDP message (P < .001); clinicians and nonclinicians started at about the same baseline level, but clinicians improved more quickly (P = .004). Message delivery varied by practice sector; maternal and child health sectors outperformed other sectors. Longitudinal TA helped reach the programmatic goals of the PHDP program in Mozambique.
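
    A schematic two-level growth model of the kind this analysis describes (our notation, not the authors' exact specification): message counts for provider i grow with accumulated TA visits, with provider-specific intercepts and slopes.

    ```latex
    % Level 1: messages delivered by provider i at observation t
    y_{ti} = \pi_{0i} + \pi_{1i}\,\mathrm{visits}_{ti} + \varepsilon_{ti}
    % Level 2: provider-specific intercepts and slopes
    \pi_{0i} = \beta_{00} + u_{0i}, \qquad \pi_{1i} = \beta_{10} + u_{1i}
    ```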

  17. Modelling of individual subject ozone exposure response kinetics.

    PubMed

    Schelegle, Edward S; Adams, William C; Walby, William F; Marion, M Susan

    2012-06-01

    A better understanding of individual subject ozone (O₃) exposure response kinetics will provide insight into how to improve models used in the risk assessment of ambient ozone exposure. To develop a simple two compartment exposure-response model that describes individual subject decrements in forced expiratory volume in one second (FEV₁) induced by the acute inhalation of O₃ lasting up to 8 h. FEV₁ measurements of 220 subjects who participated in 14 previously completed studies were fit to the model using both particle swarm and nonlinear least squares optimization techniques to identify three subject-specific coefficients producing minimum "global" and local errors, respectively. Observed and predicted decrements in FEV₁ of the 220 subjects were used for validation of the model. Further validation was provided by comparing the observed O₃-induced FEV₁ decrements in an additional eight studies with predicted values obtained using model coefficients estimated from the 220 subjects used in cross validation. Overall the individual subject measured and modeled FEV₁ decrements were highly correlated (mean R² of 0.69 ± 0.24). In addition, it was shown that a matrix of individual subject model coefficients can be used to predict the mean and variance of group decrements in FEV₁. This modeling approach provides insight into individual subject O₃ exposure response kinetics and provides a potential starting point for improving the risk assessment of environmental O₃ exposure.
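
    As a loose, generic sketch of this kind of exposure-response kinetics (not the authors' actual equations; all names and values below are assumptions), one can integrate a first-order build-up/recovery compartment driven by dose rate and scale it to an FEV₁ decrement:

    ```python
    import numpy as np
    from scipy.integrate import odeint

    def two_compartment(x, t, dose_rate, k_on, k_off):
        """Generic first-order build-up/recovery of an internal 'effect dose'."""
        return k_on * dose_rate(t) - k_off * x

    def fev1_decrement(times, dose_rate, gain, k_on, k_off):
        x = odeint(two_compartment, 0.0, times, args=(dose_rate, k_on, k_off)).ravel()
        return gain * x   # percent FEV1 decrement, proportional to compartment level

    # Constant-exposure example: dose rate = concentration x ventilation (assumed)
    dr = lambda t: 0.3 * 25.0                 # ppm x L/min, illustrative values
    t = np.linspace(0.0, 480.0, 481)          # minutes, up to 8 h
    decrement = fev1_decrement(t, dr, gain=0.05, k_on=0.01, k_off=0.02)
    ```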

  18. Pedestrian mobile mapping system for indoor environments based on MEMS IMU and range camera

    NASA Astrophysics Data System (ADS)

    Haala, N.; Fritsch, D.; Peter, M.; Khosravani, A. M.

    2011-12-01

    This paper describes an approach for the modeling of building interiors based on a mobile device, which integrates modules for pedestrian navigation and low-cost 3D data collection. Personal navigation is realized by a foot mounted low cost MEMS IMU, while 3D data capture for subsequent indoor modeling uses a low cost range camera, which was originally developed for gaming applications. Both steps, navigation and modeling, are supported by additional information as provided from the automatic interpretation of evacuation plans. Such emergency plans are compulsory for public buildings in a number of countries. They consist of an approximate floor plan, the current position and escape routes. Additionally, semantic information like stairs, elevators or the floor number is available. After the user has captured an image of such a floor plan, this information is made explicit again by an automatic raster-to-vector-conversion. The resulting coarse indoor model then provides constraints at stairs or building walls, which restrict the potential movement of the user. This information is then used to support pedestrian navigation by eliminating drift effects of the used low-cost sensor system. The approximate indoor building model additionally provides a priori information during subsequent indoor modeling. Within this process, the low cost range camera Kinect is used for the collection of multiple 3D point clouds, which are aligned by a suitable matching step and then further analyzed to refine the coarse building model.

  19. Using generalized additive (mixed) models to analyze single case designs.

    PubMed

    Shadish, William R; Zuur, Alain F; Sullivan, Kristynn J

    2014-04-01

    This article shows how to apply generalized additive models and generalized additive mixed models to single-case design data. These models excel at detecting the functional form between two variables (often called trend), that is, whether trend exists, and if it does, what its shape is (e.g., linear and nonlinear). In many respects, however, these models are also an ideal vehicle for analyzing single-case designs because they can consider level, trend, variability, overlap, immediacy of effect, and phase consistency that single-case design researchers examine when interpreting a functional relation. We show how these models can be implemented in a wide variety of ways to test whether treatment is effective, whether cases differ from each other, whether treatment effects vary over cases, and whether trend varies over cases. We illustrate diagnostic statistics and graphs, and we discuss overdispersion of data in detail, with examples of quasibinomial models for overdispersed data, including how to compute dispersion and quasi-AIC fit indices in generalized additive models. We show how generalized additive mixed models can be used to estimate autoregressive models and random effects and discuss the limitations of the mixed models compared to generalized additive models. We provide extensive annotated syntax for doing all these analyses in the free computer program R. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
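
    For a two-phase (AB) single-case design, one way to write the kind of generalized additive model the article applies (our notation; the article's annotated R syntax covers many variants) is:

    ```latex
    % Outcome at session t: a phase (intercept) shift plus smooth, possibly
    % phase-specific trends; g is a link function (identity, logit, ...)
    g\big(E[y_t]\big) = \beta_0 + \beta_1\,\mathrm{phase}_t + f(t) + f_{\mathrm{phase}_t}(t)
    ```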

  20. Atmospheric radiation modeling of galactic cosmic rays using LRO/CRaTER and the EMMREM model with comparisons to balloon and airline based measurements

    NASA Astrophysics Data System (ADS)

    Joyce, C. J.; Schwadron, N. A.; Townsend, L. W.; deWet, W. C.; Wilson, J. K.; Spence, H. E.; Tobiska, W. K.; Shelton-Mur, K.; Yarborough, A.; Harvey, J.; Herbst, A.; Koske-Phillips, A.; Molina, F.; Omondi, S.; Reid, C.; Reid, D.; Shultz, J.; Stephenson, B.; McDevitt, M.; Phillips, T.

    2016-09-01

    We provide an analysis of the galactic cosmic ray radiation environment of Earth's atmosphere using measurements from the Cosmic Ray Telescope for the Effects of Radiation (CRaTER) aboard the Lunar Reconnaissance Orbiter (LRO) together with the Badhwar-O'Neill model and dose lookup tables generated by the Earth-Moon-Mars Radiation Environment Module (EMMREM). This study demonstrates an updated atmospheric radiation model that uses new dose tables to improve the accuracy of the modeled dose rates. Additionally, a method for computing geomagnetic cutoffs is incorporated into the model in order to account for location-dependent effects of the magnetosphere. Newly available measurements of atmospheric dose rates from instruments aboard commercial aircraft and high-altitude balloons enable us to evaluate the accuracy of the model in computing atmospheric dose rates. When compared to the available observations, the model seems to be reasonably accurate in modeling atmospheric radiation levels, overestimating airline dose rates by an average of 20%, which falls within the uncertainty limit recommended by the International Commission on Radiation Units and Measurements (ICRU). Additionally, measurements made aboard high-altitude balloons during simultaneous launches from New Hampshire and California provide an additional comparison to the model. We also find that the newly incorporated geomagnetic cutoff method enables the model to represent radiation variability as a function of location with sufficient accuracy.
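
    The geomagnetic cutoff gates which cosmic rays reach a given location. For orientation only (the model's actual cutoff method may be more detailed), a first-order dipole (Störmer) estimate of the vertical cutoff rigidity is:

    ```latex
    % Vertical cutoff rigidity at geomagnetic latitude \lambda_m,
    % with r the radial distance in Earth radii (r \approx 1 in the atmosphere)
    R_c \approx \frac{14.9\,\cos^4 \lambda_m}{r^2} \;\; \mathrm{GV}
    ```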

  1. Heat sink effects in VPPA welding

    NASA Technical Reports Server (NTRS)

    Steranka, Paul O., Jr.

    1990-01-01

    The development of a model for prediction of heat sink effects associated with the Variable Polarity Plasma Arc (VPPA) Welding Process is discussed. The long term goal of this modeling is to provide means for assessing potential heat sink effects and, eventually, to provide indications as to changes in the welding process that could be used to compensate for these effects and maintain the desired weld quality. In addition to the development of a theoretical model, a brief experimental investigation was conducted to demonstrate heat sink effects and to provide an indication of the accuracy of the model.

  2. EIA model documentation: Petroleum Market Model of the National Energy Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-12-30

    The purpose of this report is to define the objectives of the Petroleum Market Model (PMM), describe its basic approach, and provide detail on how it works. This report is intended as a reference document for model analysts, users, and the public. Documentation of the model is in accordance with EIA's legal obligation to provide adequate documentation in support of its models (Public Law 94-385, section 57.b.2). The PMM models petroleum refining activities, the marketing of products, and the production of natural gas liquids and domestic methanol, and projects petroleum prices and sources of supplies for meeting demand. In addition, the PMM estimates domestic refinery capacity expansion and fuel consumption.

  3. Modeling Off-Nominal Behavior in SysML

    NASA Technical Reports Server (NTRS)

    Day, John; Donahue, Kenny; Ingham, Mitch; Kadesch, Alex; Kennedy, Kit; Post, Ethan

    2012-01-01

    Fault Management is an essential part of the system engineering process that is limited in its effectiveness by the ad hoc nature of the applied approaches and methods. Providing a rigorous way to develop and describe off-nominal behavior is a necessary step in the improvement of fault management, and as a result, will enable safe, reliable and available systems even as system complexity increases. The basic concepts described in this paper provide a foundation to build a larger set of necessary concepts and relationships for precise modeling of off-nominal behavior, and a basis for incorporating these ideas into the overall systems engineering process. The simple FMEA example provided applies the modeling patterns we have developed and illustrates how the information in the model can be used to reason about the system and derive typical fault management artifacts. A key insight from the FMEA work was the utility of defining failure modes as the "inverse of intent", and deriving this from the behavior models. Additional work is planned to extend these ideas and capabilities to other types of relevant information and additional products.

  4. Empowering Adolescent Survivors of Sexual Abuse: Application of a Solution-Focused Ericksonian Counseling Group

    ERIC Educational Resources Information Center

    Kress, Victoria E.; Hoffman, Rachel M.

    2008-01-01

    This article describes a solution-focused and Ericksonian group counseling model that can be used with adolescent girls who have been sexually abused. An overview of the components of this approach is provided. A postintervention focus group provided additional results and ideas for the future development of the group counseling model.

  5. Guide to Parent Involvement: Parents as Adult Learners. The Family Academy Model of the Family as Educator.

    ERIC Educational Resources Information Center

    American Univ., Washington, DC. Adult Learning Potential Inst.

    This document is the second of a series of four reports developed to provide a comprehensive overview of parent involvement, encompassing the family, parenting needs, and existing resources, in addition to current parent education approaches and practices. This "Family Academy Model" provides one interpretation of how the family functions as…

  6. Faculty Salary Equity: Issues in Regression Model Selection. AIR 1992 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Moore, Nelle

    This paper discusses the determination of college faculty salary inequity and identifies the areas in which human judgment must be used in order to conduct a statistical analysis of salary equity. In addition, it provides some informed guidelines for making those judgments. The paper provides a framework for selecting salary equity models, based…

  7. Modeling water quality effects of structural and operational changes to Scoggins Dam and Henry Hagg Lake, Oregon

    USGS Publications Warehouse

    Sullivan, Annett B.; Rounds, Stewart A.

    2006-01-01

    To meet water quality targets and the municipal and industrial water needs of a growing population in the Tualatin River Basin in northwestern Oregon, an expansion of Henry Hagg Lake is under consideration. Hagg Lake is the basin's primary storage reservoir and provides water during western Oregon's typically dry summers. Potential modifications include raising the dam height by 6.1 meters (20 feet), 7.6 meters (25 feet), or 12.2 meters (40 feet); installing additional outlets (possibly including a selective withdrawal tower); and adding additional inflows to provide greater reliability of filling the enlarged reservoir. One method of providing additional inflows is to route water from the upper Tualatin River through a tunnel and into Sain Creek, a tributary to the lake. Another option is to pump water from the Tualatin River (downstream of the lake) uphill and into the reservoir during the winter--the 'pump-back' option. A calibrated CE-QUAL-W2 model of Henry Hagg Lake's hydrodynamics, temperature, and water quality was used to examine the effect of these proposed changes on water quality in the lake and downstream. Most model scenarios were run with the calibrated model for 2002, a typical water year; a few scenarios were run for 2001, a drought year.

  8. Higher Education: New Models, New Rules

    ERIC Educational Resources Information Center

    Soares, Louis; Eaton, Judith S.; Smith, Burck

    2013-01-01

    The Internet enables new models. In the commercial world, for example, we have eBay, Amazon.com, and Netflix. These new models operate with a different set of rules than do traditional models. New models are emerging in higher education as well--for example, competency-based programs. In addition, courses that are being provided from outside the…

  9. Spectrum simulation in DTSA-II.

    PubMed

    Ritchie, Nicholas W M

    2009-10-01

    Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy-to-use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.

  10. An improved null model for assessing the net effects of multiple stressors on communities.

    PubMed

    Thompson, Patrick L; MacLennan, Megan M; Vinebrooke, Rolf D

    2018-01-01

    Ecological stressors (i.e., environmental factors outside their normal range of variation) can mediate each other through their interactions, leading to unexpected combined effects on communities. Determining whether the net effect of stressors is ecologically surprising requires comparing their cumulative impact to a null model that represents the linear combination of their individual effects (i.e., an additive expectation). However, we show that standard additive and multiplicative null models that base their predictions on the effects of single stressors on community properties (e.g., species richness or biomass) do not provide this linear expectation, leading to incorrect interpretations of antagonistic and synergistic responses by communities. We present an alternative, the compositional null model, which instead bases its predictions on the effects of stressors on individual species, and then aggregates them to the community level. Simulations demonstrate the improved ability of the compositional null model to accurately provide a linear expectation of the net effect of stressors. We simulate the response of communities to paired stressors that affect species in a purely additive fashion and compare the relative abilities of the compositional null model and two standard community property null models (additive and multiplicative) to predict these linear changes in species richness and community biomass across different combinations (both positive, both negative, or opposite in sign) and intensities of stressors. The compositional model predicts the linear effects of multiple stressors under almost all scenarios, allowing for proper classification of net effects, whereas the standard null models do not. Our findings suggest that current estimates of the prevalence of ecological surprises on communities based on community property null models are unreliable, and should be improved by integrating the responses of individual species to the community level as does our compositional null model. © 2017 John Wiley & Sons Ltd.
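
    As a rough illustration of the difference between the two kinds of null model, the sketch below (with invented abundances) applies paired stressor effects species by species before summarizing, versus adding effects on the community summary directly; the zero floor on species abundances is what makes the two expectations diverge.

```python
import numpy as np

# Hypothetical per-species abundances: control and under each single stressor.
control  = np.array([10.0, 6.0, 3.0, 1.0])
stress_a = np.array([12.0, 2.0, 0.0, 1.5])   # community under stressor A alone
stress_b = np.array([ 7.0, 8.0, 0.5, 0.0])   # community under stressor B alone

def biomass(x):
    return x.sum()

# Community-property additive null: add the two effects on the summary itself.
additive_null = biomass(control) + (biomass(stress_a) - biomass(control)) \
                                 + (biomass(stress_b) - biomass(control))

# Compositional null: add the two effects species by species (abundances
# cannot drop below zero), then summarize the predicted community.
predicted = np.clip(control + (stress_a - control) + (stress_b - control), 0.0, None)
compositional_null = biomass(predicted)
predicted_richness = np.sum(predicted > 0)

print(additive_null, compositional_null, predicted_richness)  # 11.0 vs 13.5, 3 spp.
```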

  11. Markov Decision Process Measurement Model.

    PubMed

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
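
    For readers unfamiliar with the formalism, the sketch below sets up a toy MDP (hypothetical transition and reward arrays, not from the paper), solves it by value iteration, and adds a simple Boltzmann response layer in which a latent trait theta scales how closely action choices track the optimal action values — one plausible reading of how within-task actions map to a latent trait.

```python
import numpy as np

# Toy MDP: 3 states, 2 actions; P[s, a, s'] transition probs, R[s, a] rewards.
P = np.array([
    [[0.8, 0.2, 0.0], [0.1, 0.9, 0.0]],
    [[0.0, 0.5, 0.5], [0.0, 0.1, 0.9]],
    [[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]],   # state 2 is absorbing
])
R = np.array([[1.0, 0.0], [0.0, 2.0], [0.0, 0.0]])
gamma = 0.9

# Value iteration to get optimal action values Q(s, a).
Q = np.zeros((3, 2))
for _ in range(200):
    V = Q.max(axis=1)
    Q = R + gamma * P @ V

# Measurement layer: an examinee with latent trait theta chooses actions
# with Boltzmann probabilities; higher theta means closer-to-optimal play.
def action_probs(theta, s):
    z = np.exp(theta * Q[s])
    return z / z.sum()

print(action_probs(theta=2.0, s=0))
```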

  12. Hydroxylamine addition impact to Nitrosomonas europaea activity in the presence of monochloramine.

    PubMed

    Wahman, David G; Speitel, Gerald E

    2015-01-01

    In drinking water, monochloramine may promote ammonia-oxidizing bacteria (AOB) growth because of concurrent ammonia presence. AOB use (i) ammonia monooxygenase for biological ammonia oxidation to hydroxylamine and (ii) hydroxylamine oxidoreductase for biological hydroxylamine oxidation to nitrite. In addition, monochloramine and hydroxylamine react abiotically, providing AOB a potential benefit by removing the disinfectant (monochloramine) and releasing growth substrate (ammonia). Alternatively, because biological hydroxylamine oxidation supplies the electrons (reductant) required for biological ammonia oxidation, the monochloramine/hydroxylamine abiotic reaction represents a possible inactivation mechanism by consuming hydroxylamine and inhibiting reductant generation. To investigate the abiotic monochloramine and hydroxylamine reaction's impact on AOB activity, the current study used batch experiments with Nitrosomonas europaea (an AOB pure culture), ammonia, monochloramine, and hydroxylamine addition. To decipher whether hydroxylamine addition benefitted N. europaea activity by (i) removing monochloramine and releasing free ammonia or (ii) providing an additional effect (possibly the aforementioned reductant source), a previously developed cometabolism model was coupled with an abiotic monochloramine and hydroxylamine model for data interpretation. N. europaea maintained ammonia-oxidizing activity when hydroxylamine was added before complete cessation of ammonia oxidation. The impact could not be accounted for by monochloramine removal and free ammonia release alone and was concentration dependent for both monochloramine and hydroxylamine. In addition, a preferential negative impact occurred for ammonia versus hydroxylamine oxidation. These results suggest an additional benefit of exogenous hydroxylamine addition beyond monochloramine removal and free ammonia release, possibly reductant generation.
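
    The abiotic leg of the coupled model can be illustrated with a minimal rate law; the sketch below integrates a hypothetical second-order monochloramine-hydroxylamine reaction with 1:1 release of free ammonia (the rate constant, stoichiometry, and initial concentrations are all illustrative, not the study's fitted values).

```python
from scipy.integrate import solve_ivp

k = 1.0e5                        # L/(mol*h), hypothetical rate constant

def rhs(t, y):
    """Second-order abiotic reaction: NH2Cl + NH2OH -> ... + NH3 (assumed 1:1)."""
    nh2cl, nh2oh, nh3 = y
    r = k * nh2cl * nh2oh
    return [-r, -r, +r]          # NH2Cl and NH2OH consumed, NH3 released

y0 = [4e-5, 2e-5, 1e-4]          # mol/L initial concentrations (illustrative)
sol = solve_ivp(rhs, [0, 24], y0, rtol=1e-8)
print(sol.y[:, -1])              # concentrations after 24 h
```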

  13. Improvements to constitutive material model for fabrics

    NASA Astrophysics Data System (ADS)

    Morea, Mihai I.

    2011-12-01

    The high strength to weight ratio of woven fabric offers a cost effective solution to be used in a containment system for aircraft propulsion engines. Currently, Kevlar is the only Federal Aviation Administration (FAA) approved fabric for usage in systems intended to mitigate fan blade-out events. This research builds on an earlier constitutive model of Kevlar 49 fabric developed at Arizona State University (ASU) with the addition of new and improved modeling details. The latest stress-strain experiments provided new and valuable data that were used to modify the material model's post-peak behavior. These changes reveal an overall improvement in the Finite Element (FE) model's ability to predict experimental results. First, the steel projectile is modeled using the Johnson-Cook material model, which provides more realistic behavior in the FE ballistic models. This is particularly noticeable when comparing FE models with laboratory tests where large deformations in projectiles are observed. Second, follow-up analysis of the results obtained through the new picture frame tests conducted at ASU provides new values for the shear moduli and corresponding strains. The new approach for analysis of data from picture frame tests combines digital image analysis and a two-level factorial optimization formulation. Finally, an additional improvement in the material model for Kevlar involves checking convergence as the mesh density of the fabric is varied. The study performed and described herein shows the converging trend, therefore validating the FE model.
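
    The Johnson-Cook flow-stress law mentioned above has a standard closed form; the sketch below implements it with illustrative steel-like parameters (A, B, n, C, m, and the temperatures are placeholders, not the values used in the ballistic models).

```python
import numpy as np

def johnson_cook_stress(eps_p, eps_rate, T,
                        A=350e6, B=275e6, n=0.36, C=0.022, m=1.0,
                        eps_rate_ref=1.0, T_room=293.0, T_melt=1793.0):
    """Johnson-Cook flow stress (Pa):
    sigma = (A + B*eps_p**n) * (1 + C*ln(eps_rate/eps_rate_ref)) * (1 - T*^m),
    with homologous temperature T* = (T - T_room) / (T_melt - T_room).
    Parameter values here are illustrative, not the paper's."""
    strain_term  = A + B * eps_p**n
    rate_term    = 1.0 + C * np.log(max(eps_rate / eps_rate_ref, 1e-12))
    T_star       = np.clip((T - T_room) / (T_melt - T_room), 0.0, 1.0)
    thermal_term = 1.0 - T_star**m
    return strain_term * rate_term * thermal_term

print(johnson_cook_stress(eps_p=0.1, eps_rate=1e3, T=400.0))
```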

  14. A Single-Boundary Accumulator Model of Response Times in an Addition Verification Task

    PubMed Central

    Faulkenberry, Thomas J.

    2017-01-01

    Current theories of mathematical cognition offer competing accounts of the interplay between encoding and calculation in mental arithmetic. Additive models propose that manipulations of problem format do not interact with the cognitive processes used in calculation. Alternatively, interactive models suppose that format manipulations have a direct effect on calculation processes. In the present study, we tested these competing models by fitting participants' RT distributions in an arithmetic verification task with a single-boundary accumulator model (the shifted Wald distribution). We found that in addition to providing a more complete description of RT distributions, the accumulator model afforded a potentially more sensitive test of format effects. Specifically, we found that format affected drift rate, which implies that problem format has a direct impact on calculation processes. These data give further support for an interactive model of mental arithmetic. PMID:28769853
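
    The shifted Wald is an inverse Gaussian with a shifted origin, so standard distribution tools cover it; the sketch below simulates shifted-Wald response times and recovers boundary, drift, and shift using scipy's invgauss (all parameter values are invented for illustration).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulate shifted-Wald RTs: a Wald (inverse Gaussian) decision time plus a
# constant non-decision shift theta (illustrative values, in seconds).
alpha, gamma, theta = 1.5, 3.0, 0.3          # boundary, drift, shift
mu, lam = alpha / gamma, alpha**2            # IG mean and shape
rts = theta + rng.wald(mu, lam, size=2000)

# scipy's invgauss uses (mu_param, loc, scale) with mean = loc + mu_param*scale;
# the standard IG(mu, lam) corresponds to mu_param = mu/lam, scale = lam.
mu_param, loc, scale = stats.invgauss.fit(rts)

est_theta = loc                              # non-decision shift
est_alpha = np.sqrt(scale)                   # alpha = sqrt(lam)
est_gamma = est_alpha / (mu_param * scale)   # gamma = alpha / mu
print(est_theta, est_alpha, est_gamma)
```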

  15. Development of unconfined conditions in multi-aquifer flow systems: a case study in the Rajshahi Barind, Bangladesh

    NASA Astrophysics Data System (ADS)

    Rushton, K. R.; Zaman, M. Asaduz

    2017-01-01

    Identifying flow processes in multi-aquifer flow systems is a considerable challenge, especially if substantial abstraction occurs. The Rajshahi Barind groundwater flow system in Bangladesh provides an example of the manner in which flow processes can change with time. At some locations there has been a decrease with time in groundwater heads and also in the magnitude of the seasonal fluctuations. This report describes the important stages in a detailed field and modelling study at a specific location in this groundwater flow system. To understand more about the changing conditions, piezometers were constructed in 2015 at different depths but the same location; water levels in these piezometers indicate the formation of an additional water table. Conceptual models are described which show how conditions have changed between the years 2000 and 2015. Following the formation of the additional water table, the aquifer system is conceptualised as two units. A pumping test is described with data collected during both the pumping and recovery phases. Pumping test data for the Lower Unit are analysed using a computational model with estimates of the aquifer parameters; the model also provided estimates of the quantity of water moving from the ground surface, through the Upper Unit, to provide an input to the Lower Unit. The reasons for the substantial changes in the groundwater heads are identified; monitoring of the recently formed additional water table provides a means of testing whether over-abstraction is occurring.

  16. A green vehicle routing problem with customer satisfaction criteria

    NASA Astrophysics Data System (ADS)

    Afshar-Bakeshloo, M.; Mehrabi, A.; Safari, H.; Maleki, M.; Jolai, F.

    2016-12-01

    This paper develops an MILP model named the Satisfactory-Green Vehicle Routing Problem (S-GVRP). It consists of routing a heterogeneous fleet of vehicles in order to serve a set of customers within predefined time windows. In this model, in addition to the traditional objective of the VRP, both pollution and customers' satisfaction are taken into account. Meanwhile, the introduced model provides an effective dashboard for decision-makers that determines appropriate routes, the best mixed fleet, and the speed and idle time of vehicles. Additionally, some new factors evaluate the greenness of each decision based on three criteria. This model applies piecewise linear functions (PLFs) to linearize a nonlinear fuzzy interval for incorporating customers' satisfaction into other linear objectives. We have presented a mixed integer linear programming formulation for the S-GVRP. This model enriches managerial insights by providing trade-offs between customers' satisfaction, total costs, and emission levels. Finally, we have provided a numerical study showing the applicability of the model.
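
    For context, PLF linearization is usually written in the standard lambda-formulation sketched below (one common way such PLFs enter an MILP; not necessarily the authors' exact formulation). Over breakpoints x_1 < ... < x_K, the lambda variables form an SOS2 set (at most two nonzero, and adjacent), which standard MILP solvers handle directly.

```latex
\begin{aligned}
  x &= \sum_{k=1}^{K} \lambda_k \, x_k, &
  f(x) &\approx \sum_{k=1}^{K} \lambda_k \, f(x_k), &
  \sum_{k=1}^{K} \lambda_k &= 1, \quad \lambda_k \ge 0 .
\end{aligned}
```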

  17. Advances in cognitive theory and therapy: the generic cognitive model.

    PubMed

    Beck, Aaron T; Haigh, Emily A P

    2014-01-01

    For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.

  18. Random regression models using Legendre polynomials or linear splines for test-day milk yield of dairy Gyr (Bos indicus) cattle.

    PubMed

    Pereira, R J; Bignardi, A B; El Faro, L; Verneque, R S; Vercesi Filho, A E; Albuquerque, L G

    2013-01-01

    Studies investigating the use of random regression models for genetic evaluation of milk production in Zebu cattle are scarce. In this study, 59,744 test-day milk yield records from 7,810 first lactations of purebred dairy Gyr (Bos indicus) and crossbred (dairy Gyr × Holstein) cows were used to compare random regression models in which additive genetic and permanent environmental effects were modeled using orthogonal Legendre polynomials or linear spline functions. Residual variances were modeled considering 1, 5, or 10 classes of days in milk. Five classes fitted the changes in residual variances over the lactation adequately and were used for model comparison. The model that fitted linear spline functions with 6 knots provided the lowest sum of residual variances across lactation. On the other hand, according to the deviance information criterion (DIC) and Bayesian information criterion (BIC), a model using third-order and fourth-order Legendre polynomials for additive genetic and permanent environmental effects, respectively, provided the best fit. However, the high rank correlation (0.998) between this model and that applying third-order Legendre polynomials for both additive genetic and permanent environmental effects indicates that, in practice, the same bulls would be selected by both models. The latter model, which is less parameterized, is a parsimonious option for fitting dairy Gyr breed test-day milk yield records. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
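
    The Legendre covariates used in such random regression models are straightforward to construct; the sketch below builds the cubic (third-order) Legendre basis over standardized days in milk (the 5-305 day range is an assumption for illustration).

```python
import numpy as np
from numpy.polynomial import legendre

# Standardize days in milk (DIM) to [-1, 1], the Legendre domain.
dim = np.arange(5, 306)                       # test days 5..305
x = 2.0 * (dim - dim.min()) / (dim.max() - dim.min()) - 1.0

# Covariate matrix for a third-order (cubic) fit: column k holds P_k(x).
# A random regression model gives each animal its own coefficients on
# these columns for the additive genetic / permanent environmental effects.
order = 3
Phi = np.column_stack([
    legendre.legval(x, np.eye(order + 1)[k]) for k in range(order + 1)
])
print(Phi.shape)   # (301, 4)
```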

  19. Behavioral and locomotor measurements using an open field activity monitoring system for skeletal muscle diseases.

    PubMed

    Tatem, Kathleen S; Quinn, James L; Phadke, Aditi; Yu, Qing; Gordish-Dressman, Heather; Nagaraju, Kanneboyina

    2014-09-29

    The open field activity monitoring system comprehensively assesses locomotor and behavioral activity levels of mice. It is a useful tool for assessing locomotive impairment in animal models of neuromuscular disease and efficacy of therapeutic drugs that may improve locomotion and/or muscle function. The open field activity measurement provides a different measure than muscle strength, which is commonly assessed by grip strength measurements. It can also show how drugs may affect other body systems when used with additional outcome measures. In addition, measures such as total distance traveled mirror the 6 min walk test, a clinical trial outcome measure. However, open field activity monitoring is also associated with significant challenges: open field activity measurements vary according to animal strain, age, sex, and circadian rhythm. In addition, room temperature, humidity, lighting, noise, and even odor can affect assessment outcomes. Overall, this manuscript provides a well-tested and standardized open field activity SOP for preclinical trials in animal models of neuromuscular diseases. We provide a discussion of important considerations, typical results, data analysis, and detail the strengths and weaknesses of open field testing. In addition, we provide recommendations for optimal study design when using open field activity in a preclinical trial.

  20. A Measurement Model for Likert Responses that Incorporates Response Time

    ERIC Educational Resources Information Center

    Ferrando, Pere J.; Lorenzo-Seva, Urbano

    2007-01-01

    This article describes a model for response times that is proposed as a supplement to the usual factor-analytic model for responses to graded or more continuous typical-response items. The use of the proposed model together with the factor model provides additional information about the respondent and can potentially increase the accuracy of the…

  1. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism.

    PubMed

    Birkel, Garrett W; Ghosh, Amit; Kumar, Vinay S; Weaver, Daniel; Ando, David; Backman, Tyler W H; Arkin, Adam P; Keasling, Jay D; Martín, Héctor García

    2017-04-05

    Modeling of microbial metabolism is a topic of growing importance in biotechnology. Mathematical modeling helps provide a mechanistic understanding for the studied process, separating the main drivers from the circumstantial ones, bounding the outcomes of experiments and guiding engineering approaches. Among different modeling schemes, the quantification of intracellular metabolic fluxes (i.e. the rate of each reaction in cellular metabolism) is of particular interest for metabolic engineering because it describes how carbon and energy flow throughout the cell. In addition to flux analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed. The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes and leveraging other -omics data for the scientific study of cellular metabolism and bioengineering purposes. Firstly, it presents a complete toolbox for simultaneously performing two different types of flux analysis that are typically disjoint: Flux Balance Analysis and 13C Metabolic Flux Analysis. Moreover, it introduces the capability to use 13C labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale 13C Metabolic Flux Analysis (2S-13C MFA). In addition, the library includes a demonstration of a method that uses proteomics data to produce actionable insights to increase biofuel production. Finally, the use of the jQMM library is illustrated through the addition of several Jupyter notebook demonstration files that enhance reproducibility and provide the capability to be adapted to the user's specific needs. jQMM will facilitate the design and metabolic engineering of organisms for biofuels and other chemicals, as well as investigations of cellular metabolism and leveraging -omics data. As an open source software project, we hope it will attract additions from the community and grow with the rapidly changing field of metabolic engineering.
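
    Flux Balance Analysis, the first of the two analyses the library combines, reduces to a linear program; the sketch below solves a toy three-reaction network with scipy rather than jQMM itself (the network, stoichiometry, and bounds are invented for illustration).

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: A_ext -> A -> B -> biomass. Stoichiometric matrix S
# (rows: metabolites A, B; columns: reactions uptake, conversion, growth).
S = np.array([
    [1.0, -1.0,  0.0],   # A: produced by uptake, consumed by conversion
    [0.0,  1.0, -1.0],   # B: produced by conversion, consumed by growth
])
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 units

# FBA: maximize the growth flux v[2] subject to steady state S v = 0.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)   # optimal flux distribution; growth is limited by uptake
```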

  2. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism

    DOE PAGES

    Birkel, Garrett W.; Ghosh, Amit; Kumar, Vinay S.; ...

    2017-04-05

    Modeling of microbial metabolism is a topic of growing importance in biotechnology. Mathematical modeling helps provide a mechanistic understanding for the studied process, separating the main drivers from the circumstantial ones, bounding the outcomes of experiments and guiding engineering approaches. Among different modeling schemes, the quantification of intracellular metabolic fluxes (i.e. the rate of each reaction in cellular metabolism) is of particular interest for metabolic engineering because it describes how carbon and energy flow throughout the cell. In addition to flux analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed. The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes and leveraging other -omics data for the scientific study of cellular metabolism and bioengineering purposes. Firstly, it presents a complete toolbox for simultaneously performing two different types of flux analysis that are typically disjoint: Flux Balance Analysis and 13C Metabolic Flux Analysis. Moreover, it introduces the capability to use 13C labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale 13C Metabolic Flux Analysis (2S-13C MFA). In addition, the library includes a demonstration of a method that uses proteomics data to produce actionable insights to increase biofuel production. Finally, the use of the jQMM library is illustrated through the addition of several Jupyter notebook demonstration files that enhance reproducibility and provide the capability to be adapted to the user's specific needs. jQMM will facilitate the design and metabolic engineering of organisms for biofuels and other chemicals, as well as investigations of cellular metabolism and leveraging -omics data. As an open source software project, we hope it will attract additions from the community and grow with the rapidly changing field of metabolic engineering.

  3. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkel, Garrett W.; Ghosh, Amit; Kumar, Vinay S.

    Modeling of microbial metabolism is a topic of growing importance in biotechnology. Mathematical modeling helps provide a mechanistic understanding for the studied process, separating the main drivers from the circumstantial ones, bounding the outcomes of experiments and guiding engineering approaches. Among different modeling schemes, the quantification of intracellular metabolic fluxes (i.e. the rate of each reaction in cellular metabolism) is of particular interest for metabolic engineering because it describes how carbon and energy flow throughout the cell. In addition to flux analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed. The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes and leveraging other -omics data for the scientific study of cellular metabolism and bioengineering purposes. Firstly, it presents a complete toolbox for simultaneously performing two different types of flux analysis that are typically disjoint: Flux Balance Analysis and 13C Metabolic Flux Analysis. Moreover, it introduces the capability to use 13C labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale 13C Metabolic Flux Analysis (2S-13C MFA). In addition, the library includes a demonstration of a method that uses proteomics data to produce actionable insights to increase biofuel production. Finally, the use of the jQMM library is illustrated through the addition of several Jupyter notebook demonstration files that enhance reproducibility and provide the capability to be adapted to the user's specific needs. jQMM will facilitate the design and metabolic engineering of organisms for biofuels and other chemicals, as well as investigations of cellular metabolism and leveraging -omics data. As an open source software project, we hope it will attract additions from the community and grow with the rapidly changing field of metabolic engineering.

  4. Functional Additive Mixed Models

    PubMed Central

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2014-01-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach. PMID:26347592

  5. Functional Additive Mixed Models.

    PubMed

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2015-04-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach.

  6. Biological Model Development as an Opportunity to Provide Content Auditing for the Foundational Model of Anatomy Ontology.

    PubMed

    Wang, Lucy L; Grunblatt, Eli; Jung, Hyunggu; Kalet, Ira J; Whipple, Mark E

    2015-01-01

    Constructing a biological model using an established ontology provides a unique opportunity to perform content auditing on the ontology. We built a Markov chain model to study tumor metastasis in the regional lymphatics of patients with head and neck squamous cell carcinoma (HNSCC). The model attempts to determine regions with high likelihood for metastasis, which guides surgeons and radiation oncologists in selecting the boundaries of treatment. To achieve consistent anatomical relationships, the nodes in our model are populated using lymphatic objects extracted from the Foundational Model of Anatomy (FMA) ontology. During this process, we discovered several classes of inconsistencies in the lymphatic representations within the FMA. We were able to use this model building opportunity to audit the entities and connections in this region of interest (ROI). We found five subclasses of errors that are computationally detectable and resolvable, one subclass of errors that is computationally detectable but unresolvable, requiring the assistance of a content expert, and also errors of content, which cannot be detected through computational means. Mathematical descriptions of detectable errors along with expert review were used to discover inconsistencies and suggest concepts for addition and removal. Out of 106 organ and organ parts in the ROI, 8 unique entities were affected, leading to the suggestion of 30 concepts for addition and 4 for removal. Out of 27 lymphatic chain instances, 23 were found to have errors, with a total of 32 concepts suggested for addition and 15 concepts for removal. These content corrections are necessary for the accurate functioning of the FMA and provide benefits for future research and educational uses.

  7. Biological Model Development as an Opportunity to Provide Content Auditing for the Foundational Model of Anatomy Ontology

    PubMed Central

    Wang, Lucy L.; Grunblatt, Eli; Jung, Hyunggu; Kalet, Ira J.; Whipple, Mark E.

    2015-01-01

    Constructing a biological model using an established ontology provides a unique opportunity to perform content auditing on the ontology. We built a Markov chain model to study tumor metastasis in the regional lymphatics of patients with head and neck squamous cell carcinoma (HNSCC). The model attempts to determine regions with high likelihood for metastasis, which guides surgeons and radiation oncologists in selecting the boundaries of treatment. To achieve consistent anatomical relationships, the nodes in our model are populated using lymphatic objects extracted from the Foundational Model of Anatomy (FMA) ontology. During this process, we discovered several classes of inconsistencies in the lymphatic representations within the FMA. We were able to use this model building opportunity to audit the entities and connections in this region of interest (ROI). We found five subclasses of errors that are computationally detectable and resolvable, one subclass of errors that is computationally detectable but unresolvable, requiring the assistance of a content expert, and also errors of content, which cannot be detected through computational means. Mathematical descriptions of detectable errors along with expert review were used to discover inconsistencies and suggest concepts for addition and removal. Out of 106 organ and organ parts in the ROI, 8 unique entities were affected, leading to the suggestion of 30 concepts for addition and 4 for removal. Out of 27 lymphatic chain instances, 23 were found to have errors, with a total of 32 concepts suggested for addition and 15 concepts for removal. These content corrections are necessary for the accurate functioning of the FMA and provide benefits for future research and educational uses. PMID:26958311

  8. The heavy top quark and supersymmetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, L.J.

    1997-01-01

    Three aspects of supersymmetric theories are discussed: electroweak symmetry breaking, the issues of flavor, and gauge unification. The heavy top quark plays an important, sometimes dominant, role in each case. Additional symmetries lead to extensions of the Standard Model which can provide an understanding for many of the outstanding problems of particle physics. A broken supersymmetric extension of spacetime allows electroweak symmetry breaking to follow from the dynamics of the heavy top quark; an extension of isospin provides a constrained framework for understanding the pattern of quark and lepton masses; and a grand unified extension of the Standard Model gauge group provides an elegant understanding of the gauge quantum numbers of the components of a generation. Experimental signatures for each of these additional symmetries are discussed.

  9. "Going the Extra Mile in Downscaling: Why Downscaling is not jut "Plug-and-Play"

    EPA Science Inventory

    This presentation provides an example of the additional work required to preprocess global climate model data for use in regional climate modeling simulations with the Weather Research and Forecasting (WRF) model. Results from 15 months of downscaling the Comm...

  10. Defense and the Economy

    DTIC Science & Technology

    1993-01-01

    ...and a macroeconomic model of the U.S. economy, designed to provide long-range projections consistent with trends in production technology, shifts in... investments in roads, bridges, sewer systems, etc. In addition to these modeling assumptions, we also have introduced productivity increases to reflect the

  11. A Simulation Study Comparing Epidemic Dynamics on Exponential Random Graph and Edge-Triangle Configuration Type Contact Network Models

    PubMed Central

    Rolls, David A.; Wang, Peng; McBryde, Emma; Pattison, Philippa; Robins, Garry

    2015-01-01

    We compare two broad types of empirically grounded random network models in terms of their abilities to capture both network features and simulated Susceptible-Infected-Recovered (SIR) epidemic dynamics. The types of network models are exponential random graph models (ERGMs) and extensions of the configuration model. We use three kinds of empirical contact networks, chosen to provide both variety and realistic patterns of human contact: a highly clustered network, a bipartite network and a snowball sampled network of a “hidden population”. In the case of the snowball sampled network we present a novel method for fitting an edge-triangle model. In our results, ERGMs consistently capture clustering as well or better than configuration-type models, but the latter models better capture the node degree distribution. Despite the additional computational requirements to fit ERGMs to empirical networks, the use of ERGMs provides only a slight improvement in the ability of the models to recreate epidemic features of the empirical network in simulated SIR epidemics. Generally, SIR epidemic results from using configuration-type models fall between those from a random network model (i.e., an Erdős-Rényi model) and an ERGM. The addition of subgraphs of size four to edge-triangle type models does improve agreement with the empirical network for smaller densities in clustered networks. Additional subgraphs do not make a noticeable difference in our example, although we would expect the ability to model cliques to be helpful for contact networks exhibiting household structure. PMID:26555701
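
    The epidemic side of the comparison can be sketched independently of the network-fitting machinery; below is a minimal discrete-time SIR simulation run on two synthetic graphs standing in for the fitted models (beta, gamma, and the graph parameters are illustrative).

```python
import random
import networkx as nx

def sir_final_size(G, beta=0.1, gamma=0.05, seed=1):
    """Discrete-time SIR on a contact network: each step, every infected node
    transmits to each susceptible neighbor with probability beta, then
    recovers with probability gamma. Returns the outbreak size."""
    rng = random.Random(seed)
    infected, recovered = {rng.choice(list(G))}, set()
    while infected:
        new_inf = {v for u in infected for v in G[u]
                   if v not in infected and v not in recovered
                   and rng.random() < beta}
        new_rec = {u for u in infected if rng.random() < gamma}
        infected = (infected | new_inf) - new_rec
        recovered |= new_rec
    return len(recovered)

# Stand-ins for the fitted network models: an Erdos-Renyi graph vs. a
# clustered small-world graph with the same mean degree.
n, k = 1000, 6
er = nx.gnp_random_graph(n, k / (n - 1), seed=1)
sw = nx.watts_strogatz_graph(n, k, 0.1, seed=1)   # high clustering
print(sir_final_size(er), sir_final_size(sw))
```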

  12. Freight data architecture business process, logical data model, and physical data model.

    DOT National Transportation Integrated Search

    2014-09-01

    This document summarizes the study team's efforts to establish data-sharing partnerships and relay the lessons learned. In addition, it provides information on a prototype freight data architecture and supporting description and specifications ...

  13. Evaluation of a Consistent LES/PDF Method Using a Series of Experimental Spray Flames

    NASA Astrophysics Data System (ADS)

    Heye, Colin; Raman, Venkat

    2012-11-01

    A consistent method for the evolution of the joint-scalar probability density function (PDF) transport equation is proposed for application to large eddy simulation (LES) of turbulent reacting flows containing evaporating spray droplets. PDF transport equations provide the benefit of including the chemical source term in closed form; however, additional terms describing LES subfilter mixing must be modeled. The recent availability of detailed experimental measurements provides model validation data for a wide range of evaporation rates and combustion regimes, which are well known to occur in spray flames. In this work, the experimental data will be used to investigate the impact of droplet mass loading and evaporation rates on the subfilter scalar PDF shape in comparison with conventional flamelet models. In addition, existing model term closures in the PDF transport equations are evaluated with a focus on their validity in the presence of regime changes.

  14. Approaches for the Application of Physiologically Based ...

    EPA Pesticide Factsheets

    This draft report of Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment addresses the application and evaluation of PBPK models for risk assessment purposes. These models represent an important class of dosimetry models that are useful for predicting internal dose at target organs for risk assessment applications. Topics covered include: the types of data required for use of PBPK models in risk assessment, the evaluation of PBPK models for use in risk assessment, and the application of these models to address uncertainties resulting from extrapolations (e.g., interspecies extrapolation) often used in risk assessment. In addition, appendices are provided that include a compilation of chemical partition coefficients and rate constants, algorithms for estimating chemical-specific parameters, and a list of publications relating to PBPK modeling. This report is primarily meant to serve as a learning tool for EPA scientists and risk assessors who may be less familiar with the field. In addition, this report can be informative to PBPK modelers within and outside the Agency, as it provides an assessment of the types of data and models that the EPA requires for consideration of a model for use in risk assessment.
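
    To make the class of model concrete, here is a minimal flow-limited sketch with just a blood and a liver compartment; real PBPK models have many more compartments, and every parameter below (Q_l, V_b, V_l, P_l, CL_int, doses) is invented for illustration, not taken from the report.

```python
from scipy.integrate import solve_ivp

# Minimal flow-limited model: venous blood and liver (illustrative values).
Q_l = 90.0            # liver blood flow (L/h)
V_b, V_l = 5.0, 1.8   # compartment volumes (L)
P_l = 4.0             # liver:blood partition coefficient
CL_int = 30.0         # intrinsic hepatic clearance (L/h)

def rhs(t, y):
    C_b, C_l = y                       # blood and liver concentrations
    C_vl = C_l / P_l                   # venous concentration leaving liver
    dC_b = Q_l * (C_vl - C_b) / V_b
    dC_l = (Q_l * (C_b - C_vl) - CL_int * C_vl) / V_l
    return [dC_b, dC_l]

# 1 mg/L initial blood concentration after an i.v. bolus.
sol = solve_ivp(rhs, [0, 24], [1.0, 0.0], dense_output=True)
print(sol.y[:, -1])                    # internal doses at 24 h
```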

  15. Fine-mapping additive and dominant SNP effects using group-LASSO and Fractional Resample Model Averaging

    PubMed Central

    Sabourin, Jeremy; Nobel, Andrew B.; Valdar, William

    2014-01-01

    Genomewide association studies sometimes identify loci at which both the number and identities of the underlying causal variants are ambiguous. In such cases, statistical methods that model effects of multiple SNPs simultaneously can help disentangle the observed patterns of association and provide information about how those SNPs could be prioritized for follow-up studies. Current multi-SNP methods, however, tend to assume that SNP effects are well captured by additive genetics; yet when genetic dominance is present, this assumption translates to reduced power and faulty prioritizations. We describe a statistical procedure for prioritizing SNPs at GWAS loci that efficiently models both additive and dominance effects. Our method, LLARRMA-dawg, combines a group LASSO procedure for sparse modeling of multiple SNP effects with a resampling procedure based on fractional observation weights; it estimates for each SNP the robustness of association with the phenotype both to sampling variation and to competing explanations from other SNPs. In producing a SNP prioritization that best identifies underlying true signals, we show that: our method easily outperforms a single marker analysis; when additive-only signals are present, our joint model for additive and dominance is equivalent to or only slightly less powerful than modeling additive-only effects; and, when dominance signals are present, even in combination with substantial additive effects, our joint model is unequivocally more powerful than a model assuming additivity. We also describe how performance can be improved through calibrated randomized penalization, and discuss how dominance in ungenotyped SNPs can be incorporated through either heterozygote dosage or multiple imputation. PMID:25417853
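
    The additive-plus-dominance design matrix at the heart of the method is simple to construct; the sketch below codes each SNP as a two-column group (allele count and heterozygote indicator), the unit over which a group LASSO would then select or drop jointly (the genotypes here are random placeholders, and no group-LASSO solver is shown).

```python
import numpy as np

rng = np.random.default_rng(0)
genotypes = rng.integers(0, 3, size=(200, 50))   # 0/1/2 minor-allele counts

# Each SNP contributes a group of two predictors: an additive column
# (allele count) and a dominance column (heterozygote indicator).
X_add = genotypes.astype(float)
X_dom = (genotypes == 1).astype(float)
X = np.empty((200, 100))
X[:, 0::2] = X_add                     # columns 0, 2, 4, ...: additive
X[:, 1::2] = X_dom                     # columns 1, 3, 5, ...: dominance
groups = np.repeat(np.arange(50), 2)   # group label per column, for the LASSO
print(X.shape, groups[:6])
```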

  16. Updated radiometric calibration for the Landsat-5 thematic mapper reflective bands

    USGS Publications Warehouse

    Helder, D.L.; Markham, B.L.; Thome, K.J.; Barsi, J.A.; Chander, G.; Malla, R.

    2008-01-01

    The Landsat-5 Thematic Mapper (TM) has been the workhorse of the Landsat system. Launched in 1984, it continues collecting data through the time frame of this paper. Thus, it provides an invaluable link to the past history of the land features of the Earth's surface, and it becomes imperative to provide an accurate radiometric calibration of the reflective bands to the user community. Previous calibration has been based on information obtained from prelaunch, the onboard calibrator, vicarious calibration attempts, and cross-calibration with Landsat-7. Currently, additional data sources are available to improve this calibration. Specifically, improvements in vicarious calibration methods and development of the use of pseudoinvariant sites for trending provide two additional independent calibration sources. The use of these additional estimates has resulted in a consistent calibration approach that ties together all of the available calibration data sources. Results from this analysis indicate that a simple exponential or a constant model may be used for all bands throughout the lifetime of Landsat-5 TM. Where previously time constants for the exponential models were approximately one year, the updated model has significantly longer time constants in bands 1-3. In contrast, bands 4, 5, and 7 are shown to be best modeled by a constant. The models proposed in this paper indicate calibration knowledge of 5% or better early in life, decreasing to nearly 2% later in life. These models have been implemented at the U.S. Geological Survey Earth Resources Observation and Science (EROS) and are the default calibration used for all Landsat TM data now distributed through EROS. © 2008 IEEE.
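
    The two candidate response models are easy to fit to a gain time series; the sketch below fits the exponential form and the constant form to synthetic data and compares residuals (all numbers are invented, not Landsat-5 calibration values).

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic "gain vs. years since launch" data for illustration only.
t = np.linspace(0, 22, 40)
gain = 1.00 + 0.08 * np.exp(-t / 4.0) \
       + np.random.default_rng(1).normal(0, 0.005, t.size)

def exp_model(t, g_inf, a, tau):
    """Exponential decay toward an asymptotic gain g_inf."""
    return g_inf + a * np.exp(-t / tau)

p_exp, _ = curve_fit(exp_model, t, gain, p0=[1.0, 0.1, 2.0])
p_const = gain.mean()                  # the constant model is just the mean

# Compare residual sums of squares to choose between the two forms.
rss_exp = np.sum((gain - exp_model(t, *p_exp))**2)
rss_const = np.sum((gain - p_const)**2)
print(p_exp, rss_exp, rss_const)
```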

  17. Two-warehouse partial backlogging inventory model for deteriorating items with linear trend in demand under inflationary conditions

    NASA Astrophysics Data System (ADS)

    Jaggi, Chandra K.; Khanna, Aditi; Verma, Priyanka

    2011-07-01

    In today's business transactions, there are various reasons, namely, bulk purchase discounts, re-ordering costs, seasonality of products, inflation induced demand, etc., which force the buyer to order more than the warehouse capacity. Such situations call for additional storage space to store the excess units purchased. This additional storage space is typically a rented warehouse. Inflation plays a very interesting and significant role here: It increases the cost of goods. To safeguard from the rising prices, during the inflation regime, the organisation prefers to keep a higher inventory, thereby increasing the aggregate demand. This additional inventory needs additional storage space, which is facilitated by a rented warehouse. Ignoring the effects of the time value of money and inflation might yield misleading results. In this study, a two-warehouse inventory model with linear trend in demand under inflationary conditions having different rates of deterioration has been developed. Shortages at the owned warehouse are also allowed subject to partial backlogging. The solution methodology provided in the model helps to decide on the feasibility of renting a warehouse. Finally, findings have been illustrated with the help of numerical examples. Comprehensive sensitivity analysis has also been provided.

  18. Evaluation of methodology for detecting/predicting migration of forest species

    Treesearch

    Dale S. Solomon; William B. Leak

    1996-01-01

    Available methods for analyzing migration of forest species are evaluated, including simulation models, remeasured plots, resurveys, pollen/vegetation analysis, and age/distance trends. Simulation models have provided some of the most drastic estimates of species changes due to predicted changes in global climate. However, these models require additional testing...

  19. A General Cognitive Diagnosis Model for Expert-Defined Polytomous Attributes

    ERIC Educational Resources Information Center

    Chen, Jinsong; de la Torre, Jimmy

    2013-01-01

    Polytomous attributes, particularly those defined as part of the test development process, can provide additional diagnostic information. The present research proposes the polytomous generalized deterministic inputs, noisy, "and" gate (pG-DINA) model to accommodate such attributes. The pG-DINA model allows input from substantive experts…
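
    As background, the G-DINA family builds on the DINA condensation rule; the sketch below implements plain dichotomous DINA, and under one natural reading the same elementwise alpha >= q comparison extends to graded attribute levels (this is an illustration of the underlying idea, not the paper's pG-DINA specification).

```python
import numpy as np

def dina_prob(alpha, q, g, s):
    """P(correct) under DINA: eta = 1 iff the examinee meets every attribute
    requirement of the item; slip s and guess g map eta to the response.
    With graded (polytomous) levels in alpha and q, the same comparison
    alpha >= q is one natural condensation rule."""
    eta = np.all(alpha >= q, axis=-1).astype(float)
    return eta * (1 - s) + (1 - eta) * g

alpha = np.array([1, 0, 1])   # examinee's attribute profile
q = np.array([1, 0, 0])       # item requires attribute 1 only
print(dina_prob(alpha, q, g=0.2, s=0.1))   # 0.9
```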

  20. Ethical Leaders Resist: A Feminist Response to ACCEL

    ERIC Educational Resources Information Center

    Brimager, Ashley

    2017-01-01

    This response to Sternberg's ACCEL (Active Concerned Citizenship and Ethical Leadership) model aims to provide a feminist perspective on the rationale and relevance of the model in mitigating harmful patriarchal gender expectations. Additionally, this response investigates potential barriers to the implementation of ACCEL as a model for gifted…

  1. AN ONLINE CATALOG OF CATACLYSMIC VARIABLE SPECTRA FROM THE FAR-ULTRAVIOLET SPECTROSCOPIC EXPLORER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Godon, Patrick; Sion, Edward M.; Levay, Karen

    2012-12-15

    We present an online catalog containing spectra and supporting information for cataclysmic variables that have been observed with the Far-Ultraviolet Spectroscopic Explorer (FUSE). For each object in the catalog we list some of the basic system parameters such as (R.A., decl.), period, inclination, and white dwarf mass, as well as information on the available FUSE spectra: data ID, observation date and time, and exposure time. In addition, we provide parameters needed for the analysis of the FUSE spectra such as the reddening E(B - V), distance, and state (high, low, intermediate) of the system at the time it was observed. For some of these spectra we have carried out model fits to the continuum with synthetic stellar and/or disk spectra using the codes TLUSTY and SYNSPEC. We provide the parameters obtained from these model fits; this includes the white dwarf temperature, gravity, projected rotational velocity, and elemental abundances of C, Si, S, and N, together with the disk mass accretion rate, the resulting inclination, and model-derived distance (when unknown). For each object one or more figures are provided (as gif files) with line identification and model fit(s) when available. The FUSE spectra and the synthetic spectra are directly available for download as ASCII tables. References are provided for each object, as well as for the model fits. In this article we present 36 objects, and additional ones will be added to the online catalog in the future. In addition to cataclysmic variables, we also include a few related objects, such as a wind-accreting white dwarf, a pre-cataclysmic variable, and some symbiotics.

  2. Assessment of the Neutronic and Fuel Cycle Performance of the Transatomic Power Molten Salt Reactor Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Sean; Dewan, Leslie; Massie, Mark

    This report presents results from a collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear (GAIN) Nuclear Energy Voucher program. The TAP concept is a molten salt reactor using configurable zirconium hydride moderator rod assemblies to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. The implementation of continuous-energy Monte Carlo transport and depletion tools in ChemTriton provides for full-core three-dimensional modeling and simulation. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this concept. Additional analyses of mass feed rates and enrichments, isotopic removals, tritium generation, core power distribution, core vessel helium generation, moderator rod heat deposition, and reactivity coefficients provide additional information to make informed design decisions. This work demonstrates capabilities of ORNL modeling and simulation tools for neutronic and fuel cycle analysis of molten salt reactor concepts.

  3. Using Toxicological Evidence from QSAR Models in Practice

    EPA Science Inventory

    The new generation of QSAR models provides supporting documentation in addition to the predicted toxicological value. Such information enables the toxicologist to explore the properties of chemical substances and to review and increase the reliability of toxicity predictions. Thi...

  4. Nonlinear Unsteady Aerodynamic Modeling Using Wind Tunnel and Computational Data

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Klein, Vladislav; Frink, Neal T.

    2016-01-01

    Extensions to conventional aircraft aerodynamic models are required to adequately predict responses when nonlinear unsteady flight regimes are encountered, especially at high incidence angles and under maneuvering conditions. For a number of reasons, such as loss of control, both military and civilian aircraft may extend beyond normal and benign aerodynamic flight conditions. In addition, military applications may require controlled flight beyond the normal envelope, and civilian flight may require adequate recovery or prevention methods from these adverse conditions. These requirements have led to the development of more general aerodynamic modeling methods and provided impetus for researchers to improve both techniques and the degree of collaboration between analytical and experimental research efforts. In addition to more general mathematical model structures, dynamic test methods have been designed to provide sufficient information to allow model identification. This paper summarizes research to develop a modeling methodology appropriate for modeling aircraft aerodynamics that include nonlinear unsteady behaviors using both experimental and computational test methods. This work was done at Langley Research Center, primarily under the NASA Aviation Safety Program, to address aircraft loss of control, prevention, and recovery aerodynamics.

  5. Engaging Students in Modeling as an Epistemic Practice of Science: An Introduction to the Special Issue of the "Journal of Science Education and Technology"

    ERIC Educational Resources Information Center

    Campbell, Todd; Oh, Phil Seok

    2015-01-01

    This article provides an introduction for the special issue of the "Journal of Science Education and Technology" focused on science teaching and learning with models. The article provides initial framing for questions that guided the special issue. Additionally, based on our careful review of each of these articles, some discussion of…

  6. Video Self-Modeling in Children with Autism: A Pilot Study Validating Prerequisite Skills and Extending the Utilization of VSM across Skill Sets

    ERIC Educational Resources Information Center

    Williamson, Robert L.; Casey, Laura B.; Robertson, Janna Siegel; Buggey, Tom

    2013-01-01

    Given the recent interest in the use of video self-modeling (VSM) to provide instruction within iPod apps and other pieces of handheld mobile assistive technologies, investigating appropriate prerequisite skills for effective use of this intervention is particularly timely and relevant. To provide additional information regarding the efficacy of…

  7. Reptile embryology.

    PubMed

    Vickaryous, Matthew K; McLean, Katherine E

    2011-01-01

    Reptiles (lizards, snakes, turtles and crocodylians) are becoming increasingly popular as models for developmental investigations. In this review the leopard gecko, Eublepharis macularius, is presented as a reptilian model for embryonic studies. We provide details of husbandry, breeding and modifications to two popular histological techniques (whole-mount histochemistry and immunohistochemistry). In addition, we provide a summary of basic reptilian husbandry requirements and discuss important details of embryonic nutrition, egg anatomy and sex determination.

  8. Phonon scattering in nanoscale systems: lowest order expansion of the current and power expressions

    NASA Astrophysics Data System (ADS)

    Paulsson, Magnus; Frederiksen, Thomas; Brandbyge, Mads

    2006-04-01

    We use the non-equilibrium Green's function method to describe the effects of phonon scattering on the conductance of nano-scale devices. Useful and accurate approximations are developed that both provide (i) computationally simple formulas for large systems and (ii) simple analytical models. In addition, the simple models can be used to fit experimental data and provide physical parameters.

  9. Adding thin-ideal internalization and impulsiveness to the cognitive-behavioral model of bulimic symptoms.

    PubMed

    Schnitzler, Caroline E; von Ranson, Kristin M; Wallace, Laurel M

    2012-08-01

    This study evaluated the cognitive-behavioral (CB) model of bulimia nervosa and an extension that included two additional maintaining factors - thin-ideal internalization and impulsiveness - in 327 undergraduate women. Participants completed measures of demographics, self-esteem, concern about shape and weight, dieting, bulimic symptoms, thin-ideal internalization, and impulsiveness. Both the original CB model and the extended model provided good fits to the data. Although structural equation modeling analyses suggested that the original CB model was most parsimonious, hierarchical regression analyses indicated that the additional variables accounted for significantly more variance. Additional analyses showed that the model fit could be improved by adding a path from concern about shape and weight, and deleting the path from dieting, to bulimic symptoms. Expanding upon the factors considered in the model may better capture the scope of variables maintaining bulimic symptoms in young women with a range of severity of bulimic symptoms. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Large eddy simulations of time-dependent and buoyancy-driven channel flows

    NASA Technical Reports Server (NTRS)

    Cabot, William H.

    1993-01-01

    The primary goal of this work has been to assess the performance of the dynamic SGS model in the large eddy simulation (LES) of channel flows in a variety of situations, viz., in temporal development of channel flow turned by a transverse pressure gradient and especially in buoyancy-driven turbulent flows such as Rayleigh-Benard and internally heated channel convection. For buoyancy-driven flows, there are additional buoyant terms that are possible in the base models, and one objective has been to determine if the dynamic SGS model results are sensitive to such terms. The ultimate goal is to determine the minimal base model needed in the dynamic SGS model to provide accurate results in flows with more complicated physical features. In addition, a program of direct numerical simulation (DNS) of fully compressible channel convection has been undertaken to determine stratification and compressibility effects. These simulations are intended to provide a comparative base for performing the LES of compressible (or highly stratified, pseudo-compressible) convection at high Reynolds number in the future.
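
    For reference, the dynamic procedure referred to above determines the base-model coefficient from the resolved field itself; one common incompressible formulation (the Germano identity with Lilly's least-squares closure) is sketched below, with overbars denoting the grid filter and hats the test filter. The buoyant base-model terms discussed in the abstract would contribute corresponding terms to M_ij.

```latex
\begin{aligned}
  \mathcal{L}_{ij} &= \widehat{\bar{u}_i \bar{u}_j} - \hat{\bar{u}}_i \hat{\bar{u}}_j ,\\
  M_{ij} &= 2\Delta^{2}\,\widehat{|\bar{S}|\,\bar{S}_{ij}}
            \;-\; 2\hat{\Delta}^{2}\,|\hat{\bar{S}}|\,\hat{\bar{S}}_{ij} ,\\
  C(\mathbf{x},t) &= \frac{\langle \mathcal{L}_{ij} M_{ij} \rangle}
                          {\langle M_{ij} M_{ij} \rangle} ,
\end{aligned}
```

    where the deviatoric part of the resolved stress satisfies L_ij = C M_ij under the grid- and test-level Smagorinsky closures, and the angle brackets denote the averaging used to stabilize the coefficient.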

  11. The NASA MSFC Earth Global Reference Atmospheric Model-2007 Version

    NASA Technical Reports Server (NTRS)

    Leslie, F.W.; Justus, C.G.

    2008-01-01

    Reference or standard atmospheric models have long been used for design and mission planning of various aerospace systems. The NASA/Marshall Space Flight Center (MSFC) Global Reference Atmospheric Model (GRAM) was developed in response to the need for a design reference atmosphere that provides complete global geographical variability and complete altitude coverage (surface to orbital altitudes), as well as complete seasonal and monthly variability of the thermodynamic variables and wind components. A unique feature of GRAM is that, in addition to providing the geographical, height, and monthly variation of the mean atmospheric state, it includes the ability to simulate spatial and temporal perturbations in these atmospheric parameters (e.g., fluctuations due to turbulence and other atmospheric perturbation phenomena). A summary comparing GRAM features to the characteristics and features of other reference or standard atmospheric models can be found in the Guide to Reference and Standard Atmosphere Models. The original GRAM has undergone a series of improvements over the years, with recent additions and changes. The software program is called Earth-GRAM2007 to distinguish it from similar programs for other bodies (e.g., Mars, Venus, Neptune, and Titan). However, in order to make this Technical Memorandum (TM) more readable, the software will be referred to simply as GRAM07 or GRAM unless additional clarity is needed. Section 1 provides an overview of the basic features of GRAM07, including the newly added features. Section 2 provides a more detailed description of GRAM07 and how the model output is generated. Section 3 presents sample results. Appendices A and B describe the Global Upper Air Climatic Atlas (GUACA) data and the Global Gridded Upper Air Statistics (GGUAS) database. Appendix C provides instructions for compiling and running GRAM07. Appendix D gives a description of the required NAMELIST format input. Appendix E gives sample output. Appendix F provides a list of available parameters to enable the user to generate special output. Appendix G gives an example and guidance on incorporating GRAM07 as a subroutine in other programs such as trajectory codes or orbital propagation routines.

  12. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    PubMed

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
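
    As an illustration of fitting such a response curve, consider the following sketch (hypothetical detection counts; modeling POD as logistic in log concentration is this sketch's assumption, not the paper's prescription):

```python
# Sketch: maximum-likelihood fit of a logistic POD curve to hypothetical
# qualitative-method data (positives out of n trials per concentration).
import numpy as np
from scipy.optimize import minimize

conc  = np.array([0.04, 0.2, 1.0, 5.0])   # analyte levels (hypothetical units)
n_pos = np.array([2, 7, 18, 20])          # positive results per level
n_tot = np.array([20, 20, 20, 20])        # trials per level

def neg_log_lik(params):
    b0, b1 = params
    pod = 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log10(conc))))
    pod = np.clip(pod, 1e-9, 1 - 1e-9)    # guard against log(0)
    return -np.sum(n_pos * np.log(pod) + (n_tot - n_pos) * np.log(1 - pod))

fit = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
b0, b1 = fit.x
print("fitted POD at concentration 1.0:", 1.0 / (1.0 + np.exp(-b0)))
```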

  13. Computational Aeroelastic Analyses of a Low-Boom Supersonic Configuration

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Connolly, Joseph

    2015-01-01

    An overview of NASA's Commercial Supersonic Technology (CST) Aeroservoelasticity (ASE) element is provided with a focus on recent computational aeroelastic analyses of a low-boom supersonic configuration developed by Lockheed-Martin and referred to as the N+2 configuration. The overview includes details of the computational models developed to date including a linear finite element model (FEM), linear unsteady aerodynamic models, unstructured CFD grids, and CFD-based aeroelastic analyses. In addition, a summary of the work involving the development of aeroelastic reduced-order models (ROMs) and the development of an aero-propulso-servo-elastic (APSE) model is provided.

  14. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed, and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization, and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  15. Risk stratification following acute myocardial infarction.

    PubMed

    Singh, Mandeep

    2007-07-01

    This article reviews the current risk assessment models available for patients presenting with myocardial infarction (MI). These practical tools enhance the health care provider's ability to rapidly and accurately assess patient risk from the event or revascularization therapy, and are of paramount importance in managing patients presenting with MI. This article highlights the models used for ST-elevation MI (STEMI) and non-ST elevation MI (NSTEMI) and provides an additional description of models used to assess risks after primary angioplasty (ie, angioplasty performed for STEMI).

  16. Projected Applications of a "Weather in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Molthan, Andrew; Zavodsky, Bradley T.; Case, Jonathan L.; LaFontaine, Frank J.; Srikishen, Jayanthi

    2010-01-01

    The NASA Short-term Prediction Research and Transition Center (SPoRT)'s new "Weather in a Box" resources will provide weather research and forecast modeling capabilities for real-time application. Model output will provide additional forecast guidance and research into the impacts of new NASA satellite data sets and software capabilities. By combining several research tools and satellite products, SPoRT can generate model guidance that is strongly influenced by unique NASA contributions.

  17. Updates on CCMC Activities and GSFC Space Weather Services

    NASA Technical Reports Server (NTRS)

    Zheng, Y.; Hesse, M.; Kuznetsova, M.; Pulkkinen, A.; Rastaetter, L.; Maddox, M.; Taktakishvili, A.; Berrios, D.; Chulaki, A.; Lee, H.; et al.

    2011-01-01

    In this presentation, we provide updates on CCMC modeling activities, CCMC metrics and validation studies, and other CCMC efforts. In addition, an overview of GSFC Space Weather Services (a sibling organization to the Community Coordinated Modeling Center) and its products/capabilities will be given. We show how some of the research-grade models, if run in an operational mode, can help address NASA's space weather needs by providing forecasting/nowcasting capabilities for significant space weather events throughout the solar system.

  18. Model Description and Proposed Application for the Enlisted Personnel Inventory, Cost, and Compensation Model

    DTIC Science & Technology

    1994-07-01

    provide additional information for the user / policy analyst: Eichers, D., Sola, M., McLernan, G., EPICC User’s Manual, Systems Research and Applications...maintenance, and a set of on-line help screens. Each is further discussed below and a full discussion is included in the EPICC User’s Manual. Menu Based...written documentation (user’s manual) that will be provided with the model. The next chapter discusses the validation of the inventory projection and

  19. Modeling Unproductive Behavior in Online Homework in Terms of Latent Student Traits: An Approach Based on Item Response Theory

    NASA Astrophysics Data System (ADS)

    Gönülateş, Emre; Kortemeyer, Gerd

    2017-04-01

    Homework is an important component of most physics courses. One of the functions it serves is to provide meaningful formative assessment in preparation for examinations. However, correlations between homework and examination scores tend to be low, likely due to unproductive student behavior such as copying and random guessing of answers. In this study, we attempt to model these two counterproductive learner behaviors within the framework of Item Response Theory in order to provide an ability measurement that strongly correlates with examination scores. We find that introducing additional item parameters leads to worse predictions of examination grades, while introducing additional learner traits is a more promising approach.
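
    One hedged reading of the trait-based approach is a 3PL-like response function in which the lower asymptote belongs to the learner rather than the item (the paper's actual parameterization may differ):

```python
# Sketch: item response function with a learner-level "guessing" trait g,
# i.e., the lower asymptote is a property of the student, not the item.
import numpy as np

def p_correct(theta, g, a, b):
    """P(correct) for ability theta, guessing trait g in [0, 1],
    item discrimination a, and item difficulty b."""
    p_2pl = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return g + (1.0 - g) * p_2pl

# A guess-prone student retains a floor of success even on hard items:
print(p_correct(theta=0.0, g=0.25, a=1.2, b=1.5))
```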

  20. Reciprocal Relations Between Cognitive Neuroscience and Cognitive Models: Opposites Attract?

    PubMed Central

    Forstmann, Birte U.; Wagenmakers, Eric-Jan; Eichele, Tom; Brown, Scott; Serences, John T.

    2012-01-01

    Cognitive neuroscientists study how the brain implements particular cognitive processes such as perception, learning, and decision-making. Traditional approaches in which experiments are designed to target a specific cognitive process have been supplemented by two recent innovations. First, formal models of cognition can decompose observed behavioral data into multiple latent cognitive processes, allowing brain measurements to be associated with a particular cognitive process more precisely and more confidently. Second, cognitive neuroscience can provide additional data to inform the development of cognitive models, providing greater constraint than behavioral data alone. We argue that these fields are mutually dependent: not only can models guide neuroscientific endeavors, but understanding neural mechanisms can provide critical insights into formal models of cognition. PMID:21612972

  1. Extending the granularity of representation and control for the MIL-STD CAIS 1.0 node model

    NASA Technical Reports Server (NTRS)

    Rogers, Kathy L.

    1986-01-01

    The Common APSE (Ada Program Support Environment) Interface Set (CAIS) (DoD85) node model provides an excellent baseline for interfaces in a single-host development environment. To encompass the entire spectrum of computing, however, the CAIS model should be extended in four areas. It should provide the interface between the engineering workstation and the host system throughout the entire lifecycle of the system. It should provide a basis for communication and integration functions needed by distributed host environments. It should provide common interfaces for communications mechanisms to and among target processors. It should provide facilities for integration, validation, and verification of test beds extending to distributed systems on geographically separate processors with heterogeneous instruction set architectures (ISAs). Additions to the PROCESS NODE model to extend the CAIS into these four areas are proposed.

  2. Modelling Mass Movements for Planetary Studies

    NASA Technical Reports Server (NTRS)

    Bulmer, M. H.; Glaze, L.; Barnouin-Jha, O.; Murphy, W.; Neumann, G.

    2002-01-01

    Use of an empirical model in conjunction with data from the Chaos Jumbles rock avalanches constrains their flow behavior to first order and provides a method to interpret rock/debris avalanche emplacement on Mars. Additional information is contained in the original extended abstract.

  3. Nonlinear inversion of resistivity sounding data for 1-D earth models using the Neighbourhood Algorithm

    NASA Astrophysics Data System (ADS)

    Ojo, A. O.; Xie, Jun; Olorunfemi, M. O.

    2018-01-01

    To reduce ambiguity related to nonlinearities in the resistivity model-data relationships, an efficient direct-search scheme employing the Neighbourhood Algorithm (NA) was implemented to solve the 1-D resistivity problem. In addition to finding a range of best-fit models that are more likely to be global minima, this method investigates the entire multi-dimensional model space and provides additional information about the posterior model covariance matrix, marginal probability density function, and an ensemble of acceptable models. This provides new insights into how well the model parameters are constrained and makes assessing trade-offs between them possible, thus avoiding some common interpretation pitfalls. The efficacy of the newly developed program is tested by inverting both synthetic (noisy and noise-free) data and field data from other authors employing different inversion methods so as to provide a good base for comparative performance. In all cases, the inverted model parameters were in good agreement with the true and recovered model parameters from other methods and remarkably correlate with the available borehole litho-log and known geology for the field dataset. The NA method has proven to be useful when a good starting model is not available, and the reduced number of unknowns in the 1-D resistivity inverse problem makes it an attractive alternative to the linearized methods. Hence, it is concluded that the newly developed program offers an excellent complementary tool for the global inversion of the layered resistivity structure.
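
    A much-simplified caricature of the direct search (the real NA of Sambridge resamples within Voronoi cells of the current best models; the misfit function and parameter bounds below are placeholders):

```python
# Sketch: a simplified direct-search loop in the spirit of the
# Neighbourhood Algorithm. A real use would compute synthetic apparent
# resistivities for a layered earth and compare them with sounding data.
import numpy as np

rng = np.random.default_rng(0)

def misfit(model):
    # Placeholder objective with a known optimum (hypothetical parameters).
    target = np.array([2.0, 1.0, 2.5])
    return np.sum((model - target) ** 2)

lo, hi = np.array([0.0, 0.0, 0.0]), np.array([4.0, 3.0, 4.0])
ensemble = rng.uniform(lo, hi, size=(50, 3))      # ns initial random models
for _ in range(20):
    order = np.argsort([misfit(m) for m in ensemble])
    best = ensemble[order[:10]]                   # keep the nr best models
    # resample near the best models (the NA proper samples Voronoi cells)
    new = best[rng.integers(0, 10, size=50)] + rng.normal(0.0, 0.1, size=(50, 3))
    ensemble = np.clip(new, lo, hi)

print("best model found:", min(ensemble, key=misfit))
```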

  4. Study of abrasive resistance of foundries models obtained with use of additive technology

    NASA Astrophysics Data System (ADS)

    Ol'khovik, Evgeniy

    2017-10-01

    This study considers the resistance of foundry models and patterns made from ABS (PLA) plastic by FDM 3D printing to abrasive wear and to the environment of a foundry sand mould. The article describes the technique and equipment used to test casting models and patterns for wear, and the manufacture of the models on a 3D printer (additive technology). A vibration-loading scheme was applied to the test samples. To capture the influence of the sand mix on the plastic most realistically, models were also tested under real abrasive-wear conditions. The application of an acrylic paint coat and of a two-component coating to the plastic models was examined as well. Practical suggestions and recommendations are given for producing master models with FDM technology that achieve durability exceeding 2000 moulding cycles in foundry sand mix.

  5. Guarana Provides Additional Stimulation over Caffeine Alone in the Planarian Model

    PubMed Central

    Moustakas, Dimitrios; Mezzio, Michael; Rodriguez, Branden R.; Constable, Mic Andre; Mulligan, Margaret E.; Voura, Evelyn B.

    2015-01-01

    The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of guarana over caffeine alone. Yet, despite the growing use of guarana as an additive in energy drinks, and a burgeoning market for it as a nutritional supplement, the science examining guarana and how it affects other dietary ingredients is lacking. To appreciate the stimulant effects of guarana and other natural products, a straightforward model to investigate their physiological properties is needed. The planarian provides such a system. The locomotor activity and convulsive response of planarians with substance exposure has been shown to provide an excellent system to measure the effects of drug stimulation, addiction and withdrawal. To gauge the stimulant effects of guarana we studied how it altered the locomotor activity of the planarian species Dugesia tigrina. We report evidence that guarana seeds provide additional stimulation over caffeine alone, and document the changes to this stimulation in the context of both caffeine and glucose. PMID:25880065

  6. A feasibility study regarding the addition of a fifth control to a rotorcraft in-flight simulator

    NASA Technical Reports Server (NTRS)

    Turner, Simon; Andrisani, Dominick, II

    1992-01-01

    The addition of a large movable horizontal tail surface to the control system of a rotorcraft in-flight simulator being developed from a Sikorsky UH-60A Black Hawk Helicopter is evaluated. The capabilities of the control surface as a trim control and as an active control are explored. The helicopter dynamics are modeled using the Generic Helicopter simulation program developed by Sikorsky Aircraft. The effect of the horizontal tail on the helicopter trim envelope is examined by plotting trim maps of the aircraft attitude and controls as a function of the flight speed and horizontal tail incidence. The control power of the tail surface relative to that of the other controls is examined by comparing control derivatives extracted from the simulation program over the flight speed envelope. The horizontal tail's contribution as an active control is evaluated using an explicit model following control synthesis involving a linear model of the helicopter in steady, level flight at a flight speed of eighty knots. The horizontal tail is found to provide additional control flexibility in the longitudinal axis. As a trim control, it provides effective control of the trim pitch attitude at mid to high forward speeds. As an active control, the horizontal tail provides useful pitching moment generating capabilities at mid to high forward speeds.

  7. smoothG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barker, Andrew T.; Gelever, Stephan A.; Lee, Chak S.

    2017-12-12

    smoothG is a collection of parallel C++ classes/functions that algebraically construct reduced models of different resolutions from a given high-fidelity graph model. In addition, smoothG provides efficient linear solvers for the reduced models. Beyond pure graph problems, the software finds application in subsurface flow and power grid simulations, in which graph Laplacians arise.
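
    For orientation, the operator being coarsened is the weighted graph Laplacian; a toy assembly is sketched below (numpy only; this is not smoothG's API):

```python
# Sketch: assembling the weighted graph Laplacian L = D - A for a
# small 4-vertex graph (the operator smoothG coarsens algebraically).
import numpy as np

edges = [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 1.0), (0, 3, 0.5)]  # (i, j, weight)
n = 4
L = np.zeros((n, n))
for i, j, w in edges:
    L[i, i] += w
    L[j, j] += w
    L[i, j] -= w
    L[j, i] -= w

# Rank is n - 1 for a connected graph (the constant vector is the null space).
print(np.linalg.matrix_rank(L))
```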

  8. Comparing species distribution models constructed with different subsets of environmental predictors

    USGS Publications Warehouse

    Bucklin, David N.; Basille, Mathieu; Benscoter, Allison M.; Brandt, Laura A.; Mazzotti, Frank J.; Romañach, Stephanie S.; Speroterra, Carolina; Watling, James I.

    2014-01-01

    Our results indicate that additional predictors have relatively minor effects on the accuracy of climate-based species distribution models and minor to moderate effects on spatial predictions. We suggest that implementing species distribution models with only climate predictors may provide an effective and efficient approach for initial assessments of environmental suitability.

  9. The Manifest Association Structure of the Single-Factor Model: Insights from Partial Correlations

    ERIC Educational Resources Information Center

    Salgueiro, Maria de Fatima; Smith, Peter W. F.; McDonald, John W.

    2008-01-01

    The association structure between manifest variables arising from the single-factor model is investigated using partial correlations. The additional insights to the practitioner provided by partial correlations for detecting a single-factor model are discussed. The parameter space for the partial correlations is presented, as are the patterns of…

  10. A Modeling Approach to the Development of Students' Informal Inferential Reasoning

    ERIC Educational Resources Information Center

    Doerr, Helen M.; Delmas, Robert; Makar, Katie

    2017-01-01

    Teaching from an informal statistical inference perspective can address the challenge of teaching statistics in a coherent way. We argue that activities that promote model-based reasoning address two additional challenges: providing a coherent sequence of topics and promoting the application of knowledge to novel situations. We take a models and…

  11. Fundamentals of Adaptive Intelligent Tutoring Systems for Self-Regulated Learning

    DTIC Science & Technology

    2015-03-01

    has 4 fundamental elements: a learner model, a pedagogical (instructional) model, a domain model, and a communication model. Figure 5 shows a...The TUI has been discussed in detail, so now the learner, pedagogical, and domain modules will be reviewed: - Learner module. In addition to...shared states, which are provided to the pedagogical module. - Pedagogical module. The pedagogical module models the instructional techniques

  12. The stay/switch model describes choice among magnitudes of reinforcers.

    PubMed

    MacDonall, James S

    2008-06-01

    The stay/switch model is an alternative to the generalized matching law for describing choice in concurrent procedures. The purpose of the present experiment was to extend this model to choice among magnitudes of reinforcers. Rats were exposed to conditions in which the magnitude of reinforcers (number of food pellets) varied for staying at alternative 1, switching from alternative 1, staying at alternative 2 and switching from alternative 2. A changeover delay was not used. The results showed that the stay/switch model provided a good account of the data overall, and deviations from fits of the generalized matching law to response allocation data were in the direction predicted by the stay/switch model. In addition, comparisons among specific conditions suggested that varying the ratio of obtained reinforcers, as in the generalized matching law, was not necessary to change the response and time allocations. Other comparisons suggested that varying the ratio of obtained reinforcers was not sufficient to change response allocation. Taken together these results provide additional support for the stay/switch model of concurrent choice.
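
    For reference, the generalized matching law against which the stay/switch model is evaluated (standard form; the stay/switch model instead partitions behavior into stay and switch responses with separate reinforcer sources):

```latex
% Generalized matching law: B_i are responses at the two alternatives,
% R_i the obtained reinforcers, a the sensitivity, b the bias.
\log\!\left(\frac{B_1}{B_2}\right)
  = a\,\log\!\left(\frac{R_1}{R_2}\right) + \log b
```

    For example, with sensitivity a = 0.8 and no bias (b = 1), a 4:1 obtained-reinforcer ratio predicts a response ratio of about 4^0.8, roughly 3:1 rather than strict 4:1 matching.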

  13. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    EPA Pesticide Factsheets

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities using a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.
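
    As a caricature of the evidence-combination step, assuming conditional independence between the lines (the paper's network handles dependence and uncertainty explicitly; the likelihood ratios below are invented):

```python
# Sketch: combining three lines of evidence by naive-Bayes odds updating,
# a drastic simplification of a Bayesian network for impact determination.
prior = 0.20                          # P(adverse impact) before any evidence
odds = prior / (1 - prior)
likelihood_ratios = {
    "sediment chemistry": 3.0,        # P(evidence | impact) / P(evidence | none)
    "bioassay": 2.0,
    "benthic infauna": 0.8,           # this line weakly favors no impact
}
for line, lr in likelihood_ratios.items():
    odds *= lr
    print(f"after {line:18s}: P(impact) = {odds / (1 + odds):.2f}")
```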

  14. Potential uncertainty reduction in model-averaged benchmark dose estimates informed by an additional dose study.

    PubMed

    Shao, Kan; Small, Mitchell J

    2011-10-01

    A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted. © 2011 Society for Risk Analysis.
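
    A minimal sketch of the model-averaging step (hypothetical posterior BMD draws and model weights; the paper derives these from MCMC fits of the logistic and quantal-linear models):

```python
# Sketch: Bayesian model averaging of BMD posteriors from two fitted
# dose-response models, with the BMDL taken as a lower percentile.
import numpy as np

rng = np.random.default_rng(1)
bmd_logistic = rng.normal(1.20, 0.15, 5000)   # posterior BMD draws, model 1
bmd_quantal  = rng.normal(0.95, 0.25, 5000)   # posterior BMD draws, model 2
w = np.array([0.6, 0.4])                      # posterior model probabilities

# Mix the two posteriors in proportion to the model weights:
n = (w * 5000).astype(int)
bma = np.concatenate([rng.choice(bmd_logistic, n[0]),
                      rng.choice(bmd_quantal, n[1])])
bmd, bmdl = np.mean(bma), np.percentile(bma, 5)
print(f"BMA BMD = {bmd:.2f}, BMDL (5th percentile) = {bmdl:.2f}")
```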

  15. Content Model Use and Development to Redeem Thin Section Records

    NASA Astrophysics Data System (ADS)

    Hills, D. J.

    2014-12-01

    The National Geothermal Data System (NGDS) is a catalog of documents and datasets that provide information about geothermal resources located primarily within the United States. The goal of NGDS is to make large quantities of geothermal-relevant geoscience data available to the public by creating a national, sustainable, distributed, and interoperable network of data providers. The Geological Survey of Alabama (GSA) has been a data provider in the initial phase of NGDS. One method by which NGDS facilitates interoperability is through the use of content models. Content models provide a schema (structure) for submitted data. Schemas dictate where and how data should be entered. Content models use templates that simplify data formatting to expedite use by data providers. These methodologies implemented by NGDS can extend beyond geothermal data to all geoscience data. The GSA, using the NGDS physical samples content model, has tested and refined a content model for thin sections and thin section photos. Countless thin sections have been taken from oil and gas well cores housed at the GSA, and many of those thin sections have related photomicrographs. Record keeping for these thin sections has been scattered at best, and it is critical to capture their metadata while the content creators are still available. A next step will be to register the GSA's thin sections with SESAR (System for Earth Sample Registration) and assign an IGSN (International Geo Sample Number) to each thin section. Additionally, the thin section records will be linked to the GSA's online record database. When complete, the GSA's thin sections will be more readily discoverable and have greater interoperability. Moving forward, the GSA is implementing use of NGDS-like content models and registration with SESAR and IGSN to improve collection maintenance and management of additional physical samples.

  16. RenderView: physics-based multi- and hyperspectral rendering using measured background panoramics

    NASA Astrophysics Data System (ADS)

    Talcott, Denise M.; Brown, Wade W.; Thomas, David J.

    2003-09-01

    As part of the survivability engineering process it is necessary to accurately model and visualize the vehicle signatures in multi- or hyperspectral bands of interest. The signature at a given wavelength is a function of the surface optical properties, reflection of the background and, in the thermal region, the emission of thermal radiation. Currently, it is difficult to obtain and utilize background models that are of sufficient fidelity when compared with the vehicle models. In addition, the background models create an additional layer of uncertainty in estimating the vehicle's signature. Therefore, to meet exacting rendering requirements we have developed RenderView, which incorporates the full bidirectional reflectance distribution function (BRDF). Instead of using a modeled background we have incorporated a measured calibrated background panoramic image to provide the high fidelity background interaction. Uncertainty in the background signature is reduced to the error in the measurement, which is considerably smaller than the uncertainty inherent in a modeled background. RenderView utilizes a number of different descriptions of the BRDF, including the Sandford-Robertson model. In addition, it provides complete conservation of energy with off axis sampling. A description of RenderView will be presented along with a methodology developed for collecting background panoramics. Examples of the RenderView output and the background panoramics will be presented along with our approach to handling the solar irradiance problem.

  17. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    NASA Astrophysics Data System (ADS)

    Florian Wellmann, J.; Thiele, Sam T.; Lindsay, Mark D.; Jessell, Mark W.

    2016-03-01

    We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilize the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on github, including documentation and tutorial examples, and we encourage the contribution to this project.
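
    Since the modules are Python, the basic workflow can be sketched; the calls below paraphrase the package's tutorial usage (API names may differ across versions, the Noddy executable must be installed, and model.his is a hypothetical kinematic history file):

```python
# Sketch of a pynoddy run, paraphrased from the project documentation
# (assumed API; check the version you install).
import pynoddy
import pynoddy.history
import pynoddy.output

history = pynoddy.history.NoddyHistory("model.his")  # load kinematic events
pynoddy.compute_model("model.his", "model_out")      # run the Noddy kernel
out = pynoddy.output.NoddyOutput("model_out")        # gridded lithology block
print(out.block.shape)                               # (nx, ny, nz) model grid
```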

  18. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    NASA Astrophysics Data System (ADS)

    Wellmann, J. F.; Thiele, S. T.; Lindsay, M. D.; Jessell, M. W.

    2015-11-01

    We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilise the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on github, including documentation and tutorial examples, and we encourage the contribution to this project.

  19. FPGA implementation of a biological neural network based on the Hodgkin-Huxley neuron model.

    PubMed

    Yaghini Bonabi, Safa; Asgharian, Hassan; Safari, Saeed; Nili Ahmadabadi, Majid

    2014-01-01

    A set of techniques for efficient implementation of a Hodgkin-Huxley-based (H-H) model of a neural network on an FPGA (Field Programmable Gate Array) is presented. The central implementation challenge is the complexity of the H-H model, which puts limits on the network size and on the execution speed. However, the basics of the original model cannot be compromised when the effect of synaptic specifications on network behavior is the subject of study. To solve the problem, we used computational techniques such as the CORDIC (Coordinate Rotation Digital Computer) algorithm and step-by-step integration in the implementation of the arithmetic circuits. In addition, we employed techniques such as resource sharing to preserve the details of the model and to increase the network size while keeping the network execution speed close to real time at high precision. An implementation of a two-mini-column network with 120/30 excitatory/inhibitory neurons is provided to investigate the characteristics of our method in practice. The implementation techniques provide an opportunity to construct large FPGA-based network models to investigate the effect of different neurophysiological mechanisms, like voltage-gated channels and synaptic activities, on the behavior of a neural network in an appropriate execution time. In addition to the inherent properties of FPGAs, like parallelism and re-configurability, this makes the FPGA-based system a proper candidate for studies of the neural control of cognitive robots and systems.
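
    For context, the per-neuron system that must be integrated in hardware is the standard H-H form; the voltage-dependent rate functions contain the exponentials that motivate the CORDIC arithmetic mentioned above:

```latex
% Hodgkin-Huxley membrane and gating equations (standard form):
C_m \frac{dV}{dt} = -\bar g_{\mathrm{Na}}\, m^{3} h\,(V - E_{\mathrm{Na}})
                    -\bar g_{\mathrm{K}}\, n^{4}\,(V - E_{\mathrm{K}})
                    -\bar g_{L}\,(V - E_{L}) + I_{\mathrm{syn}},
\qquad
\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\,x,
\quad x \in \{m, h, n\}
```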

  20. Additive Manufacturing/Diagnostics via the High Frequency Induction Heating of Metal Powders: The Determination of the Power Transfer Factor for Fine Metallic Spheres

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rios, Orlando; Radhakrishnan, Balasubramaniam; Caravias, George

    2015-03-11

    Grid Logic Inc. is developing a method for sintering and melting fine metallic powders for additive manufacturing using spatially-compact, high-frequency magnetic fields called Micro-Induction Sintering (MIS). One of the challenges in advancing MIS technology for additive manufacturing is in understanding the power transfer to the particles in a powder bed. This knowledge is important to achieving efficient power transfer, control, and selective particle heating during the MIS process needed for commercialization of the technology. The project's work provided a rigorous physics-based model for induction heating of fine spherical particles as a function of frequency and particle size. This simulation improved upon Grid Logic's earlier models and provides guidance that will make the MIS technology more effective. The project model will be incorporated into Grid Logic's power control circuit of the MIS 3D printer product and its diagnostics technology to optimize the sintering process for part quality and energy efficiency.

  1. Evaluation of RCAS Inflow Models for Wind Turbine Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tangler, J.; Bir, G.

    The finite element structural modeling in the Rotorcraft Comprehensive Analysis System (RCAS) provides a state-of-the-art approach to aeroelastic analysis. This, coupled with its ability to model all turbine components, results in a methodology that can simulate complex system interactions characteristic of large wind turbines. In addition, RCAS is uniquely capable of modeling advanced control algorithms and the resulting dynamic responses.

  2. Generalized linear and generalized additive models in studies of species distributions: Setting the scene

    USGS Publications Warehouse

    Guisan, Antoine; Edwards, T.C.; Hastie, T.

    2002-01-01

    An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled: Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of their application to ecological modeling. © 2002 Elsevier Science B.V. All rights reserved.
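
    As a minimal illustration of the two model classes (synthetic presence/absence data; the statsmodels API is one possible implementation, and its interface may evolve across versions):

```python
# Sketch: binomial GLM vs. a GAM-style penalized smooth for species
# presence/absence as a function of elevation (synthetic data).
import numpy as np
import statsmodels.api as sm
from statsmodels.gam.api import GLMGam, BSplines

rng = np.random.default_rng(2)
elev = rng.uniform(200.0, 2200.0, 300)                 # elevation, m
p_true = 1.0 / (1.0 + np.exp(-(-6.0 + 0.008 * elev)))  # occupancy rises with elevation
present = rng.binomial(1, p_true)

# GLM: logit linear in elevation
glm = sm.GLM(present, sm.add_constant(elev), family=sm.families.Binomial()).fit()

# GAM: penalized B-spline smooth of elevation (explicit intercept column)
smoother = BSplines(elev[:, None], df=[6], degree=[3])
gam = GLMGam(present, exog=np.ones((elev.size, 1)), smoother=smoother,
             family=sm.families.Binomial()).fit()
print(glm.params)
print(gam.params)
```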

  3. Nurse practitioners and physician assistants: preparing new providers for hospital medicine at the mayo clinic.

    PubMed

    Spychalla, Megan T; Heathman, Joanne H; Pearson, Katherine A; Herber, Andrew J; Newman, James S

    2014-01-01

    Hospital medicine is a growing field with an increasing demand for additional healthcare providers, especially in the face of an aging population. Reductions in resident duty hours, coupled with a continued deficit of medical school graduates to appropriately meet the demand, require an additional workforce to counter the shortage. A major dilemma of incorporating nonphysician providers such as nurse practitioners and physician assistants (NPPAs) into a hospital medicine practice is their varying academic backgrounds and inpatient care experiences. Medical institutions seeking to add NPPAs to their hospital medicine practice need a structured orientation program and ongoing NPPA educational support. This article outlines an NPPA orientation and training program within the Division of Hospital Internal Medicine (HIM) at the Mayo Clinic in Rochester, MN. In addition to a practical orientation program that other institutions can model and implement, the division of HIM also developed supplemental learning modalities to maintain ongoing NPPA competencies and fill learning gaps, including a formal NPPA hospital medicine continuing medical education (CME) course, an NPPA simulation-based boot camp, and the first hospital-based NPPA grand rounds offering CME credit. Since the NPPA orientation and training program was implemented, NPPAs within the division of HIM have gained a reputation for possessing a strong clinical skill set coupled with a depth of knowledge in hospital medicine. The NPPA-physician model serves as an alternative care practice, and we believe that with the institution of modalities, including a structured orientation program, didactic support, hands-on learning, and professional growth opportunities, NPPAs are capable of fulfilling the gap created by provider shortages and resident duty hour restrictions. Additionally, the use of NPPAs in hospital medicine allows for patient care continuity that is otherwise missing with resident practice models.

  4. Contribution of Submarine Groundwater on the Water-Food Nexus in Coastal Ecosystems: Effects on Biodiversity and Fishery Production

    NASA Astrophysics Data System (ADS)

    Shoji, J.; Sugimoto, R.; Honda, H.; Tominaga, O.; Taniguchi, M.

    2014-12-01

    In the past decade, machine-learning methods for empirical rainfall-runoff modeling have seen extensive development. However, the majority of research has focused on a small number of methods, such as artificial neural networks, while not considering other approaches for non-parametric regression that have been developed in recent years. These methods may be able to achieve comparable predictive accuracy to ANNs and more easily provide physical insights into the system of interest through evaluation of covariate influence. Additionally, these methods could provide a straightforward, computationally efficient way of evaluating climate change impacts in basins where data to support physical hydrologic models is limited. In this paper, we use multiple regression and machine-learning approaches to predict monthly streamflow in five highly-seasonal rivers in the highlands of Ethiopia. We find that generalized additive models, random forests, and cubist models achieve better predictive accuracy than ANNs in many basins assessed and are also able to outperform physical models developed for the same region. We discuss some challenges that could hinder the use of such models for climate impact assessment, such as biases resulting from model formulation and prediction under extreme climate conditions, and suggest methods for preventing and addressing these challenges. Finally, we demonstrate how predictor variable influence can be assessed to provide insights into the physical functioning of data-sparse watersheds.
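
    A sketch of one of the compared approaches (a random forest on synthetic monthly predictors; the study's predictor set and basins are not reproduced):

```python
# Sketch: random-forest monthly streamflow regression with a train/test
# split and covariate-influence readout (synthetic data; scikit-learn).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n = 240                                           # 20 years of monthly records
X = np.column_stack([
    rng.gamma(2.0, 50.0, n),                      # monthly rainfall (mm)
    rng.normal(18.0, 4.0, n),                     # mean temperature (deg C)
    np.tile(np.arange(12), 20),                   # month index (seasonality)
])
q = (0.5 * X[:, 0]                                # synthetic flow response
     + 15.0 * np.sin(2 * np.pi * X[:, 2] / 12.0)
     + rng.normal(0.0, 10.0, n))

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X[:180], q[:180])
print("test R^2:", model.score(X[180:], q[180:]))
print("covariate importances:", model.feature_importances_)
```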

  5. Structured functional additive regression in reproducing kernel Hilbert spaces.

    PubMed

    Zhu, Hongxiao; Yao, Fang; Zhang, Hao Helen

    2014-06-01

    Functional additive models (FAMs) provide a flexible yet simple framework for regressions involving functional predictors. The utilization of data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting nonlinear additive components has been less studied. In this work, we propose a new regularization framework for the structure estimation in the context of Reproducing Kernel Hilbert Spaces. The proposed approach takes advantage of the functional principal components which greatly facilitates the implementation and the theoretical analysis. The selection and estimation are achieved by penalized least squares using a penalty which encourages the sparse structure of the additive components. Theoretical properties such as the rate of convergence are investigated. The empirical performance is demonstrated through simulation studies and a real data application.
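
    In its usual form on functional principal component scores, the model class under study is as follows (the paper's RKHS penalty structure is not reproduced here):

```latex
% Functional additive model: x_i(t) is the functional predictor and
% \xi_{ij} is its j-th functional principal component score.
y_i = \mu + \sum_{j=1}^{p} f_j(\xi_{ij}) + \varepsilon_i
% Structure selection penalizes the f_j so that only a sparse subset of
% additive components remains nonzero in the fitted model.
```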

  6. Half a degree additional warming, prognosis and projected impacts (HAPPI): background and experimental design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Daniel; AchutaRao, Krishna; Allen, Myles

    The Intergovernmental Panel on Climate Change (IPCC) has accepted the invitation from the UNFCCC to provide a special report on the impacts of global warming of 1.5 °C above pre-industrial levels and on related global greenhouse-gas emission pathways. Many current experiments in, for example, the Coupled Model Inter-comparison Project (CMIP), are not specifically designed for informing this report. Here, we document the design of the half a degree additional warming, projections, prognosis and impacts (HAPPI) experiment. HAPPI provides a framework for the generation of climate data describing how the climate, and in particular extreme weather, might differ from the present day in worlds that are 1.5 and 2.0 °C warmer than pre-industrial conditions. Output from participating climate models includes variables frequently used by a range of impact models. The key challenge is to separate the impact of an additional approximately half degree of warming from uncertainty in climate model responses and internal climate variability that dominate CMIP-style experiments under low-emission scenarios. Large ensembles of simulations (> 50 members) of atmosphere-only models for three time slices are proposed, each a decade in length: the first being the most recent observed 10-year period (2006–2015), the second two being estimates of a similar decade but under 1.5 and 2 °C conditions a century in the future. We use the representative concentration pathway 2.6 (RCP2.6) to provide the model boundary conditions for the 1.5 °C scenario, and a weighted combination of RCP2.6 and RCP4.5 for the 2 °C scenario.

  7. Half a degree additional warming, prognosis and projected impacts (HAPPI): background and experimental design

    DOE PAGES

    Mitchell, Daniel; AchutaRao, Krishna; Allen, Myles; ...

    2017-02-08

    The Intergovernmental Panel on Climate Change (IPCC) has accepted the invitation from the UNFCCC to provide a special report on the impacts of global warming of 1.5 °C above pre-industrial levels and on related global greenhouse-gas emission pathways. Many current experiments in, for example, the Coupled Model Inter-comparison Project (CMIP), are not specifically designed for informing this report. Here, we document the design of the half a degree additional warming, projections, prognosis and impacts (HAPPI) experiment. HAPPI provides a framework for the generation of climate data describing how the climate, and in particular extreme weather, might differ from the present day in worlds that are 1.5 and 2.0 °C warmer than pre-industrial conditions. Output from participating climate models includes variables frequently used by a range of impact models. The key challenge is to separate the impact of an additional approximately half degree of warming from uncertainty in climate model responses and internal climate variability that dominate CMIP-style experiments under low-emission scenarios. Large ensembles of simulations (> 50 members) of atmosphere-only models for three time slices are proposed, each a decade in length: the first being the most recent observed 10-year period (2006–2015), the second two being estimates of a similar decade but under 1.5 and 2 °C conditions a century in the future. We use the representative concentration pathway 2.6 (RCP2.6) to provide the model boundary conditions for the 1.5 °C scenario, and a weighted combination of RCP2.6 and RCP4.5 for the 2 °C scenario.

  8. Manpower Mix for Health Services

    PubMed Central

    Shuman, Larry J.; Young, John P.; Naddor, Eliezer

    1971-01-01

    A model is formulated to determine the mix of manpower and technology needed to provide health services of acceptable quality at a minimum total cost to the community. Total costs include both the direct costs associated with providing the services and with developing additional manpower and the indirect costs (shortage costs) resulting from not providing needed services. The model is applied to a hypothetical neighborhood health center, and its sensitivity to alternative policies is investigated by cost-benefit analyses. Possible extensions of the model to include dynamic elements in health delivery systems are discussed, as is its adaptation for use in hospital planning, with a changed objective function. PMID:5095652
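
    Read as an optimization, the model minimizes direct plus shortage cost subject to service and quality constraints; a toy linear-program sketch with invented coefficients (the paper's formulation is richer, and staffing levels here are continuous):

```python
# Sketch: manpower-mix LP. Minimize direct staffing cost plus a shortage
# cost on unmet visits, subject to demand coverage and a quality ratio.
from scipy.optimize import linprog

# variables: x = [physicians, nurse_practitioners, aides, unmet_visits]
cost = [120.0, 60.0, 25.0, 200.0]   # direct cost per unit; last = shortage cost

A_ub = [
    [-180.0, -120.0, -40.0, -1.0],  # visits supplied + unmet >= 1000 demanded
    [-4.0, 1.0, 0.0, 0.0],          # quality: at most 4 NPs per physician
]
b_ub = [-1000.0, 0.0]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print("staffing mix:", res.x, "total cost:", res.fun)
```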

  9. NASA Handbook for Models and Simulations: An Implementation Guide for NASA-STD-7009

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2013-01-01

    The purpose of this Handbook is to provide technical information, clarification, examples, processes, and techniques to help institute good modeling and simulation practices in the National Aeronautics and Space Administration (NASA). As a companion guide to NASA-STD-7009, Standard for Models and Simulations, this Handbook provides a broader scope of information than may be included in a Standard and promotes good practices in the production, use, and consumption of NASA modeling and simulation products. NASA-STD-7009 specifies what a modeling and simulation activity shall or should do (in the requirements) but does not prescribe how the requirements are to be met, which varies with the specific engineering discipline, or who is responsible for complying with the requirements, which depends on the size and type of project. A guidance document, which is not constrained by the requirements of a Standard, is better suited to address these additional aspects and provide necessary clarification. This Handbook stems from the Space Shuttle Columbia Accident Investigation (2003), which called for Agency-wide improvements in the "development, documentation, and operation of models and simulations" that subsequently elicited additional guidance from the NASA Office of the Chief Engineer to include "a standard method to assess the credibility of the models and simulations." General methods applicable across the broad spectrum of model and simulation (M&S) disciplines were sought to help guide the modeling and simulation processes within NASA and to provide for consistent reporting of M&S activities and analysis results. From this, the standardized process for the M&S activity was developed. The major contents of this Handbook are the implementation details of the general M&S requirements of NASA-STD-7009, including explanations, examples, and suggestions for improving the credibility assessment of an M&S-based analysis.

  10. Reciprocal relations between cognitive neuroscience and formal cognitive models: opposites attract?

    PubMed

    Forstmann, Birte U; Wagenmakers, Eric-Jan; Eichele, Tom; Brown, Scott; Serences, John T

    2011-06-01

    Cognitive neuroscientists study how the brain implements particular cognitive processes such as perception, learning, and decision-making. Traditional approaches in which experiments are designed to target a specific cognitive process have been supplemented by two recent innovations. First, formal cognitive models can decompose observed behavioral data into multiple latent cognitive processes, allowing brain measurements to be associated with a particular cognitive process more precisely and more confidently. Second, cognitive neuroscience can provide additional data to inform the development of formal cognitive models, providing greater constraint than behavioral data alone. We argue that these fields are mutually dependent; not only can models guide neuroscientific endeavors, but understanding neural mechanisms can provide key insights into formal models of cognition. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Multiaxial Fatigue Damage Parameter and Life Prediction without Any Additional Material Constants

    PubMed Central

    Yu, Zheng-Yong; Liu, Qiang; Liu, Yunhan

    2017-01-01

    Based on the critical plane approach, a simple and efficient multiaxial fatigue damage parameter with no additional material constants is proposed for life prediction under uniaxial/multiaxial proportional and/or non-proportional loadings for titanium alloy TC4 and nickel-based superalloy GH4169. Moreover, two modified Ince-Glinka fatigue damage parameters are put forward and evaluated under different load paths. Results show that the generalized strain amplitude model provides less accurate life predictions in the high cycle life regime and is better for life prediction in the low cycle life regime; however, the generalized strain energy model is relatively better for high cycle life prediction and is conservative for low cycle life prediction under multiaxial loadings. In addition, the Fatemi–Socie model is introduced for model comparison and its additional material parameter k is found to not be a constant and its usage is discussed. Finally, model comparison and prediction error analysis are used to illustrate the superiority of the proposed damage parameter in multiaxial fatigue life prediction of the two aviation alloys under various loadings. PMID:28792487
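
    For reference, the Fatemi-Socie parameter referred to above, in its standard form; k is the additional material constant that the proposed parameter avoids:

```latex
% Fatemi-Socie critical-plane damage parameter and shear-based life relation:
\frac{\Delta\gamma_{\max}}{2}
  \left(1 + k\,\frac{\sigma_{n,\max}}{\sigma_{y}}\right)
  = \frac{\tau_f'}{G}\,(2N_f)^{b_0} + \gamma_f'\,(2N_f)^{c_0}
```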

  12. Multiaxial Fatigue Damage Parameter and Life Prediction without Any Additional Material Constants.

    PubMed

    Yu, Zheng-Yong; Zhu, Shun-Peng; Liu, Qiang; Liu, Yunhan

    2017-08-09

    Based on the critical plane approach, a simple and efficient multiaxial fatigue damage parameter with no additional material constants is proposed for life prediction under uniaxial/multiaxial proportional and/or non-proportional loadings for titanium alloy TC4 and nickel-based superalloy GH4169. Moreover, two modified Ince-Glinka fatigue damage parameters are put forward and evaluated under different load paths. Results show that the generalized strain amplitude model provides less accurate life predictions in the high cycle life regime and is better for life prediction in the low cycle life regime; however, the generalized strain energy model is relatively better for high cycle life prediction and is conservative for low cycle life prediction under multiaxial loadings. In addition, the Fatemi-Socie model is introduced for model comparison and its additional material parameter k is found to not be a constant and its usage is discussed. Finally, model comparison and prediction error analysis are used to illustrate the superiority of the proposed damage parameter in multiaxial fatigue life prediction of the two aviation alloys under various loadings.

  13. Determination of Littlest Higgs Model Parameters at the ILC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conley, John A.; Hewett, JoAnne; Le, My Phuong

    2005-07-27

    We examine the effects of the extended gauge sector of the Littlest Higgs model in high energy e⁺e⁻ collisions. We find that the search reach in e⁺e⁻ → ff̄ at a √s = 500 GeV International Linear Collider covers essentially the entire parameter region where the Littlest Higgs model is relevant to the gauge hierarchy problem. In addition, we show that this channel provides an accurate determination of the fundamental model parameters, to the precision of a few percent, provided that the LHC measures the mass of the heavy neutral gauge field. Additionally, we show that the couplings of the extra gauge bosons to the light Higgs can be observed from the process e⁺e⁻ → Zh for a significant region of the parameter space. This allows for confirmation of the structure of the cancellation of the Higgs mass quadratic divergence and would verify the little Higgs mechanism.

  14. Mechanics of a mosquito bite with applications to microneedle design.

    PubMed

    Ramasubramanian, M K; Barham, O M; Swaminathan, V

    2008-12-01

    The mechanics of a fascicle insertion into the skin by a mosquito of the type aedes aegypti has been studied experimentally using high-speed video (HSV) imaging, and analytically using a mathematical model. The fascicle is a polymeric microneedle composed of a ductile material, chitin. It has been proposed that the mosquito applies a non-conservative follower force component in addition to the Euler compressive load in order to prevent buckling and penetrate the skin. In addition, the protective sheath surrounding the fascicle (labium) provides lateral support during insertion. The mechanics model presented approximates the fascicle as a slender column supported on an elastic foundation (labium) subjected to non-conservative (Beck) and conservative Euler loads simultaneously at the end. Results show that the lateral support of the fascicle provided by the labium is essential for successful penetration by increasing the critical buckling load by a factor of 5. The non-conservative follower force application increases the buckling load by an additional 20% and may or may not be necessary for successful penetration. Experimental results showing the importance of the labium have been cited to validate the model predictions, in addition to the video observations presented in this work. This understanding may be useful in designing painless needle insertion systems as opposed to miniaturized hypodermic needles.
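
    The reported load factors can be read against the classical results underlying the model (standard pinned-column formulas; the paper's follower-force analysis is more involved):

```latex
% Euler buckling of a pinned-pinned column (the unsupported fascicle):
P_{cr} = \frac{\pi^{2} E I}{L^{2}}
% With a Winkler elastic foundation of modulus k (the labium's lateral
% support), buckling in mode m requires
P_{cr}(m) = \frac{\pi^{2} E I}{L^{2}}
  \left(m^{2} + \frac{k\,L^{4}}{\pi^{4} E I\, m^{2}}\right),
% minimized over integer m; the foundation raises the critical load,
% consistent with the factor-of-5 increase reported above.
```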

  15. Conservative Exposure Predictions for Rapid Risk Assessment of Phase-Separated Additives in Medical Device Polymers.

    PubMed

    Chandrasekar, Vaishnavi; Janes, Dustin W; Saylor, David M; Hood, Alan; Bajaj, Akhil; Duncan, Timothy V; Zheng, Jiwen; Isayeva, Irada S; Forrey, Christopher; Casey, Brendan J

    2018-01-01

    A novel approach for rapid risk assessment of targeted leachables in medical device polymers is proposed and validated. Risk evaluation involves understanding the potential of these additives to migrate out of the polymer, and comparing their exposure to a toxicological threshold value. In this study, we propose that a simple diffusive transport model can be used to provide conservative exposure estimates for phase-separated color additives in device polymers. This model has been illustrated using a representative phthalocyanine color additive (manganese phthalocyanine, MnPC) and polymer (PEBAX 2533) system. Sorption experiments of MnPC into PEBAX were conducted in order to experimentally determine the diffusion coefficient, D = (1.6 ± 0.5) × 10⁻¹¹ cm²/s, and matrix solubility limit, C_s = 0.089 wt.%, and model-predicted exposure values were validated by extraction experiments. Exposure values for the color additive were compared to a toxicological threshold for a sample risk assessment. Results from this study indicate that a diffusion model-based approach to predict exposure has considerable potential for use as a rapid, screening-level tool to assess the risk of color additives and other small molecule additives in medical device polymers.
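
    Given D, a conservative early-time exposure estimate follows from Crank's short-time solution for a plane sheet; a sketch with hypothetical geometry and loading:

```python
# Sketch: early-time Fickian release fraction M_t/M_inf = (4/L) * sqrt(D t / pi)
# for a sheet of thickness L releasing from both faces (valid below ~0.6).
# D is the study's measured value; L and M_inf are hypothetical.
import math

D = 1.6e-11          # cm^2/s, diffusion coefficient for MnPC in PEBAX
L = 0.05             # cm, sheet thickness (hypothetical device wall)
M_inf = 0.5          # mg, total extractable additive (hypothetical)

def released_fraction(t_seconds):
    """Fraction of additive released after t_seconds (short-time regime)."""
    frac = (4.0 / L) * math.sqrt(D * t_seconds / math.pi)
    return min(frac, 1.0)

t = 30 * 24 * 3600   # a 30-day exposure
print(f"released after 30 d: {released_fraction(t) * M_inf * 1000:.0f} ug")
```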

  16. Practical Approaches to Protein Folding and Assembly

    PubMed Central

    Walters, Jad; Milam, Sara L.; Clark, A. Clay

    2009-01-01

    We describe here the use of several spectroscopies, such as fluorescence emission, circular dichroism, and differential quenching by acrylamide, in examining the equilibrium and kinetic folding of proteins. The first section regarding equilibrium techniques provides practical information for determining the conformational stability of a protein. In addition, several equilibrium-folding models are discussed, from two-state monomer to four-state homodimer, providing a comprehensive protocol for interpretation of folding curves. The second section focuses on the experimental design and interpretation of kinetic data, such as burst-phase analysis and exponential fits, used in elucidating kinetic folding pathways. In addition, simulation programs are used routinely to support folding models generated by kinetic experiments, and the fundamentals of simulations are covered. PMID:19289201
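
    For the two-state monomer model mentioned above, conformational stability is commonly extracted with the linear extrapolation relation ΔG([D]) = ΔG(H₂O) − m[D], from which the fraction unfolded follows via the equilibrium constant. A minimal sketch, with hypothetical ΔG(H₂O) and m values:

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)

def fraction_unfolded(denaturant, dG_h2o, m, T=298.15):
    """Two-state monomer, linear extrapolation model:
    dG([D]) = dG(H2O) - m*[D];  K = exp(-dG/RT);  fU = K/(1+K)."""
    dG = dG_h2o - m * denaturant
    K = math.exp(-dG / (R * T))
    return K / (1.0 + K)

# Hypothetical stability: dG(H2O) = 5 kcal/mol, m = 1.8 kcal/(mol*M)
for d in (0.0, 2.0, 2.78, 4.0):   # denaturant concentrations (M)
    print(d, "M:", round(fraction_unfolded(d, 5.0, 1.8), 3))
```

    At [D] = ΔG(H₂O)/m (here about 2.78 M) the fraction unfolded passes through 0.5, which is the familiar midpoint of an equilibrium unfolding curve.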

  17. Neural and Computational Mechanisms of Action Processing: Interaction between Visual and Motor Representations.

    PubMed

    Giese, Martin A; Rizzolatti, Giacomo

    2015-10-07

    Action recognition has received enormous interest in the field of neuroscience over the last two decades. In spite of this interest, knowledge of the fundamental neural mechanisms that constrain the underlying computations remains rather limited. This fact stands in contrast with a wide variety of speculative theories about how action recognition might work. This review focuses on new fundamental electrophysiological results in monkeys, which provide constraints for the detailed underlying computations. In addition, we review models for action recognition and processing that have concrete mathematical implementations, as opposed to conceptual models. We think that only such implemented models can be meaningfully linked quantitatively to physiological data and have the potential to narrow down the many possible computational explanations for action recognition. In addition, only concrete implementations allow judging whether postulated computational concepts have a feasible implementation in terms of realistic neural circuits.

  18. Recent Progress Towards Predicting Aircraft Ground Handling Performance

    NASA Technical Reports Server (NTRS)

    Yager, T. J.; White, E. J.

    1981-01-01

    The significant progress which has been achieved in development of aircraft ground handling simulation capability is reviewed and additional improvements in software modeling are identified. The problem associated with providing the necessary simulator input data for adequate modeling of aircraft tire/runway friction behavior is discussed, and efforts to improve this complex model, and hence simulator fidelity, are described. Aircraft braking performance data obtained on several wet runway surfaces are compared to ground vehicle friction measurements and, by use of empirically derived methods, good agreement between actual and estimated aircraft braking friction from ground vehicle data is shown. A relatively new friction measuring device, the friction tester, showed great promise in providing data applicable to aircraft friction performance. Additional research efforts to improve methods of predicting tire friction performance are discussed, including use of an instrumented tire test vehicle to expand the tire friction data bank and a study of surface texture measurement techniques.

  19. MULTI: a shared memory approach to cooperative molecular modeling.

    PubMed

    Darden, T; Johnson, P; Smith, H

    1991-03-01

    A general purpose molecular modeling system, MULTI, based on the UNIX shared memory and semaphore facilities for interprocess communication is described. In addition to the normal querying or monitoring of geometric data, MULTI also provides processes for manipulating conformations, and for displaying peptide or nucleic acid ribbons, Connolly surfaces, close nonbonded contacts, crystal-symmetry related images, least-squares superpositions, and so forth. This paper outlines the basic techniques used in MULTI to ensure cooperation among these specialized processes, and then describes how they can work together to provide a flexible modeling environment.

  20. Overview of Heat Addition and Efficiency Predictions for an Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Wilson, Scott D.; Reid, Terry; Schifer, Nicholas; Briggs, Maxwell

    2011-01-01

    Past methods of predicting net heat input needed to be validated. The validation effort pursued several paths, including improving model inputs, using test hardware to provide validation data, and validating high-fidelity models. Validation test hardware provided a direct measurement of net heat input for comparison to predicted values. The predicted value of net heat input was 1.7 percent less than the measured value, and initial calculations of measurement uncertainty were 2.1 percent (under review). Lessons learned during the validation effort were incorporated into the convertor modeling approach, which improved predictions of convertor efficiency.

  1. Logic Models as a Way to Support Online Students and Their Projects

    ERIC Educational Resources Information Center

    Strycker, Jesse

    2016-01-01

    As online enrollment continues to grow, students may need additional pedagogical supports to increase their likelihood of success in online environments that don't offer the same supports as those found in face to face classrooms. Logic models are a way to provide such support to students by helping to model project expectations, allowing students…

  2. Revisiting Frazier's subdeltas: enhancing datasets with dimensionality, better to understand geologic systems

    USGS Publications Warehouse

    Flocks, James

    2006-01-01

    Scientific knowledge from the past century is commonly represented by two-dimensional figures and graphs, as presented in manuscripts and maps. Using today's computer technology, this information can be extracted and projected into three- and four-dimensional perspectives. Computer models can be applied to datasets to provide additional insight into complex spatial and temporal systems. This process can be demonstrated by applying digitizing and modeling techniques to valuable information within widely used publications. The seminal paper by D. Frazier, published in 1967, identified 16 separate delta lobes formed by the Mississippi River during the past 6,000 yrs. The paper includes stratigraphic descriptions through geologic cross-sections, and provides distribution and chronologies of the delta lobes. The data from Frazier's publication are extensively referenced in the literature. Additional information can be extracted from the data through computer modeling. Digitizing and geo-rectifying Frazier's geologic cross-sections produce a three-dimensional perspective of the delta lobes. Adding the chronological data included in the report provides the fourth-dimension of the delta cycles, which can be visualized through computer-generated animation. Supplemental information can be added to the model, such as post-abandonment subsidence of the delta-lobe surface. Analyzing the regional, net surface-elevation balance between delta progradations and land subsidence is computationally intensive. By visualizing this process during the past 4,500 yrs through multi-dimensional animation, the importance of sediment compaction in influencing both the shape and direction of subsequent delta progradations becomes apparent. Visualization enhances a classic dataset, and can be further refined using additional data, as well as provide a guide for identifying future areas of study.

  3. Mean-variance model for portfolio optimization with background risk based on uncertainty theory

    NASA Astrophysics Data System (ADS)

    Zhai, Jia; Bai, Manying

    2018-04-01

    The aim of this paper is to develop a mean-variance model for portfolio optimization considering background risk, liquidity, and transaction cost based on uncertainty theory. In the portfolio selection problem, returns of securities and asset liquidity are treated as uncertain variables because of incidents or a lack of historical data, which are common in economic and social environments. We provide crisp forms of the model and a hybrid intelligent algorithm to solve it. Under a mean-variance framework, we analyze the portfolio frontier characteristic considering independently additive background risk. In addition, we discuss some effects of background risk and liquidity constraints on the portfolio selection. Finally, we demonstrate the proposed models by numerical simulations.
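
    A crisp numerical reading of such a model: because independently additive background risk contributes a fixed variance term, the objective becomes wᵀΣw + σ_b², while budget and liquidity constraints shape the feasible set. The sketch below is a simplified stand-in with made-up returns and covariances, and bounds standing in for liquidity limits; it is not the paper's uncertainty-theoretic formulation or its hybrid intelligent algorithm.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical expected returns and covariance for 3 securities
mu = np.array([0.08, 0.12, 0.10])
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])
var_background = 0.01     # independent additive background risk
target_return = 0.10

def total_variance(w):
    # Background variance is additive and independent of the weights.
    return w @ Sigma @ w + var_background

cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},
        {'type': 'eq', 'fun': lambda w: w @ mu - target_return})
bounds = [(0.0, 1.0)] * 3   # no short selling (stand-in for liquidity limits)

res = minimize(total_variance, x0=np.ones(3) / 3, bounds=bounds,
               constraints=cons)
print(res.x, total_variance(res.x))
```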

  4. A regularized variable selection procedure in additive hazards model with stratified case-cohort design.

    PubMed

    Ni, Ai; Cai, Jianwen

    2018-07-01

    Case-cohort designs are commonly used in large epidemiological studies to reduce the cost associated with covariate measurement. In many such studies the number of covariates is very large. An efficient variable selection method is needed for case-cohort studies where the covariates are only observed in a subset of the sample. Current literature on this topic has been focused on the proportional hazards model. However, in many studies the additive hazards model is preferred over the proportional hazards model either because the proportional hazards assumption is violated or the additive hazards model provides more relevant information to the research question. Motivated by one such study, the Atherosclerosis Risk in Communities (ARIC) study, we investigate the properties of a regularized variable selection procedure in a stratified case-cohort design under an additive hazards model with a diverging number of parameters. We establish the consistency and asymptotic normality of the penalized estimator and prove its oracle property. Simulation studies are conducted to assess the finite sample performance of the proposed method with a modified cross-validation tuning-parameter selection method. We apply the variable selection procedure to the ARIC study to demonstrate its practical use.

  5. A Modular GIS-Based Software Architecture for Model Parameter Estimation using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.

    2012-12-01

    The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open source nature of the project will engender the development of additional model drivers by third-party scientists.

  6. Scientific Writing: A Blended Instructional Model

    ERIC Educational Resources Information Center

    Clark, MaryAnn; Olson, Valerie

    2010-01-01

    Scientific writing is composed of a unique skill set and corresponding instructional strategies are critical to foster learning. In an age of technology, the blended instructional model provides the instrumental format for student mastery of the scientific writing competencies. In addition, the course management program affords opportunities for…

  7. NB-PLC channel modelling with cyclostationary noise addition & OFDM implementation for smart grid

    NASA Astrophysics Data System (ADS)

    Thomas, Togis; Gupta, K. K.

    2016-03-01

    Power line communication (PLC) technology can be a viable solution for future ubiquitous networks because it provides a cheaper alternative to other wired technologies currently used for communication. In the smart grid, power line communication is used to support low-rate communication on the low voltage (LV) distribution network. In this paper, we propose the channel modelling of narrowband (NB) PLC in the frequency range 5 kHz to 500 kHz by using ABCD parameters with cyclostationary noise addition. Behaviour of the channel was studied by the addition of an 11 kV/230 V transformer and by varying the load and its location. Bit error rate (BER) versus signal-to-noise ratio (SNR) was plotted for the proposed model by employing OFDM. Our simulation results based on the proposed channel model show acceptable performance in terms of bit error rate versus signal-to-noise ratio, which enables the communication required for smart grid applications.
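
    The ABCD-parameter method chains a 2×2 matrix per line section or branch; with source impedance Zs and load ZL, the end-to-end transfer function is H = ZL / (A·ZL + B + C·Zs·ZL + D·Zs). A simplified single-frequency sketch (hypothetical line constants, no cyclostationary noise model):

```python
import numpy as np

def line_section(gamma, Zc, length):
    """ABCD matrix of a uniform transmission-line section."""
    g = gamma * length
    return np.array([[np.cosh(g),      Zc * np.sinh(g)],
                     [np.sinh(g) / Zc, np.cosh(g)]])

def shunt_branch(Z_branch):
    """ABCD matrix of a shunt (bridged-tap) load on the line."""
    return np.array([[1.0, 0.0], [1.0 / Z_branch, 1.0]])

def transfer_function(sections, Zs, ZL):
    """H = ZL / (A*ZL + B + C*Zs*ZL + D*Zs) for a matrix cascade."""
    M = np.eye(2, dtype=complex)
    for sec in sections:
        M = M @ sec
    A, B = M[0]
    C, D = M[1]
    return ZL / (A * ZL + B + C * Zs * ZL + D * Zs)

# Hypothetical values at one frequency: propagation constant and
# characteristic impedance of the LV cable
gamma = 1e-5 + 1e-4j     # per metre
Zc = 50 + 5j             # ohms
cascade = [line_section(gamma, Zc, 300.0),
           shunt_branch(200 + 50j),      # e.g. a transformer or load tap
           line_section(gamma, Zc, 150.0)]
print(abs(transfer_function(cascade, Zs=50.0, ZL=50.0)))
```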

  8. Detecting and Quantifying Paleoseasonality in Stalagmites using Geochemical and Modelling Approaches

    NASA Astrophysics Data System (ADS)

    Baldini, J. U. L.

    2017-12-01

    Stalagmites are now well established sources of terrestrial paleoclimate information, providing insights into climate change on a variety of timescales. One of the most exciting aspects of stalagmites as climate archives is their ability to provide information regarding seasonality, a notoriously difficult component of climate change to characterise. However, stalagmite geochemistry may reflect not only the most apparent seasonal signal in external climate parameters, but also cave-specific signals such as seasonal changes in cave air carbon dioxide concentrations, sudden shifts in ventilation, and stochastic hydrological processes. Additionally, analytical bias may dampen or completely obfuscate any paleoseasonality, highlighting the need for appropriate quantification of this issue using simple models. Evidence from stalagmites now suggests that a seasonal signal is extractable from many samples, and that this signal can provide an important extra dimension to paleoclimate interpretations. Additionally, lower resolution annual- to decadal-scale isotope ratio records may also reflect shifts in seasonality, but identifying these is often challenging. Integrating geochemical datasets with models and cave monitoring data can greatly increase the accuracy of climate reconstructions, and yield the most robust records.

  9. The ASTER Volcano Archive (AVA): High Spatial Resolution Global Monitoring of Volcanic Eruptions

    NASA Astrophysics Data System (ADS)

    Linick, J. P.; Pieri, D. C.; Davies, A. G.; Reath, K.; Mars, J. C.; Hubbard, B. E.; Sanchez, R. M.; Tan, H. L.

    2017-12-01

    The ASTER Volcano Archive (AVA) is a data system focused on collecting and cataloguing higher level remote sensing data products for all Holocene volcanoes over the last several decades, producing volcanogenic science products for global detection, mapping, and modeling of effusive eruptions at high spatial resolution, and providing rapid bulk dissemination of relevant data products to the science community at large. Space-based optical platforms such as ASTER, EO-1, and Landsat, are a critical component for global monitoring systems to provide the capability for volcanic hazard assessment and modeling, and are a vital addition to in-situ measurements. The AVA leverages these instruments for the automated generation of lava flow emplacement maps, sulfur dioxide monitoring, thermal anomaly detection, and modeling of integrated thermal emission across the world's volcanoes. Additionally, we provide slope classified alteration and lahar inundation maps with potential inundation zones for certain relevant volcanoes. We explore the AVA's data product retrieval API, and describe how scientists can rapidly retrieve bulk products using the AVA platform with a focus on practical applications for both general analysis and hazard response.

  10. 3D voxel modelling of the marine subsurface: the Belgian Continental Shelf case

    NASA Astrophysics Data System (ADS)

    Hademenos, Vasileios; Kint, Lars; Missiaen, Tine; Stafleu, Jan; Van Lancker, Vera

    2017-04-01

    The need for marine space grows bigger by the year. Dredging, wind farms, aggregate extraction and many other activities take up more space than ever before. As a result, an accurate model that describes the properties of the areas in use is a priority. To address this need, a 3D voxel model of the subsurface of the Belgian part of the North Sea has been created in the scope of the Belgian Science Policy project TILES ('Transnational and Integrated Long-term Marine Exploitation Strategies'). Since borehole data in the marine environment are costly to acquire and therefore relatively scarce, seismic data have been incorporated in order to improve the data coverage. Lithostratigraphic units have been defined and lithoclasses are attributed to the voxels using stochastic interpolation. As a result, each voxel contains a unique value of one of 7 lithological classes (spanning in grain size from clay to gravel) in association with the geological layer it belongs to. In addition, other forms of interpolation, such as sequential indicator simulation, have allowed us to calculate the probability of occurrence of each lithoclass, thus providing additional information from which the uncertainty of the model can be derived. The resulting 3D voxel model gives a detailed image of the distribution of different sediment types and provides valuable insight into the different geological settings. The voxel model also allows estimation of resource volumes (e.g. the availability of particular sand classes), enabling a more targeted exploitation. The primary information of the model is related to geology, but the model can additionally host any type of information.

  11. Utility and Scope of Rapid Prototyping in Patients with Complex Muscular Ventricular Septal Defects or Double-Outlet Right Ventricle: Does it Alter Management Decisions?

    PubMed

    Bhatla, Puneet; Tretter, Justin T; Ludomirsky, Achi; Argilla, Michael; Latson, Larry A; Chakravarti, Sujata; Barker, Piers C; Yoo, Shi-Joon; McElhinney, Doff B; Wake, Nicole; Mosca, Ralph S

    2017-01-01

    Rapid prototyping facilitates comprehension of complex cardiac anatomy. However, determining when this additional information proves instrumental in patient management remains a challenge. We describe our experience with patient-specific anatomic models created using rapid prototyping from various imaging modalities, suggesting their utility in surgical and interventional planning in congenital heart disease (CHD). Virtual and physical 3-dimensional (3D) models were generated from CT or MRI data, using commercially available software for patients with complex muscular ventricular septal defects (CMVSD) and double-outlet right ventricle (DORV). Six patients with complex anatomy and uncertainty of the optimal management strategy were included in this study. The models were subsequently used to guide management decisions, and the outcomes reviewed. 3D models clearly demonstrated the complex intra-cardiac anatomy in all six patients and were utilized to guide management decisions. In the three patients with CMVSD, one underwent successful endovascular device closure following a prior failed attempt at transcatheter closure, and the other two underwent successful primary surgical closure with the aid of 3D models. In all three cases of DORV, the models provided better anatomic delineation and additional information that altered or confirmed the surgical plan. Patient-specific 3D heart models show promise in accurately defining intra-cardiac anatomy in CHD, specifically CMVSD and DORV. We believe these models improve understanding of the complex anatomical spatial relationships in these defects and provide additional insight for pre/intra-interventional management and surgical planning.

  12. Provider-Independent Use of the Cloud

    NASA Astrophysics Data System (ADS)

    Harmer, Terence; Wright, Peter; Cunningham, Christina; Perrott, Ron

    Utility computing offers researchers and businesses the potential of significant cost-savings, making it possible for them to match the cost of their computing and storage to their demand for such resources. A utility compute provider enables the purchase of compute infrastructures on-demand; when a user requires computing resources a provider will provision a resource for them and charge them only for their period of use of that resource. There has been a significant growth in the number of cloud computing resource providers and each has a different resource usage model, application process and application programming interface (API); developing generic multi-resource provider applications is thus difficult and time consuming. We have developed an abstraction layer that provides a single resource usage model, user authentication model and API for compute providers that enables cloud-provider neutral applications to be developed. In this paper we outline the issues in using external resource providers, give examples of using a number of the most popular cloud providers and provide examples of developing provider neutral applications. In addition, we discuss the development of the API to create a generic provisioning model based on a common architecture for cloud computing providers.

  13. Structured functional additive regression in reproducing kernel Hilbert spaces

    PubMed Central

    Zhu, Hongxiao; Yao, Fang; Zhang, Hao Helen

    2013-01-01

    Functional additive models (FAMs) provide a flexible yet simple framework for regressions involving functional predictors. The utilization of a data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting nonlinear additive components has been less studied. In this work, we propose a new regularization framework for structure estimation in the context of Reproducing Kernel Hilbert Spaces. The proposed approach takes advantage of functional principal components, which greatly facilitates implementation and theoretical analysis. The selection and estimation are achieved by penalized least squares using a penalty which encourages the sparse structure of the additive components. Theoretical properties such as the rate of convergence are investigated. The empirical performance is demonstrated through simulation studies and a real data application. PMID:25013362

  14. Graphic comparison of reserve-growth models for conventional oil and accumulation

    USGS Publications Warehouse

    Klett, T.R.

    2003-01-01

    The U.S. Geological Survey (USGS) periodically assesses crude oil, natural gas, and natural gas liquids resources of the world. The assessment procedure requires estimated recoverable oil and natural gas volumes (field size, cumulative production plus remaining reserves) in discovered fields. Because initial reserves are typically conservative, subsequent estimates increase through time as these fields are developed and produced. The USGS assessment of petroleum resources makes estimates, or forecasts, of the potential additions to reserves in discovered oil and gas fields resulting from field development, and it also estimates the potential fully developed sizes of undiscovered fields. The term "reserve growth" refers to the commonly observed upward adjustment of reserve estimates. Because such additions are related to increases in the total size of a field, the USGS uses field sizes to model reserve growth. Future reserve growth in existing fields is a major component of remaining U.S. oil and natural gas resources and has therefore become a necessary element of U.S. petroleum resource assessments. Past and currently proposed reserve-growth models compared herein aid in the selection of a suitable set of forecast functions to provide an estimate of potential additions to reserves from reserve growth in the ongoing National Oil and Gas Assessment Project (NOGA). Reserve growth is modeled by construction of a curve that represents annual fractional changes of recoverable oil and natural gas volumes (for fields and reservoirs), which provides growth factors. Growth factors are used to calculate forecast functions, which are sets of field- or reservoir-size multipliers. Comparisons of forecast functions were made based on datasets used to construct the models, field type, modeling method, and length of forecast span. Comparisons were also made between forecast functions based on field-level and reservoir-level growth, and between forecast functions based on older and newer data. The reserve-growth model used in the 1995 USGS National Assessment and the model currently used in the NOGA project provide forecast functions that yield similar estimates of potential additions to reserves. Both models are based on the Oil and Gas Integrated Field File from the Energy Information Administration (EIA), but different vintages of data (from 1977 through 1991 and 1977 through 1996, respectively). The model based on newer data can be used in place of the previous model, providing similar estimates of potential additions to reserves. Forecast functions for oil fields vary little from those for gas fields in these models; therefore, a single function may be used for both oil and gas fields, like that used in the USGS World Petroleum Assessment 2000. Forecast functions based on the field-level reserve growth model derived from the NRG Associates databases (from 1982 through 1998) differ from those derived from EIA databases (from 1977 through 1996). However, the difference may not be enough to preclude the use of the forecast functions derived from NRG data in place of the forecast functions derived from EIA data. Should the model derived from NRG data be used, separate forecast functions for oil fields and gas fields must be employed. The forecast function for oil fields from the model derived from NRG data varies significantly from that for gas fields, and a single function for both oil and gas fields may not be appropriate.
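
    The arithmetic of a growth-factor forecast is straightforward: annual fractional changes are chained into cumulative multipliers applied to a field's current estimated size. A schematic sketch with made-up growth fractions (not values from the EIA or NRG datasets):

```python
def forecast_field_size(current_size, years_since_discovery,
                        annual_growth_fractions, horizon):
    """Apply chained annual growth factors (reserve growth) to a field's
    current recoverable-volume estimate. annual_growth_fractions[i] is
    the fractional change in year i after discovery (made-up numbers)."""
    size = current_size
    for year in range(years_since_discovery,
                      min(horizon, len(annual_growth_fractions))):
        size *= 1.0 + annual_growth_fractions[year]
    return size

# Hypothetical declining schedule: fast early growth, then tapering
growth = [0.20, 0.12, 0.08, 0.05, 0.04, 0.03, 0.02, 0.02, 0.01, 0.01]
print(forecast_field_size(100.0, 3, growth, horizon=10))
```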

  15. Econometric models for predicting confusion crop ratios

    NASA Technical Reports Server (NTRS)

    Umberger, D. E.; Proctor, M. H.; Clark, J. E.; Eisgruber, L. M.; Braschler, C. B. (Principal Investigator)

    1979-01-01

    Results for both the United States and Canada show that econometric models can provide estimates of confusion crop ratios that are more accurate than historical ratios. Whether these models can support the LACIE 90/90 accuracy criterion is uncertain. In the United States, experimenting with additional model formulations could provide improved models in some CRDs, particularly in winter wheat. Improved models may also be possible for the Canadian CDs. The more aggregated province/state models outperformed individual CD/CRD models. This result was expected, partly because acreage statistics are based on sampling procedures, and the sampling precision declines from the province/state to the CD/CRD level. Declining sampling precision and the need to substitute province/state data for the CD/CRD data introduced measurement error into the CD/CRD models.

  16. A Transport Equation Approach to Modeling the Influence of Surface Roughness on Boundary Layer Transition

    NASA Astrophysics Data System (ADS)

    Langel, Christopher Michael

    A computational investigation has been performed to better understand the impact of surface roughness on the flow over a contaminated surface. This thesis highlights the implementation and development of the roughness amplification model in the flow solver OVERFLOW-2. The model, originally proposed by Dassler, Kozulovic, and Fiala, introduces an additional scalar field roughness amplification quantity. This value is explicitly set at rough wall boundaries using surface roughness parameters and local flow quantities. This additional transport equation allows non-local effects of surface roughness to be accounted for downstream of rough sections. This roughness amplification variable is coupled with the Langtry-Menter model and used to modify the criteria for transition. Results from flat plate test cases show good agreement with experimental transition behavior for flow over varying sand-grain roughness heights. Additional validation studies were performed on a NACA 0012 airfoil with leading edge roughness. The computationally predicted boundary layer development demonstrates good agreement with experimental results. New tests using varying roughness configurations are being carried out at the Texas A&M Oran W. Nicks Low Speed Wind Tunnel to provide further calibration of the roughness amplification method. An overview and preliminary results of this concurrent experimental investigation are provided.

  17. Predictive computation of genomic logic processing functions in embryonic development

    PubMed Central

    Peter, Isabelle S.; Faure, Emmanuel; Davidson, Eric H.

    2012-01-01

    Gene regulatory networks (GRNs) control the dynamic spatial patterns of regulatory gene expression in development. Thus, in principle, GRN models may provide system-level, causal explanations of developmental process. To test this assertion, we have transformed a relatively well-established GRN model into a predictive, dynamic Boolean computational model. This Boolean model computes spatial and temporal gene expression according to the regulatory logic and gene interactions specified in a GRN model for embryonic development in the sea urchin. Additional information input into the model included the progressive embryonic geometry and gene expression kinetics. The resulting model predicted gene expression patterns for a large number of individual regulatory genes each hour up to gastrulation (30 h) in four different spatial domains of the embryo. Direct comparison with experimental observations showed that the model predictively computed these patterns with remarkable spatial and temporal accuracy. In addition, we used this model to carry out in silico perturbations of regulatory functions and of embryonic spatial organization. The model computationally reproduced the altered developmental functions observed experimentally. Two major conclusions are that the starting GRN model contains sufficiently complete regulatory information to permit explanation of a complex developmental process of gene expression solely in terms of genomic regulatory code, and that the Boolean model provides a tool with which to test in silico regulatory circuitry and developmental perturbations. PMID:22927416
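
    To make the Boolean formalism concrete, the sketch below shows synchronous Boolean updating on a toy three-gene circuit; the logic rules are invented for illustration and are not the sea urchin GRN.

```python
def step(state):
    """One synchronous update of a toy 3-gene Boolean circuit.
    state maps gene name -> bool. Rules are illustrative only."""
    return {
        'geneA': not state['geneC'],                  # repressed by C
        'geneB': state['geneA'],                      # activated by A
        'geneC': state['geneA'] and state['geneB'],   # requires A AND B
    }

state = {'geneA': True, 'geneB': False, 'geneC': False}
for t in range(6):
    print(t, state)
    state = step(state)   # spatial domains would each carry such a state
```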

  18. The Smoothed Dirichlet Distribution: Understanding Cross-Entropy Ranking in Information Retrieval

    DTIC Science & Technology

    2006-07-01

    Unigram language modeling is a successful probabilistic framework for Information Retrieval (IR) that uses... the Relevance model (RM), a state-of-the-art model for IR in the language modeling framework that uses the same cross-entropy as its ranking function... In addition, the SD based classifier provides more flexibility than RM in modeling documents owing to a consistent generative framework.

  19. FPGA implementation of a biological neural network based on the Hodgkin-Huxley neuron model

    PubMed Central

    Yaghini Bonabi, Safa; Asgharian, Hassan; Safari, Saeed; Nili Ahmadabadi, Majid

    2014-01-01

    A set of techniques for efficient implementation of a Hodgkin-Huxley-based (H-H) model of a neural network on an FPGA (Field Programmable Gate Array) is presented. The central implementation challenge is H-H model complexity, which puts limits on the network size and on the execution speed. However, the basics of the original model cannot be compromised when the effect of synaptic specifications on network behavior is the subject of study. To solve the problem, we used computational techniques such as the CORDIC (Coordinate Rotation Digital Computer) algorithm and step-by-step integration in the implementation of arithmetic circuits. In addition, we employed techniques such as resource sharing to preserve the details of the model and to increase the network size while keeping the network execution speed close to real time with high precision. An implementation of a two mini-column network with 120/30 excitatory/inhibitory neurons is provided to investigate the characteristics of our method in practice. The implementation techniques provide an opportunity to construct large FPGA-based network models to investigate the effect of different neurophysiological mechanisms, like voltage-gated channels and synaptic activities, on the behavior of a neural network in an appropriate execution time. In addition to inherent properties of FPGAs, like parallelism and re-configurability, our approach makes the FPGA-based system a proper candidate for studies on neural control of cognitive robots and systems as well. PMID:25484854
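
    The CORDIC algorithm cited above evaluates trigonometric functions using only shift-and-add operations, which is what makes it attractive for FPGA arithmetic circuits. A software sketch of rotation-mode CORDIC (floating point for readability; a hardware version would use fixed-point shifts):

```python
import math

def cordic_sin_cos(theta, n_iter=24):
    """Rotation-mode CORDIC: returns (cos(theta), sin(theta)) for
    |theta| <= ~1.74 rad using only add/subtract and scaling by 2^-i
    (the scaling stands in for hardware bit shifts)."""
    angles = [math.atan(2.0 ** -i) for i in range(n_iter)]
    K = 1.0
    for i in range(n_iter):
        K *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))   # gain correction
    x, y, z = K, 0.0, theta
    for i in range(n_iter):
        d = 1.0 if z >= 0 else -1.0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x, y

print(cordic_sin_cos(math.pi / 6))   # ~ (0.8660, 0.5000)
```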

  20. Accounting for dominance to improve genomic evaluations of dairy cows for fertility and milk production traits.

    PubMed

    Aliloo, Hassan; Pryce, Jennie E; González-Recio, Oscar; Cocks, Benjamin G; Hayes, Ben J

    2016-02-01

    Dominance effects may contribute to genetic variation of complex traits in dairy cattle, especially for traits closely related to fitness such as fertility. However, traditional genetic evaluations generally ignore dominance effects and consider additive genetic effects only. Availability of dense single nucleotide polymorphisms (SNPs) panels provides the opportunity to investigate the role of dominance in quantitative variation of complex traits at both the SNP and animal levels. Including dominance effects in the genomic evaluation of animals could also help to increase the accuracy of prediction of future phenotypes. In this study, we estimated additive and dominance variance components for fertility and milk production traits of genotyped Holstein and Jersey cows in Australia. The predictive abilities of a model that accounts for additive effects only (additive), and a model that accounts for both additive and dominance effects (additive + dominance) were compared in a fivefold cross-validation. Estimates of the proportion of dominance variation relative to phenotypic variation that is captured by SNPs, for production traits, were up to 3.8 and 7.1 % in Holstein and Jersey cows, respectively, whereas, for fertility, they were equal to 1.2 % in Holstein and very close to zero in Jersey cows. We found that including dominance in the model was not consistently advantageous. Based on maximum likelihood ratio tests, the additive + dominance model fitted the data better than the additive model, for milk, fat and protein yields in both breeds. However, regarding the prediction of phenotypes assessed with fivefold cross-validation, including dominance effects in the model improved accuracy only for fat yield in Holstein cows. Regression coefficients of phenotypes on genetic values and mean squared errors of predictions showed that the predictive ability of the additive + dominance model was superior to that of the additive model for some of the traits. In both breeds, dominance effects were significant (P < 0.01) for all milk production traits but not for fertility. Accuracy of prediction of phenotypes was slightly increased by including dominance effects in the genomic evaluation model. Thus, it can help to better identify highly performing individuals and be useful for culling decisions.
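
    Such additive-plus-dominance genomic evaluations are commonly fit with separate additive (G) and dominance (D) relationship matrices built from SNP genotypes. The sketch below shows one standard parameterization of those matrices on toy genotypes; it illustrates the general approach, not the study's exact model or software.

```python
import numpy as np

def relationship_matrices(M):
    """Build additive (G) and dominance (D) genomic relationship
    matrices from an individuals x SNPs genotype matrix coded 0/1/2
    (count of the reference allele). One common parameterization;
    details vary across studies."""
    p = M.mean(axis=0) / 2.0            # reference-allele frequencies
    q = 1.0 - p
    Z = M - 2.0 * p                     # centred additive covariates
    G = Z @ Z.T / np.sum(2.0 * p * q)
    # Dominance covariates: -2q^2, 2pq, -2p^2 for genotypes 2, 1, 0
    W = np.where(M == 2, -2.0 * q ** 2,
        np.where(M == 1, 2.0 * p * q, -2.0 * p ** 2))
    D = W @ W.T / np.sum((2.0 * p * q) ** 2)
    return G, D

rng = np.random.default_rng(0)
M = rng.integers(0, 3, size=(6, 200)).astype(float)   # toy genotypes
G, D = relationship_matrices(M)
print(G.shape, D.shape, np.diag(G).mean(), np.diag(D).mean())
```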

  1. Estimating abundance in the presence of species uncertainty

    USGS Publications Warehouse

    Chambert, Thierry A.; Hossack, Blake R.; Fishback, LeeAnn; Davenport, Jon M.

    2016-01-01

    1. N-mixture models have become a popular method for estimating abundance of free-ranging animals that are not marked or identified individually. These models have been used on count data for single species that can be identified with certainty. However, co-occurring species often look similar during one or more life stages, making it difficult to assign species for all recorded captures. This uncertainty creates problems for estimating species-specific abundance and it can often limit life stages to which we can make inference. 2. We present a new extension of N-mixture models that accounts for species uncertainty. In addition to estimating site-specific abundances and detection probabilities, this model allows estimating the probability of correct assignment of species identity. We implement this hierarchical model in a Bayesian framework and provide all code for running the model in BUGS-language programs. 3. We present an application of the model on count data from two sympatric freshwater fishes, the brook stickleback (Culaea inconstans) and the ninespine stickleback (Pungitius pungitius), and illustrate implementation of covariate effects (habitat characteristics). In addition, we used a simulation study to validate the model and illustrate potential sample size issues. We also compared, for both real and simulated data, estimates provided by our model to those obtained by a simple N-mixture model when captures of unknown species identification were discarded. In the latter case, abundance estimates appeared highly biased and very imprecise, while our new model provided unbiased estimates with higher precision. 4. This extension of the N-mixture model should be useful for a wide variety of studies and taxa, as species uncertainty is a common issue. It should notably help improve investigation of abundance and vital rate characteristics of organisms' early life stages, which are sometimes more difficult to identify than adults.
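
    The heart of an N-mixture model is a marginal likelihood that sums a binomial observation model over the unobserved abundance N; the species-uncertainty extension adds a classification probability to the observation process. The sketch below collapses that idea to a single site and a single effective probability p·θ, a deliberate simplification of the paper's hierarchical Bayesian model:

```python
import numpy as np
from scipy.stats import poisson, binom

def nmixture_loglik(counts, lam, p, theta, K=200):
    """Marginal log-likelihood of repeated counts y_j at one site:
    N ~ Poisson(lam); y_j | N ~ Binomial(N, p*theta), where p is the
    detection probability and theta the probability an individual is
    correctly assigned to the species (a simplification of the full
    model). The latent abundance N is summed out up to truncation K."""
    pe = p * theta
    Ns = np.arange(max(counts), K + 1)
    log_prior = poisson.logpmf(Ns, lam)
    log_obs = sum(binom.logpmf(y, Ns, pe) for y in counts)
    return np.log(np.exp(log_prior + log_obs).sum())

print(nmixture_loglik([12, 9, 14], lam=30.0, p=0.5, theta=0.9))
```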

  2. Simulating Surface Oil Transport During the Deepwater Horizon Oil Spill: Experiments with the BioCast System

    DTIC Science & Technology

    2014-01-25

    ...ocean surface materials. The Deepwater Horizon oil spill in the Gulf of Mexico provided a test case for the Bio-Optical Forecasting (BioCast) system... The addition of explicit sources and sinks of surface oil concentrations provides a framework for increasingly complex oil spill modeling efforts that extend...

  3. Review of a model to assess stranding of juvenile salmon by ship wakes along the Lower Columbia River, Oregon and Washington

    USGS Publications Warehouse

    Kock, Tobias J.; Plumb, John M.; Adams, Noah S.

    2013-01-01

    Long-period wake waves from deep draft vessels have been shown to strand small fish, particularly juvenile Chinook salmon Oncorhynchus tshawytscha, in the lower Columbia River (LCR). The U.S. Army Corps of Engineers is responsible for maintaining the shipping channel in the LCR and recently conducted dredging operations to deepen the shipping channel from an authorized depth of 40 feet (ft) to an authorized depth of 43 ft (in areas where rapid shoaling was expected, dredging operations were used to increase the channel depth to 48 ft). A model was developed to estimate stranding probabilities for juvenile salmon under the 40- and 43-ft channel scenarios, to determine if channel deepening was going to affect wake stranding (Assessment of potential stranding of juvenile salmon by ship wakes along the Lower Columbia River under scenarios of ship traffic and channel depth: Report prepared for the Portland District U.S. Army Corps of Engineers, Portland, Oregon). The U.S. Army Corps of Engineers funded the U.S. Geological Survey to review this model. A total of 30 review questions were provided to guide the review process, and these questions are addressed in this report. In general, we determined that the analyses by Pearson (2011) were appropriate given the data available. We did identify two areas where additional information could have been provided: (1) a more thorough description of model diagnostics and model selection would have been useful for the reader to better understand the model framework; and (2) model uncertainty should have been explicitly described and reported in the document. Stranding probability estimates between the 40- and 43-ft channel depths were minimally different under most of the scenarios that were examined by Pearson (2011), and a discussion of the effects of uncertainty given these minimal differences would have been useful. Ultimately, however, a stochastic (or simulation) model would provide the best opportunity to illustrate uncertainty within a given set of model predictions, but such an approach would require a substantial amount of additional data collection. Several review questions focused on the accuracy and precision of the model estimates, but we were unable to address these questions because of the limited data that currently exists regarding wake stranding in the LCR. Additional field studies will be required to validate findings from Pearson (2011), if concerns regarding accuracy and precision remain a priority. Although the Pearson (2011) model provided a useful examination of stranding under pre-construction and post-construction conditions, future research will be required to better understand the effects of wake stranding on juvenile salmonids throughout the entire LCR. If additional information on wake stranding is desired in the future, the following topics may be of interest: (1) spatial examination of wake stranding throughout the entire LCR; (2) additional evaluation of juvenile salmonid behavior and population dynamics; (3) assessing and integrating predicted changes in ship development; and (4) assessing and integrating predicted changes in climate on environmental factors known to cause stranding.

  4. SBML Level 3 package: Groups, Version 1 Release 1

    PubMed Central

    Hucka, Michael; Smith, Lucian P.

    2017-01-01

    Biological models often contain components that have relationships with each other, or that modelers want to treat as belonging to groups with common characteristics or shared metadata. The SBML Level 3 Version 1 Core specification does not provide an explicit mechanism for expressing such relationships, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Groups package for SBML Level 3 adds the necessary features to SBML to allow grouping of model components to be expressed. Such groups do not affect the mathematical interpretation of a model, but they do provide a way to add information that can be useful for modelers and software tools. The SBML Groups package enables a modeler to include definitions of groups and nested groups, each of which may be annotated to convey why that group was created, and what it represents. PMID:28187406

  5. Global sensitivity analysis of a dynamic agroecosystem model under different irrigation treatments

    USDA-ARS?s Scientific Manuscript database

    Savings in consumptive use through limited or deficit irrigation in agriculture has become an increasingly viable source of additional water for places with high population growth such as the Colorado Front Range, USA. Crop models provide a mechanism to evaluate various management methods without pe...

  6. Model-Driven Development for PDS4 Software and Services

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Algermissen, S. S.; Cayanan, M. D.; Joyner, R. S.; Hardman, S. H.; Padams, J. H.

    2018-04-01

    PDS4 data product labels provide the information necessary for processing the referenced digital object. However, significantly more information is available in the PDS4 Information Model. This additional information is made available for use by both software and services to configure, promote resiliency, and improve interoperability.

  7. Evaluation of the Bess TRS-CA Using the Rasch Rating Scale Model

    ERIC Educational Resources Information Center

    DiStefano, Christine; Morgan, Grant B.

    2010-01-01

    This study examined the Behavioral and Emotional Screening System Teacher Rating System for Children and Adolescents (BESS TRS-CA; Kamphaus & Reynolds, 2007) screener using Rasch Rating Scale model (RSM) methodology to provide additional information about psychometric properties of items. Data from the Behavioral Assessment System for Children…

  8. Cognitive Support During High-Consequence Episodes of Care in Cardiovascular Surgery.

    PubMed

    Conboy, Heather M; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Christov, Stefan C; Goldman, Julian M; Yule, Steven J; Zenati, Marco A

    2017-03-01

    Despite significant efforts to reduce preventable adverse events in medical processes, such events continue to occur at unacceptable rates. This paper describes a computer science approach that uses formal process modeling to provide situationally aware monitoring and management support to medical professionals performing complex processes. These process models represent both normative and non-normative situations, and are validated by rigorous automated techniques such as model checking and fault tree analysis, in addition to careful review by experts. Context-aware Smart Checklists are then generated from the models, providing cognitive support during high-consequence surgical episodes. The approach is illustrated with a case study in cardiovascular surgery.

  9. Enhanced di-Higgs boson production in the complex Higgs singlet model

    DOE PAGES

    Dawson, S.; Sullivan, M.

    2018-01-31

    Here, we consider the standard model (SM) extended by the addition of a complex scalar singlet, with no assumptions about additional symmetries of the potential. This model provides for resonant di-Higgs production of Higgs particles with different masses. We demonstrate that regions of parameter space allowed by precision electroweak measurements, experimental limits on single Higgs production, and perturbative unitarity allow for large di-Higgs production rates relative to the SM rates. In this scenario, the dominant production mechanism of the new scalar states is di-Higgs production. Results are presented for √s = 13, 27 and 100 TeV.

  11. Building out a Measurement Model to Incorporate Complexities of Testing in the Language Domain

    ERIC Educational Resources Information Center

    Wilson, Mark; Moore, Stephen

    2011-01-01

    This paper provides a summary of a novel and integrated way to think about the item response models (most often used in measurement applications in social science areas such as psychology, education, and especially testing of various kinds) from the viewpoint of the statistical theory of generalized linear and nonlinear mixed models. In addition,…

  12. Turbulence modeling for hypersonic flight

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.

    1993-01-01

    The objective of the proposed work is to continue to develop, verify, and incorporate the baseline two-equation turbulence models, which account for the effects of compressibility at high speeds, into a three-dimensional Reynolds averaged Navier-Stokes (RANS) code. Additionally, we plan to provide documented descriptions of the models and their numerical procedures so that they can be implemented into the NASP CFD codes.

  13. A corrected formulation for marginal inference derived from two-part mixed models for longitudinal semi-continuous data

    PubMed Central

    Su, Li; Farewell, Vernon T

    2013-01-01

    For semi-continuous data which are a mixture of true zeros and continuously distributed positive values, the use of two-part mixed models provides a convenient modelling framework. However, deriving population-averaged (marginal) effects from such models is not always straightforward. Su et al. presented a model that provided convenient estimation of marginal effects for the logistic component of the two-part model but the specification of marginal effects for the continuous part of the model presented in that paper was based on an incorrect formulation. We present a corrected formulation and additionally explore the use of the two-part model for inferences on the overall marginal mean, which may be of more practical relevance in our application and more generally. PMID:24201470
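
    To see why marginal inference from a two-part mixed model is not straightforward, note that with a shared random intercept b both parts must be integrated over its distribution: E[Y] = E_b[Pr(Y>0|b)·E[Y|Y>0,b]]. The sketch below performs that integration by Gauss-Hermite quadrature for a logistic/log-normal specification with illustrative parameter values; it is a simplified stand-in, not the paper's corrected formulation.

```python
import numpy as np

def marginal_mean(beta1, beta2, tau, sigma_e, n_nodes=40):
    """Overall marginal mean of a semi-continuous outcome from a
    two-part model with a shared random intercept b ~ N(0, tau^2):
      Pr(Y>0 | b)    = expit(beta1 + b)
      E[Y | Y>0, b]  = exp(beta2 + b + sigma_e^2 / 2)  (log-normal part)
    E[Y] = E_b[Pr(Y>0|b) * E[Y|Y>0,b]], via Gauss-Hermite quadrature.
    Parameter values used below are illustrative only."""
    x, w = np.polynomial.hermite.hermgauss(n_nodes)
    b = np.sqrt(2.0) * tau * x                 # change of variables
    expit = 1.0 / (1.0 + np.exp(-(beta1 + b)))
    cond_mean = np.exp(beta2 + b + sigma_e ** 2 / 2.0)
    return np.sum(w / np.sqrt(np.pi) * expit * cond_mean)

print(marginal_mean(beta1=0.5, beta2=1.0, tau=0.8, sigma_e=0.6))
```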

  14. Tularosa Basin Play Fairway Analysis Data and Models

    DOE Data Explorer

    Nash, Greg

    2017-07-11

    This submission includes raster datasets for each layer of evidence used for weights of evidence analysis as well as the deterministic play fairway analysis (PFA). Data representative of heat, permeability and groundwater comprises some of the raster datasets. Additionally, the final deterministic PFA model is provided along with a certainty model. All of these datasets are best used with an ArcGIS software package, specifically Spatial Data Modeler.

  15. Health and economic impact of PHiD-CV in Canada and the UK: a Markov modelling exercise.

    PubMed

    Knerer, Gerhart; Ismaila, Afisi; Pearce, David

    2012-01-01

    The spectrum of diseases caused by Streptococcus pneumoniae and non-typeable Haemophilus influenzae (NTHi) represents a large burden on healthcare systems around the world. Meningitis, bacteraemia, community-acquired pneumonia (CAP), and acute otitis media (AOM) are vaccine-preventable infectious diseases that can have severe consequences. The health economic model presented here is intended to estimate the clinical and economic impact of vaccinating birth cohorts in Canada and the UK with the 10-valent, pneumococcal non-typeable Haemophilus influenzae protein D conjugate vaccine (PHiD-CV) compared with the newly licensed 13-valent pneumococcal conjugate vaccine (PCV-13). The model described herein is a Markov cohort model built to simulate the epidemiological burden of pneumococcal- and NTHi-related diseases within birth cohorts in the UK and Canada. Base-case assumptions include estimates of vaccine efficacy and NTHi infection rates that are based on published literature. The model predicts that the two vaccines will provide a broadly similar impact on all-cause invasive disease and CAP under base-case assumptions. However, PHiD-CV is expected to provide a substantially greater reduction in AOM compared with PCV-13, offering additional savings of Canadian $9.0 million and £4.9 million in discounted direct medical costs in Canada and the UK, respectively. The main limitations of the study are the difficulties in modelling indirect vaccine effects (herd effect and serotype replacement), the absence of PHiD-CV- and PCV-13-specific efficacy data and a lack of comprehensive NTHi surveillance data. Additional limitations relate to the fact that the transmission dynamics of pneumococcal serotypes have not been modelled, nor has antibiotic resistance been accounted for in this paper. This cost-effectiveness analysis suggests that, in Canada and the UK, PHiD-CV's potential to protect against NTHi infections could provide a greater impact on overall disease burden than the additional serotypes contained in PCV-13.
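
    Mechanically, a Markov cohort model advances a state-occupancy vector by a transition matrix each cycle and accumulates discounted costs and outcomes. The sketch below is deliberately stripped down, with invented states, probabilities, and costs rather than the published model's inputs:

```python
import numpy as np

# Made-up health states and per-cycle transition probabilities
states = ["healthy", "AOM", "pneumonia", "dead"]
P = np.array([[0.90, 0.07, 0.02, 0.01],
              [0.85, 0.10, 0.04, 0.01],
              [0.80, 0.05, 0.13, 0.02],
              [0.00, 0.00, 0.00, 1.00]])   # death is absorbing
cost = np.array([0.0, 120.0, 900.0, 0.0])  # cost per cycle in each state
discount = 0.035                           # annual discount rate

cohort = np.array([1.0, 0.0, 0.0, 0.0])    # whole cohort starts healthy
total_cost = 0.0
for cycle in range(10):                    # ten one-year cycles
    total_cost += (cohort @ cost) / (1 + discount) ** cycle
    cohort = cohort @ P                    # advance one Markov cycle
print(round(total_cost, 2), cohort.round(3))
```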

  16. The role of the electrolyte in the selective dissolution of metal alloys

    NASA Astrophysics Data System (ADS)

    Policastro, Steven A.

    Dealloying plays an important role in several corrosion processes, including pitting corrosion, through the formation of local cathodes from the selective dissolution of intermetallic particles, and stress-corrosion cracking, in which it is responsible for injecting cracks from the surface into the undealloyed bulk material. Additionally, directed dealloying in the laboratory to form nanoporous structures has been the subject of much recent study because of the unique structural properties that the porous layer provides. In order to better understand the physical reasons for dealloying as well as the parameters that influence the evolution of the microstructure, several models have been proposed. Current theoretical descriptions of dealloying have been very successful in explaining some features of selective dissolution, but additional behaviors can be incorporated into the model to improve understanding of the dealloying process. In the present work, the effects of electrolyte component interactions, temperature, alloy cohesive energies, and applied potential on the development of nanoporosity via the selective dissolution of the less-noble component from binary and ternary alloys are considered. Both a kinetic Monte-Carlo (KMC) model of the behavior of the metal atoms and the electrolyte ions at the metal-solution interface and a phase-field model of ligament coarsening are developed. By adding these additional parameters into the KMC model, a rich set of behaviors is observed in the simulation results. From the simulation results, it is suggested that selectively dissolving a binary alloy in a very aggressive electrolyte that targeted the LN atoms could provide a porous microstructure that retained a higher concentration of the LN atoms in its ligaments and thus retain more of the mechanical properties of the bulk alloy. In addition, by adding even a small fraction of a third, noble component to form a ternary alloy, the dissolution kinetics of the least noble component can be dramatically altered, providing a means of controlling dealloying depth. Some molecular dynamics calculations are used to justify the assumptions of metal atom motion in the KMC model. A recently developed parameter-space exploration technique, COERCE, is employed to optimize the process of obtaining meaningful parameter values from the KMC simulation.

  17. Combat Wound Initiative program.

    PubMed

    Stojadinovic, Alexander; Elster, Eric; Potter, Benjamin K; Davis, Thomas A; Tadaki, Doug K; Brown, Trevor S; Ahlers, Stephen; Attinger, Christopher E; Andersen, Romney C; Burris, David; Centeno, Jose; Champion, Hunter; Crumbley, David R; Denobile, John; Duga, Michael; Dunne, James R; Eberhardt, John; Ennis, William J; Forsberg, Jonathan A; Hawksworth, Jason; Helling, Thomas S; Lazarus, Gerald S; Milner, Stephen M; Mullick, Florabel G; Owner, Christopher R; Pasquina, Paul F; Patel, Chirag R; Peoples, George E; Nissan, Aviram; Ring, Michael; Sandberg, Glenn D; Schaden, Wolfgang; Schultz, Gregory S; Scofield, Tom; Shawen, Scott B; Sheppard, Forest R; Stannard, James P; Weina, Peter J; Zenilman, Jonathan M

    2010-07-01

    The Combat Wound Initiative (CWI) program is a collaborative, multidisciplinary, and interservice public-private partnership that provides personalized, state-of-the-art, and complex wound care via targeted clinical and translational research. The CWI uses a bench-to-bedside approach to translational research, including the rapid development of a human extracorporeal shock wave therapy (ESWT) study in complex wounds after establishing the potential efficacy, biologic mechanisms, and safety of this treatment modality in a murine model. Additional clinical trials include the prospective use of clinical data, serum and wound biomarkers, and wound gene expression profiles to predict wound healing/failure and additional clinical patient outcomes following combat-related trauma. These clinical research data are analyzed using machine-based learning algorithms to develop predictive treatment models to guide clinical decision-making. Future CWI directions include additional clinical trials and study centers and the refinement and deployment of our genetically driven, personalized medicine initiative to provide patient-specific care across multiple medical disciplines, with an emphasis on combat casualty care.

  18. Evaluation of diffusion kurtosis imaging in ex vivo hypomyelinated mouse brains.

    PubMed

    Kelm, Nathaniel D; West, Kathryn L; Carson, Robert P; Gochberg, Daniel F; Ess, Kevin C; Does, Mark D

    2016-01-01

    Diffusion tensor imaging (DTI), diffusion kurtosis imaging (DKI), and DKI-derived white matter tract integrity metrics (WMTI) were experimentally evaluated ex vivo through comparisons to histological measurements and established magnetic resonance imaging (MRI) measures of myelin in two knockout mouse models with varying degrees of hypomyelination. DKI metrics of mean and radial kurtosis were found to be better indicators of myelin content than conventional DTI metrics. The biophysical WMTI model based on the DKI framework reported on axon water fraction with good accuracy in cases with near normal axon density, but did not provide additional specificity to myelination. Overall, DKI provided additional information regarding white matter microstructure compared with DTI, making it an attractive method for future assessments of white matter development and pathology.
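
    For reference alongside the kurtosis metrics discussed above, the conventional DTI scalars can be computed from the diffusion-tensor eigenvalues using textbook formulas; the sketch below uses illustrative eigenvalue magnitudes.

```python
# Standard DTI scalar metrics from diffusion-tensor eigenvalues.
# Eigenvalues here are illustrative values in units of mm^2/s.
import numpy as np

def dti_metrics(evals):
    l1, l2, l3 = np.sort(evals)[::-1]          # sort descending
    md = (l1 + l2 + l3) / 3.0                  # mean diffusivity
    ad = l1                                    # axial diffusivity
    rd = (l2 + l3) / 2.0                       # radial diffusivity
    fa = np.sqrt(1.5 * ((l1 - md)**2 + (l2 - md)**2 + (l3 - md)**2)
                 / (l1**2 + l2**2 + l3**2))    # fractional anisotropy
    return md, ad, rd, fa

print(dti_metrics([1.7e-3, 0.3e-3, 0.3e-3]))   # typical healthy white matter
```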

  19. Modeling of Powder Bed Manufacturing Defects

    NASA Astrophysics Data System (ADS)

    Mindt, H.-W.; Desmaison, O.; Megahed, M.; Peralta, A.; Neumann, J.

    2018-01-01

    Powder bed additive manufacturing offers unmatched capabilities. The deposition resolution achieved is extremely high, enabling the production of innovative functional products and materials. Achieving the desired final quality is, however, hampered by many potential defects that have to be managed in due course of the manufacturing process. Defects observed in products manufactured via powder bed fusion have been studied experimentally. In this effort we have relied on experiments reported in the literature and, when experimental data were not sufficient, we have performed additional experiments to provide an extended foundation for defect analysis. There is large interest in reducing the effort and cost of additive manufacturing process qualification and certification using integrated computational materials engineering. A prerequisite is, however, that numerical methods can indeed capture defects. A multiscale multiphysics platform is developed and applied to predict and explain the origin of several defects that have been observed experimentally during laser-based powder bed fusion processes. The models utilized are briefly introduced. The ability of the models to capture the observed defects is verified. The root cause of the defects is explained by analyzing the numerical results, thus confirming the ability of numerical methods to provide a foundation for rapid process qualification.

  20. Direct Visualization of Aggregate Morphology and Dynamics in a Model Soil Organic–Mineral System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hufschmid, Ryan; Newcomb, Christina J.; Grate, Jay W.

    Interactions between mineral surfaces and organic matter are ubiquitous in soils and the environment. Through both physical and chemical mechanisms, organic-mineral assemblages prevent decomposition of soil organic matter by limiting accessibility or reducing the efficacy of enzymes and microbes. To understand the mechanisms underlying organic-mineral interactions, researchers have begun to interrogate these systems at smaller length scales. Current techniques that maintain a hydrated state and allow researchers to characterize nanometer length scales are limited. Here we chose a model organic-mineral system and performed complementary imaging techniques that enable direct nanoscale observations in environmentally relevant conditions: cryogenic TEM and in-situ liquid cell TEM. We observed a three-fold increase in the aggregate size of goethite nanoparticles upon addition of a model organic phosphate ligand, and quantification of nanoparticle orientation reveals a preference for side-to-side interactions independent of the addition of an organic ligand. Additionally, in-situ liquid cell TEM experiments provide a dynamic view of the interactions, allowing us to report velocities of mineral assemblages during aggregation and disaggregation, which could potentially provide binding energetics and kinetic parameters for organic-mineral and mineral-mineral systems.

  1. Studies Introducing Costimulation Blockade for Vascularized Composite Allografts in Non-Human Primates

    PubMed Central

    Freitas, AM; Samy, KP; Farris, AB; Leopardi, FV; Song, M; Stempora, L; Strobert, EA; Jenkins, JA; Kirk, AD; Cendales, LC

    2016-01-01

    Vascularized composite allografts (VCAs) are technically feasible. Similar to other organ transplants, VCAs are hampered by the toxicity and incomplete efficacy associated with conventional immunosuppression. Complications attributable to calcineurin inhibitors remain prevalent in the clinical cases reported to date, and these loom particularly large given the non-lifesaving nature of VCAs. Additionally, acute rejection remains almost ubiquitous, albeit controllable with current agents. Costimulation blockade offers the potential to provide prophylaxis from rejection without the adverse consequences of calcineurin-based regimens. In this study, we used a non-human-primate model of VCA in conjunction with immunosuppressive regimens containing combinations of B7-specific costimulation blockade, with and without adhesion blockade with LFA3-Ig, to determine what adjunctive role these agents could play in VCA transplantation when combined with more conventional agents. Compared to tacrolimus, the addition of belatacept improved rejection-free allograft survival. The combination with LFA3-Ig reduced CD2hi memory T cells but did not provide additional protection against allograft rejection and hindered protective immunity. Histology paralleled clinical histopathology and Banff grading. These data provide the basis for the study of costimulation blockade in VCA in a relevant preclinical model. PMID:26139552

  2. Constraints on holographic cosmologies from strong lensing systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cárdenas, Víctor H.; Bonilla, Alexander; Motta, Verónica

    We use strongly gravitationally lensed (SGL) systems to put additional constraints on a set of holographic dark energy models. Data available in the literature (redshift and velocity dispersion) are used to obtain the Einstein radius and compare it with model predictions. We found that ΛCDM is the best fit to the data. Although a preliminary statistical analysis seems to indicate that two of the holographic models studied show interesting agreement with observations, a more stringent test led us to the result that neither of the holographic models is competitive with ΛCDM. These results highlight the importance of strong lensing measurements in providing additional observational constraints to alternative cosmological models, which are necessary to shed some light on the dark universe.

  3. Development of a New Data Tool for Computing Launch and Landing Availability with Respect to Surface Weather

    NASA Technical Reports Server (NTRS)

    Burns, K. Lee; Altino, Karen

    2008-01-01

    The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Its existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability for an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program with the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun on a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the existing tool.
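
    The climatological-availability calculation described above reduces, in its simplest form, to the empirical probability that every weather constraint is satisfied in a historical sample. The sketch below illustrates this with synthetic weather data and hypothetical constraint limits; it is not the APRA implementation.

```python
# Empirical launch availability for a single attempt: the fraction of a
# historical (here synthetic) weather sample in which all constraints hold.
import numpy as np

rng = np.random.default_rng(1)
# synthetic "climatology": peak wind (kt) and temperature (deg F)
wind = rng.weibull(2.0, 10_000) * 12.0
temp = rng.normal(75.0, 10.0, 10_000)

# hypothetical vehicle constraint limits
ok = (wind <= 20.0) & (temp >= 40.0) & (temp <= 95.0)
print(f"launch availability: {ok.mean():.1%}")
```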

  4. Additively Manufactured IN718 Components with Wirelessly Powered and Interrogated Embedded Sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Attridge, Paul; Bajekal, Sanjay; Klecka, Michael

    A methodology is described for embedding commercial-off-the-shelf sensors, together with wireless communication and power circuit elements, in components produced by direct metal laser sintering. Physics-based models of the additive manufacturing processes and sensor/wireless-level performance models guided the design and embedment processes. A combination of cold spray deposition and laser engineered net shaping was used to fashion the transmitter/receiver elements and embed the sensors, thereby providing environmental protection and component robustness/survivability for harsh conditions. By design, this complement of analog and digital sensors was wirelessly powered and interrogated using a health and utilization monitoring system, enabling real-time, in situ prognostics and diagnostics.

  5. Thinking Developmentally: The Next Evolution in Models of Health.

    PubMed

    Garner, Andrew S

    2016-09-01

    As the basic sciences that inform conceptions of human health advance, so must the models that are used to frame additional research, to teach the next generation of providers, and to inform health policy. This article briefly reviews the evolution from a biomedical model to a biopsychosocial (BPS) model and to an ecobiodevelopmental (EBD) model. Like the BPS model, the EBD model reaffirms the biological significance of psychosocial features within the patient's ecology, but it does so at the molecular and cellular levels. More importantly, the EBD model adds the dimension of time, forcing providers to "think developmentally" and to acknowledge the considerable biological and psychological consequences of previous experiences. For the health care system to move from a reactive "sick care" system to a proactive "well care" system, all providers must begin thinking developmentally by acknowledging the dynamic but cumulative dance between nature and nurture that drives development, behavior, and health, not only in childhood, but across the lifespan.

  6. Stability and value of male care for offspring: is it worth only half the trouble?

    PubMed

    Fromhage, Lutz; McNamara, John M; Houston, Alasdair I

    2007-06-22

    Models of parental investment often assume a trade-off for males between providing care and seeking additional mating opportunities. It is not obvious, however, how such additional matings should be accounted for in a consistent population model, because deserting males might increase their fertilization success at the cost of either caring males, other deserting males or both. Here, we present a game theory model that addresses all of these possibilities in a general way. In contrast to earlier work, we find that the source of deserting males' additional matings is irrelevant to the evolutionary stability of male care. We reject the claim that fitness gains through male care are intrinsically less valuable than those through desertion, and that the former must therefore be down-weighted by 1/2 when compared with the latter.

  7. User's manual for the Composite HTGR Analysis Program (CHAP-1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.

    1977-03-01

    CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.

  8. Determining relative error bounds for the CVBEM

    USGS Publications Warehouse

    Hromadka, T.V.

    1985-01-01

    The Complex Variable Boundary Element Method (CVBEM) provides a measure of relative error which can be utilized to subsequently reduce the error or provide information for further modeling analysis. By maximizing the relative error norm on each boundary element, a bound on the total relative error for each boundary element can be evaluated. This bound can be utilized to test CVBEM convergence, to analyze the effects of additional boundary nodal points in reducing the modeling error, and to evaluate the sensitivity of the modeling error within a boundary element to the error produced in another boundary element as a function of geometric distance.
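
    A rough sketch in the spirit of the per-element error measure described above: sample approximate and reference boundary values on each element, take the maximum relative error per element, and use the per-element maxima to flag elements that need additional nodal points. The boundary values here are synthetic stand-ins, not CVBEM output.

```python
# Per-element maximum relative error as a refinement indicator.
import numpy as np

def element_error_bounds(phi_approx, phi_exact):
    """phi_*: lists of arrays, one array of sampled boundary values per element."""
    bounds = []
    for a, e in zip(phi_approx, phi_exact):
        rel = np.abs(a - e) / np.maximum(np.abs(e), 1e-12)
        bounds.append(rel.max())        # max relative error norm on this element
    return np.array(bounds)

# two synthetic elements; the second is deliberately under-resolved
exact = [np.linspace(1.0, 2.0, 50), np.linspace(2.0, 5.0, 50)]
approx = [exact[0] * (1 + 1e-3),
          exact[1] + 0.1 * np.sin(np.linspace(0, 6, 50))]
b = element_error_bounds(approx, exact)
print("per-element bounds:", b, "-> refine element", int(b.argmax()))
```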

  9. On a two-particle bound system on the half-line

    NASA Astrophysics Data System (ADS)

    Kerner, Joachim; Mühlenbruch, Tobias

    2017-10-01

    In this paper we provide an extension of the model discussed in [10] describing two singularly interacting particles on the half-line ℝ+. In this model, the particles interact only when at least one particle is situated at the origin. Stimulated by [11], we then provide a generalisation of this model in order to include additional interactions between the particles, leading to a molecular-like state. We give a precise mathematical formulation of the Hamiltonian of the system and perform spectral analysis. In particular, we are interested in the effect of the singular two-particle interactions on the molecule.

  10. The NASA Marshall Space Flight Center Earth Global Reference Atmospheric Model-2010 Version

    NASA Technical Reports Server (NTRS)

    Leslie, F. W.; Justus, C. G.

    2011-01-01

    Reference or standard atmospheric models have long been used for design and mission planning of various aerospace systems. The NASA Marshall Space Flight Center Global Reference Atmospheric Model was developed in response to the need for a design reference atmosphere that provides complete global geographical variability and complete altitude coverage (surface to orbital altitudes), as well as complete seasonal and monthly variability of the thermodynamic variables and wind components. In addition to providing the geographical, height, and monthly variation of the mean atmospheric state, it includes the ability to simulate spatial and temporal perturbations.

  11. Implications of complete watershed soil moisture measurements to hydrologic modeling

    NASA Technical Reports Server (NTRS)

    Engman, E. T.; Jackson, T. J.; Schmugge, T. J.

    1983-01-01

    A series of six microwave data collection flights for measuring soil moisture were made over a small 7.8 square kilometer watershed in southwestern Minnesota. These flights were made to provide 100 percent coverage of the basin at a 400 m resolution. In addition, three flight lines were flown at preselected areas to provide a sample of data at a higher resolution of 60 m. The low level flights provide considerably more information on soil moisture variability. The results are discussed in terms of reproducibility, spatial variability and temporal variability, and their implications for hydrologic modeling.

  12. Importance of Personalized Health-Care Models: A Case Study in Activity Recognition.

    PubMed

    Zdravevski, Eftim; Lameski, Petre; Trajkovik, Vladimir; Pombo, Nuno; Garcia, Nuno

    2018-01-01

    Novel information and communication technologies create possibilities to change the future of health care. Ambient Assisted Living (AAL) is seen as a promising supplement to current care models. The main goal of AAL solutions is to apply ambient intelligence technologies to enable elderly people to continue to live in their preferred environments. Applying trained models from health data is challenging because personalized environments can differ significantly from the ones that provided the training data. This paper investigates the effects on activity recognition accuracy, using a single accelerometer, of personalized models compared to models built on the general population. In addition, we propose a collaborative-filtering-based approach that provides a balance between fully personalized models and generic models. The results show that accuracy can be improved to 95% with fully personalized models, and up to 91.6% with collaborative-filtering-based models, which is significantly better than common models that exhibit an accuracy of 85.1%. The collaborative filtering approach appears to provide highly personalized models with substantial accuracy, while overcoming the cold-start problem that is common for fully personalized models.
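
    A minimal sketch of the three model flavours compared above, on synthetic accelerometer-like features: a generic model trained on all other users, a fully personalized model, and a collaborative-filtering-style model trained on the k most similar users. The feature generation, similarity measure, k, and classifier choice are all assumptions for illustration.

```python
# Generic vs. personalized vs. collaborative-filtering-style activity models.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
n_users, n_per_user = 20, 200
# per-user bias makes each "personalized environment" differ from the population
bias = rng.normal(0, 1.5, size=(n_users, 3))
X = np.concatenate([rng.normal(bias[u], 1.0, size=(n_per_user, 3))
                    for u in range(n_users)])
user = np.repeat(np.arange(n_users), n_per_user)
y = (X[:, 0] + X[:, 1] > np.repeat(bias[:, 0], n_per_user)).astype(int)  # synthetic label

target = 0
train_t = (user == target) & (np.arange(len(y)) % 2 == 0)   # target's training half
test_t = (user == target) & (np.arange(len(y)) % 2 == 1)    # target's test half

def fit_score(mask):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[mask], y[mask])
    return clf.score(X[test_t], y[test_t])

# user similarity = distance between mean feature vectors; keep k = 5 neighbours
centroids = np.array([X[user == u].mean(axis=0) for u in range(n_users)])
dists = np.linalg.norm(centroids - centroids[target], axis=1)
neighbours = np.argsort(dists)[:5]

print("generic      :", fit_score(user != target))
print("personalized :", fit_score(train_t))
print("collaborative:", fit_score(np.isin(user, neighbours) & ~test_t))
```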

  13. Application of System Operational Effectiveness Methodology to Space Launch Vehicle Development and Operations

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.; Kelley, Gary W.

    2012-01-01

    The Department of Defense (DoD) defined System Operational Effectiveness (SOE) model provides an exceptional framework for an affordable approach to the development and operation of space launch vehicles and their supporting infrastructure. The SOE model provides a focal point from which to direct and measure technical effectiveness and process efficiencies of space launch vehicles. The application of the SOE model to a space launch vehicle's development and operation effort leads to very specific approaches and measures that require consideration during the design phase. This paper provides a mapping of the SOE model to the development of space launch vehicles for human exploration by addressing the SOE model key points of measurement including System Performance, System Availability, Technical Effectiveness, Process Efficiency, System Effectiveness, Life Cycle Cost, and Affordable Operational Effectiveness. In addition, the application of the SOE model to the launch vehicle development process is defined providing the unique aspects of space launch vehicle production and operations in lieu of the traditional broader SOE context that examines large quantities of fielded systems. The tailoring and application of the SOE model to space launch vehicles provides some key insights into the operational design drivers, capability phasing, and operational support systems.

  14. Three-dimensional cell culture models for investigating human viruses.

    PubMed

    He, Bing; Chen, Guomin; Zeng, Yi

    2016-10-01

    Three-dimensional (3D) culture models are physiologically relevant, as they provide reproducible results, experimental flexibility and can be adapted for high-throughput experiments. Moreover, these models bridge the gap between traditional two-dimensional (2D) monolayer cultures and animal models. 3D culture systems have significantly advanced basic cell science and tissue engineering, especially in the fields of cell biology and physiology, stem cell research, regenerative medicine, cancer research, drug discovery, and gene and protein expression studies. In addition, 3D models can provide unique insight into bacteriology, virology, parasitology and host-pathogen interactions. This review summarizes and analyzes recent progress in human virological research with 3D cell culture models. We discuss viral growth, replication, proliferation, infection, virus-host interactions and antiviral drugs in 3D culture models.

  15. Some considerations on the use of ecological models to predict species' geographic distributions

    USGS Publications Warehouse

    Peterjohn, B.G.

    2001-01-01

    Peterson (2001) used Genetic Algorithm for Rule-set Prediction (GARP) models to predict distribution patterns from Breeding Bird Survey (BBS) data. Evaluations of these models should consider inherent limitations of BBS data: (1) BBS methods may not sample species and habitats equally; (2) using BBS data for both model development and testing may overlook poor fit of some models; and (3) BBS data may not provide the desired spatial resolution or capture temporal changes in species distributions. The predictive value of GARP models requires additional study, especially comparisons with distribution patterns from independent data sets. When employed at appropriate temporal and geographic scales, GARP models show considerable promise for conservation biology applications but provide limited inferences concerning processes responsible for the observed patterns.

  16. Atmospheric numerical modeling resource enhancement and model convective parameterization/scale interaction studies

    NASA Technical Reports Server (NTRS)

    Cushman, Paula P.

    1993-01-01

    Research will be undertaken under this contract in the area of modeling resource and facilities enhancement, including computer, technical, and educational support to NASA investigators to facilitate model implementation, execution, and analysis of output; facilities linking USRA and the NASA/EADS Computer System as well as resident workstations in ESAD; and a centralized location for documentation, archival, and dissemination of modeling information pertaining to NASA's program. Additional research will be undertaken in the area of numerical model scale interaction/convective parameterization studies, including comparison of cloud and rain systems and convective-scale processes between model simulations and observations, and incorporation of these and related research findings in at least two refereed journal articles.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanchat, Thomas K.; Jernigan, Dann A.

    A set of experiments and test data are outlined in this report that provide radiation intensity data for the validation of models of the radiative transfer equation. The experiments were performed with lightly sooting liquid hydrocarbon fuels that yielded fully turbulent fires (2 m diameter). In addition, supplemental measurements of air flow and temperature, fuel temperature and burn rate, and flame surface emissive power, wall heat flux, and flame height and width provide a complete set of boundary condition data needed for validation of models used in fire simulations.

  18. Large-scale Watershed Modeling: NHDPlus Resolution with Achievable Conservation Scenarios in the Western Lake Erie Basin

    NASA Astrophysics Data System (ADS)

    Yen, H.; White, M. J.; Arnold, J. G.; Keitzer, S. C.; Johnson, M. V. V.; Atwood, J. D.; Daggupati, P.; Herbert, M. E.; Sowa, S. P.; Ludsin, S.; Robertson, D. M.; Srinivasan, R.; Rewa, C. A.

    2016-12-01

    Owing to substantial improvements in computer technology, large-scale watershed modeling has become practically feasible for detailed investigations of hydrologic, sediment, and nutrient processes. In the Western Lake Erie Basin (WLEB), water quality issues caused by anthropogenic activities are not just interesting research subjects but have implications for human health and welfare, as well as ecological integrity, resistance, and resilience. In this study, the Soil and Water Assessment Tool (SWAT) and the finest-resolution stream network, NHDPlus, were implemented on the WLEB to examine the interactions between achievable conservation scenarios and the corresponding additional projected costs. During the calibration/validation processes, both hard (temporal) and soft (non-temporal) data were used to ensure the modeling outputs are coherent with actual watershed behavior. The results showed that widespread adoption of conservation practices intended to provide erosion control could deliver average reductions of sediment and nutrients without additional nutrient management changes. On the other hand, responses of nitrate (NO3) and dissolved inorganic phosphorus (DIP) dynamics may differ from responses of total nitrogen and total phosphorus dynamics under the same conservation practice. Model results also implied that fewer financial resources are required to achieve conservation goals if the goal is to achieve reductions in targeted watershed outputs (e.g., NO3 or DIP) rather than aggregated outputs (e.g., total nitrogen or total phosphorus). In addition, it was found that the model's capacity to simulate seasonal effects and responses to changing conservation adoption on a seasonal basis could provide a useful index to help reduce additional cost through temporal targeting of conservation practices. Scientists, engineers, and stakeholders can take advantage of the work performed in this study as essential information in future policy-making processes.

  19. Using Geometry-Based Metrics as Part of Fitness-for-Purpose Evaluations of 3D City Models

    NASA Astrophysics Data System (ADS)

    Wong, K.; Ellul, C.

    2016-10-01

    Three-dimensional geospatial information is being increasingly used in a range of tasks beyond visualisation. 3D datasets, however, are often produced without exact specifications and at mixed levels of geometric complexity. This leads to variations within the models' geometric and semantic complexity as well as the degree of deviation from the corresponding real-world objects. Existing descriptors and measures of 3D data, such as CityGML's level of detail, are perhaps only partially sufficient in communicating data quality and fitness-for-purpose. This study investigates whether alternative, automated, geometry-based metrics describing the variation of complexity within 3D datasets could provide additional relevant information as part of a process of fitness-for-purpose evaluation. The metrics include: mean vertex/edge/face counts per building; vertex/face ratio; minimum 2D footprint area; and minimum feature length. Each metric was tested on six 3D city models from international locations. The results show that geometry-based metrics can provide additional information on 3D city models as part of fitness-for-purpose evaluations. The metrics cannot be used in isolation, but they may complement and enhance existing data descriptors if backed up with local knowledge, where possible.
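
    A small sketch of how the per-building metrics listed above might be computed from a simple mesh representation. The input structure and the bounding-rectangle footprint proxy are assumptions for illustration, not the study's implementation.

```python
# Geometry-based complexity metrics for a set of building meshes.
import numpy as np

def footprint_area(vertices):
    """Crude footprint proxy: area of the axis-aligned bounding rectangle in XY."""
    xy = vertices[:, :2]
    dx, dy = xy.max(axis=0) - xy.min(axis=0)
    return dx * dy

def geometry_metrics(buildings):
    """buildings: list of dicts with 'vertices' (N,3) and 'faces' (M,3) arrays."""
    v_counts = np.array([len(b["vertices"]) for b in buildings])
    f_counts = np.array([len(b["faces"]) for b in buildings])
    return {
        "mean vertices/building": v_counts.mean(),
        "mean faces/building": f_counts.mean(),
        "vertex/face ratio": v_counts.sum() / f_counts.sum(),
        "min 2D footprint area": min(footprint_area(b["vertices"]) for b in buildings),
    }

# a 4 m x 3 m x 5 m box as a stand-in building (12 triangles for the shell)
box = {"vertices": np.array([[0, 0, 0], [4, 0, 0], [4, 3, 0], [0, 3, 0],
                             [0, 0, 5], [4, 0, 5], [4, 3, 5], [0, 3, 5]], float),
       "faces": np.zeros((12, 3), int)}
print(geometry_metrics([box, box]))
```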

  20. Digital hydrologic networks supporting applications related to spatially referenced regression modeling

    USGS Publications Warehouse

    Brakebill, John W.; Wolock, David M.; Terziotti, Silvia

    2011-01-01

    Digital hydrologic networks depicting surface-water pathways and their associated drainage catchments provide a key component to hydrologic analysis and modeling. Collectively, they form common spatial units that can be used to frame the descriptions of aquatic and watershed processes. In addition, they provide the ability to simulate and route the movement of water and associated constituents throughout the landscape. Digital hydrologic networks have evolved from derivatives of mapping products to detailed, interconnected, spatially referenced networks of water pathways, drainage areas, and stream and watershed characteristics. These properties are important because they enhance the ability to spatially evaluate factors that affect the sources and transport of water-quality constituents at various scales. SPAtially Referenced Regressions On Watershed attributes (SPARROW), a process-based/statistical model, relies on a digital hydrologic network in order to establish relations between quantities of monitored contaminant flux, contaminant sources, and the associated physical characteristics affecting contaminant transport. Digital hydrologic networks modified from the River Reach File (RF1) and National Hydrography Dataset (NHD) geospatial datasets provided frameworks for SPARROW in six regions of the conterminous United States. In addition, characteristics of the modified RF1 were used to update estimates of mean-annual streamflow. This produced more current flow estimates for use in SPARROW modeling.

  1. Reconstruction of f(T)-gravity in the absence of matter

    NASA Astrophysics Data System (ADS)

    El Hanafy, W.; Nashed, G. G. L.

    2016-06-01

    We derive an exact f(T) gravity in the absence of ordinary matter in a Friedmann-Robertson-Walker (FRW) universe, where T is the teleparallel torsion scalar. We show that vanishing of the energy-momentum tensor T^{μν} of matter does not imply vanishing of the teleparallel torsion scalar, in contrast to general relativity, where the Ricci scalar vanishes. The theory provides an exponential (inflationary) scale factor independent of the choice of the sectional curvature. In addition, the obtained f(T) acts just like a cosmological constant in the flat-space model; nevertheless, it is dynamical in non-flat models. In particular, the open universe provides a decaying pattern of the f(T), contributing directly to solving the fine-tuning problem of the cosmological constant. The equation of state (EoS) of the torsion vacuum fluid has been studied in positive and negative Hubble regimes. We study the case when the torsion is made of a scalar field (tlaplon) which acts as a torsion potential. This treatment makes it possible to induce a tlaplon field sensitive to the symmetry of the spacetime, in addition to the reconstruction of its effective potential from the f(T) theory. The theory provides six different versions of inflationary models; the real solutions are mainly quadratic, while the complex solutions, remarkably, provide a Higgs-like potential.

  2. A Literature Survey and Experimental Evaluation of the State-of-the-Art in Uplift Modeling: A Stepping Stone Toward the Development of Prescriptive Analytics.

    PubMed

    Devriendt, Floris; Moldovan, Darie; Verbeke, Wouter

    2018-03-01

    Prescriptive analytics extends predictive analytics by estimating an outcome as a function of control variables, thus allowing the level of the control variables required to realize a desired outcome to be established. Uplift modeling is at the heart of prescriptive analytics and aims at estimating the net difference in an outcome resulting from a specific action or treatment that is applied. In this article, a structured and detailed literature survey on uplift modeling is provided by identifying and contrasting various groups of approaches. In addition, evaluation metrics for assessing the performance of uplift models are reviewed. An experimental evaluation on four real-world data sets provides further insight into their use. Uplift random forests are found to be consistently among the best-performing techniques in terms of the Qini and Gini measures, although considerable variability in performance across the various data sets of the experiments is observed. In addition, uplift models are frequently observed to be unstable, displaying strong variability in performance across different folds in the cross-validation experimental setup. This potentially threatens their actual use for business applications. Moreover, it is found that the available evaluation metrics do not provide an intuitively understandable indication of the actual use and performance of a model. Specifically, existing evaluation metrics do not facilitate a comparison of uplift models and predictive models, and they evaluate performance either at an arbitrary cutoff or over the full spectrum of potential cutoffs. In conclusion, we highlight the instability of uplift models and the need for an application-oriented approach to assessing uplift models as prime topics for further research.
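
    For readers unfamiliar with the Qini measure mentioned above, the following sketch computes one common variant: rank units by predicted uplift, accumulate incremental responses in treatment versus control along that ranking, and average the difference between this curve and the random-targeting diagonal. The data are synthetic and the normalization is one of several in use.

```python
# A compact Qini-coefficient sketch for evaluating an uplift score.
import numpy as np

def qini_coefficient(uplift_score, treated, outcome):
    order = np.argsort(-uplift_score)
    t = treated[order].astype(float)
    y = outcome[order].astype(float)
    n_t, n_c = np.cumsum(t), np.cumsum(1 - t)
    y_t, y_c = np.cumsum(y * t), np.cumsum(y * (1 - t))
    # incremental responses if the top-k were treated (control scaled to n_t)
    gain = y_t - y_c * n_t / np.maximum(n_c, 1)
    random_gain = gain[-1] * np.arange(1, len(gain) + 1) / len(gain)
    return (gain - random_gain).sum() / len(gain)

rng = np.random.default_rng(3)
n = 5000
true_uplift = rng.normal(0.1, 0.05, n)       # each unit's real treatment effect
treated = rng.integers(0, 2, n)
outcome = rng.random(n) < 0.2 + true_uplift * treated

print("Qini, informed score:", qini_coefficient(true_uplift, treated, outcome))
print("Qini, random score  :", qini_coefficient(rng.random(n), treated, outcome))
```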

  3. Approximations to camera sensor noise

    NASA Astrophysics Data System (ADS)

    Jin, Xiaodan; Hirakawa, Keigo

    2013-02-01

    Noise is present in all image sensor data. The Poisson distribution models the stochastic nature of the photon arrival process, while it is common to approximate readout/thermal noise by additive white Gaussian noise (AWGN). Other sources of signal-dependent noise, such as Fano and quantization noise, also contribute to the overall noise profile. The question remains, however, of how best to model the combined sensor noise. Though additive Gaussian noise with signal-dependent noise variance (SD-AWGN) and Poisson corruption are two widely used models to approximate the actual sensor noise distribution, the justifications given for these models are based on limited evidence. The goal of this paper is to provide a more comprehensive characterization of random noise. We conclude by presenting concrete evidence that the Poisson model is a better approximation to real camera sensor noise than SD-AWGN. We suggest further modifications to the Poisson model that may improve the noise model.
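
    The two approximations contrasted above can be compared directly by simulation. The sketch below matches the first two moments of a Poisson-plus-readout model and an SD-AWGN model and shows that their higher-order statistics still differ; the gain and read-noise values are assumed.

```python
# Poisson vs. signal-dependent AWGN for an expected photo-electron count x.
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(5, 200, 10_000)          # expected electrons per pixel
read_noise = 3.0                         # electrons RMS (assumed)

poisson = rng.poisson(x) + rng.normal(0, read_noise, x.size)  # Poisson + readout
sd_awgn = x + rng.normal(0, np.sqrt(x + read_noise**2))       # matched-variance Gaussian

# both models have mean ~x and variance ~x + read_noise^2, but their
# higher-order statistics (e.g. skewness at low counts) differ
for name, s in [("poisson", poisson), ("sd-awgn", sd_awgn)]:
    r = s - x
    print(name, "var:", r.var().round(1),
          "skew:", ((r**3).mean() / r.std()**3).round(3))
```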

  4. Mouse Models for Drug Discovery. Can New Tools and Technology Improve Translational Power?

    PubMed Central

    Zuberi, Aamir; Lutz, Cathleen

    2016-01-01

    The use of mouse models in biomedical research and preclinical drug evaluation is on the rise. The advent of new molecular genome-altering technologies such as CRISPR/Cas9 allows genetic mutations to be introduced into the germ line of a mouse faster and less expensively than previous methods. In addition, the rapid progress in the development and use of somatic transgenesis using viral vectors, as well as manipulations of gene expression with siRNAs and antisense oligonucleotides, allows for even greater exploration into genomics and systems biology. These technological advances come at a time when cost reductions in genome sequencing have led to the identification of pathogenic mutations in patient populations, providing unprecedented opportunities in the use of mice to model human disease. The ease of genetic engineering in mice also offers a potential paradigm shift in resource sharing and the speed with which models are made available in the public domain. Predictably, the knowledge alone that a model can be quickly remade will provide relief to resources encumbered by licensing and Material Transfer Agreements. For decades, mouse strains have provided an exquisite experimental tool to study the pathophysiology of disease and assess therapeutic options in a genetically defined system. However, a major limitation of the mouse has been the limited genetic diversity associated with common laboratory mice. This has been overcome with the recent development of the Collaborative Cross and Diversity Outbred mice. These strains provide new tools capable of replicating genetic diversity approaching that found in human populations. The Collaborative Cross and Diversity Outbred strains thus provide a means to observe and characterize the toxicity or efficacy of new therapeutic drugs for a given population. The combination of traditional and contemporary mouse genome-editing tools, along with the addition of genetic diversity in new modeling systems, is synergistic and serves to make the mouse a better model for biomedical research, enhancing the potential for preclinical drug discovery and personalized medicine. PMID:28053071

  5. Spatial structuring within a reservoir fish population: implications for management

    USGS Publications Warehouse

    Stewart, David R.; Long, James M.; Shoup, Daniel E.

    2014-01-01

    Spatial structuring in reservoir fish populations can exist because of environmental gradients, species-specific behaviour, or even localised fishing effort. The present study investigated whether white crappie exhibited evidence of improved population structure where the northern, more productive half of a lake is closed to fishing to provide waterfowl hunting opportunities. Population response to angling was modelled for each substock of white crappie (north (protected) and south (unprotected) areas), for the entire lake (single-stock model), and by combining simulations of the two independent substock models (additive model). White crappie in the protected area were more abundant, consisted of larger, older individuals, and exhibited a lower total annual mortality rate than in the unprotected area. Population modelling found that fishing mortality rates between 0.1 and 0.3 resulted in sustainable populations (spawning potential ratios (SPR) > 0.30). The population in the unprotected area appeared to be more resilient (SPR > 0.30) at the higher fishing intensities (0.35–0.55). Considered additively, the whole-lake fishery appeared more resilient than when modelled as a single panmictic stock. These results provided evidence of spatial structuring in reservoir fish populations, and we recommend that model assessments used to guide management decisions consider such spatial differences in other populations where they exist.

  6. Flood damage estimation of companies: A comparison of Stage-Damage-Functions and Random Forests

    NASA Astrophysics Data System (ADS)

    Sieg, Tobias; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno

    2017-04-01

    The development of appropriate flood damage models plays an important role not only for damage assessment after an event but also for developing adaptation and risk mitigation strategies. So-called Stage-Damage-Functions (SDFs) are often applied as a standard approach to estimate flood damage. These functions assign a certain damage to the water depth, depending on the use or other characteristics of the exposed objects. Recent studies apply machine learning algorithms like Random Forests (RFs) to model flood damage. These algorithms usually consider more influencing variables and promise to give a more detailed insight into the damage processes. In addition, they provide an inherent validation scheme. Our study focuses on the direct, tangible damage of single companies. The objective is to model and validate the flood damage suffered by single companies with SDFs and RFs. The data sets used are taken from two surveys conducted after the floods in the Elbe and Danube catchments in the years 2002 and 2013 in Germany. Damage to buildings (n = 430), equipment (n = 651), as well as goods and stock (n = 530) is taken into account. The model outputs are validated via a comparison with the actual flood damage acquired by the surveys and subsequently compared with each other. This study investigates the gain in model performance with the use of additional data and the advantages and disadvantages of RFs compared to SDFs. RFs show an increase in model performance with an increasing number of data records over a comparatively large range, while the model performance of the SDFs already saturates for a small set of records. In addition, the RFs are able to identify damage-influencing variables, which improves the understanding of damage processes. Hence, RFs can slightly improve flood damage predictions and provide additional insight into the underlying mechanisms compared to SDFs.
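
    A minimal sketch of the two model families compared above: a stage-damage function built as the mean loss per depth class, and a random forest using depth plus one extra predictor. The data and the square-root depth-damage shape are synthetic assumptions, not the surveyed company data.

```python
# Stage-damage function vs. random forest on synthetic flood-loss data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
n = 800
depth = rng.uniform(0, 3, n)                      # water depth (m)
precaution = rng.integers(0, 2, n)                # e.g. adapted building use
loss = np.clip(0.4 * np.sqrt(depth) - 0.15 * precaution
               + rng.normal(0, 0.05, n), 0, 1)    # relative damage in [0, 1]

train, test = np.arange(n) < 600, np.arange(n) >= 600

# SDF: mean observed loss per 0.5 m depth class
bins = np.digitize(depth, [0.5, 1.0, 1.5, 2.0, 2.5])
sdf = np.array([loss[train & (bins == b)].mean() for b in range(6)])
sdf_pred = sdf[bins[test]]

# RF: depth plus the precaution indicator as predictors
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(np.c_[depth, precaution][train], loss[train])
rf_pred = rf.predict(np.c_[depth, precaution][test])

for name, pred in [("SDF", sdf_pred), ("RF ", rf_pred)]:
    print(name, "MAE:", np.abs(pred - loss[test]).mean().round(4))
```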

  7. Clinical and multiple gene expression variables in survival analysis of breast cancer: Analysis with the hypertabastic survival model

    PubMed Central

    2012-01-01

    Background: We explore the benefits of applying a new proportional hazard model to analyze survival of breast cancer patients. As a parametric model, the hypertabastic survival model offers a closer fit to experimental data than Cox regression, and furthermore provides explicit survival and hazard functions which can be used as additional tools in the survival analysis. In addition, one of our main concerns is utilization of multiple gene expression variables. Our analysis treats the important issue of interaction of different gene signatures in the survival analysis.

    Methods: The hypertabastic proportional hazards model was applied in survival analysis of breast cancer patients. This model was compared, using statistical measures of goodness of fit, with models based on the semi-parametric Cox proportional hazards model and the parametric log-logistic and Weibull models. The explicit functions for hazard and survival were then used to analyze the dynamic behavior of hazard and survival functions.

    Results: The hypertabastic model provided the best fit among all the models considered. Use of multiple gene expression variables also provided a considerable improvement in the goodness of fit of the model, as compared to use of only one. By utilizing the explicit survival and hazard functions provided by the model, we were able to determine the magnitude of the maximum rate of increase in hazard, and the maximum rate of decrease in survival, as well as the times when these occurred. We explore the influence of each gene expression variable on these extrema. Furthermore, in the cases of continuous gene expression variables, represented by a measure of correlation, we were able to investigate the dynamics with respect to changes in gene expression.

    Conclusions: We observed that use of three different gene signatures in the model provided a greater combined effect and allowed us to assess the relative importance of each in determination of outcome in this data set. These results point to the potential to combine gene signatures to a greater effect in cases where each gene signature represents some distinct aspect of the cancer biology. Furthermore we conclude that the hypertabastic survival models can be an effective survival analysis tool for breast cancer patients. PMID:23241496

  8. High energy seesaw models, GUTs and Leptogenesis

    NASA Astrophysics Data System (ADS)

    Di Bari, Pasquale

    2017-09-01

    I review high energy (type I) seesaw models and in particular how they can be nicely embedded within grand-unified models and reproduce the observed matter-antimatter asymmetry with leptogenesis. I also discuss how high energy (type I) seesaw models can provide a candidate for very heavy cold dark matter, within the TeV-EeV range, whose decays might explain part of the IceCube high energy neutrino events in addition to an astrophysical component.

  9. Demonstrating the Relationship between School Nurse Workload and Student Outcomes

    ERIC Educational Resources Information Center

    Daughtry, Donna; Engelke, Martha Keehner

    2018-01-01

    This article describes how one very large, diverse school district developed a Student Acuity Tool for School Nurse Assignment and used a logic model to successfully advocate for additional school nurse positions. The logic model included three student outcomes that were evaluated: provide medications and procedures safely and accurately, increase…

  10. Evaluation of Reliability Coefficients for Two-Level Models via Latent Variable Analysis

    ERIC Educational Resources Information Center

    Raykov, Tenko; Penev, Spiridon

    2010-01-01

    A latent variable analysis procedure for evaluation of reliability coefficients for 2-level models is outlined. The method provides point and interval estimates of group means' reliability, overall reliability of means, and conditional reliability. In addition, the approach can be used to test simple hypotheses about these parameters. The…

  11. Dynamics of buckbrush populations under simulated forest restoration alternatives

    Treesearch

    David W. Huffman; Margaret M. Moore

    2008-01-01

    Plant population models are valuable tools for assessing ecological tradeoffs between forest management approaches. In addition, these models can provide insight on plant life history patterns and processes important for persistence and recovery of populations in changing environments. In this study, we evaluated a set of ecological restoration alternatives for their...

  12. Dynamics of buckbrush populations under simulated forest restoration alternatives (P-53)

    Treesearch

    David W. Huffman; Margaret M. Moore

    2008-01-01

    Plant population models are valuable tools for assessing ecological tradeoffs between forest management approaches. In addition, these models can provide insight on plant life history patterns and processes important for persistence and recovery of populations in changing environments. In this study, we evaluated a set of ecological restoration alternatives for their...

  13. 3-D and quasi-2-D discrete element modeling of grain commingling in a bucket elevator boot system

    USDA-ARS?s Scientific Manuscript database

    Unwanted grain commingling impedes new quality-based grain handling systems and has proven to be an expensive and time consuming issue to study experimentally. Experimentally validated models may reduce the time and expense of studying grain commingling while providing additional insight into detail...

  14. Career Education Facilities: A Planning Guide for Space and Station Requirements. A Report

    ERIC Educational Resources Information Center

    Woodruff, Alan P.

    This publication provides the educational planner and the architect with some suggestions concerning models by which they may plan new flexible-use, shared-space facilities and supports the models with guidelines for the development of facilities and educational programs for occupational education. In addition to discussing the financial…

  15. Best Longitudinal Adjustment of Satellite Trajectories for the Observation of Forest Fires (Blastoff): A Stochastic Programming Approach to Satellite System Design

    NASA Astrophysics Data System (ADS)

    Hoskins, Aaron B.

    Forest fires cause a significant amount of damage and destruction each year. Optimally dispatching resources reduces the amount of damage a forest fire can cause. Models predict the fire spread to provide the data required to optimally dispatch resources. However, the models are only as accurate as the data used to build them. Satellites are one valuable tool in the collection of data for forest fire models, providing data on the types of vegetation, the wind speed and direction, the soil moisture content, etc. The current operating paradigm is to passively collect data when possible; however, images from directly overhead provide better resolution and are easier to process. Maneuvering a constellation of satellites to fly directly over the forest fire therefore provides higher-quality data than is achieved with the current operating paradigm. Before launch, the location of the forest fire is unknown, so it is impossible to optimize the initial orbits for the satellites. Instead, the expected cost of maneuvering to observe the forest fire determines the optimal initial orbits. A two-stage stochastic programming approach is well suited for this class of problem, where initial decisions are made under an uncertain future and subsequent decisions are made once a scenario is realized. A repeat ground track orbit provides a non-maneuvering, natural solution with a daily flyover of the forest fire, while additional maneuvers provide a second daily flyover. The additional maneuvering comes at a significant cost in terms of fuel but provides more data collection opportunities. After data are collected, ground stations receive the data for processing. Optimally selecting the ground station locations reduces the number of ground stations that must be built and reduces data fusion issues; however, the location of the forest fire alters the optimal ground station sites. A two-stage stochastic programming approach optimizes the selection of ground stations to maximize the expected amount of data downloaded from a satellite. Selecting initial orbits and ground station locations under uncertainty in this way yields a robust system that reduces the amount of damage caused by forest fires.
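
    The two-stage structure described above can be illustrated with a toy problem: pick one initial orbit before the fire location is known (first stage), then pay the scenario-dependent maneuver cost (second stage). The orbits, costs, and scenario probabilities below are invented; a real model would derive them from astrodynamics.

```python
# Toy two-stage stochastic program: choose the initial orbit that minimizes
# the expected second-stage maneuver cost over fire-location scenarios.
import numpy as np

orbits = ["A", "B", "C"]                       # candidate initial orbits
scenarios = {"northwest": 0.5, "southeast": 0.3, "plains": 0.2}

# delta-v cost (m/s) of serving each fire scenario from each initial orbit
cost = np.array([[120,  40, 300],              # orbit A
                 [ 60, 150, 180],              # orbit B
                 [200, 200,  90]])             # orbit C

p = np.array(list(scenarios.values()))
expected = cost @ p                            # expected recourse cost per orbit
best = int(expected.argmin())
print(dict(zip(orbits, expected.round(1))), "-> choose orbit", orbits[best])
```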

  16. Quality Assurance in Higher Education: A Review of Literature

    ERIC Educational Resources Information Center

    Ryan, Tricia

    2015-01-01

    This paper examines the literature surrounding quality assurance in global higher education. It provides an overview of accreditation as a mechanism to ensure quality in higher education, examines models of QA, and explores the concept of quality (including definitions of quality and quality assurance). In addition, this paper provides a review of…

  17. Statistical properties of a filtered Poisson process with additive random noise: distributions, correlations and moment estimation

    NASA Astrophysics Data System (ADS)

    Theodorsen, A.; E Garcia, O.; Rypdal, M.

    2017-05-01

    Filtered Poisson processes are often used as reference models for intermittent fluctuations in physical systems. Such a process is here extended by adding a noise term, either as a purely additive term to the process or as a dynamical term in a stochastic differential equation. The lowest order moments, probability density function, auto-correlation function and power spectral density are derived and used to identify and compare the effects of the two different noise terms. Monte-Carlo studies of synthetic time series are used to investigate the accuracy of model parameter estimation and to identify methods for distinguishing the noise types. It is shown that the probability density function and the three lowest order moments provide accurate estimations of the model parameters, but are unable to separate the noise types. The auto-correlation function and the power spectral density also provide methods for estimating the model parameters, as well as being capable of identifying the noise type. The number of times the signal crosses a prescribed threshold level in the positive direction also promises to be able to differentiate the noise type.
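
    A brief simulation sketch of the extended model described above: a filtered Poisson process built from one-sided exponential pulses with exponentially distributed amplitudes, plus a purely additive Gaussian noise term, with the sample moments checked against Campbell's theorem. Parameter values are illustrative.

```python
# Filtered Poisson process (exponential pulses) with additive Gaussian noise.
import numpy as np

rng = np.random.default_rng(6)
T, dt, tau = 10_000.0, 0.01, 1.0      # record length, sampling step, pulse duration
nu = 0.5                              # pulse arrival rate; gamma = nu * tau

t = np.arange(0.0, T, dt)
n_pulses = rng.poisson(nu * T)
arrivals = np.sort(rng.uniform(0, T, n_pulses))
amps = rng.exponential(1.0, n_pulses)          # exponentially distributed amplitudes

signal = np.zeros_like(t)
support = int(10 * tau / dt)                   # truncate each pulse after 10 tau
for t_k, a_k in zip(arrivals, amps):
    i0 = int(t_k / dt) + 1
    i1 = min(i0 + support, t.size)
    signal[i0:i1] += a_k * np.exp(-(t[i0:i1] - t_k) / tau)

noisy = signal + rng.normal(0, 0.2, t.size)    # purely additive noise term

gamma = nu * tau
print("sample mean:", signal.mean().round(3), " Campbell:", gamma * 1.0)      # nu*<A>*int(phi)
print("sample var :", signal.var().round(3), " Campbell:", nu * 2.0 * tau / 2)  # nu*<A^2>*int(phi^2)
```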

  18. Revisiting a model of ontogenetic growth: estimating model parameters from theory and data.

    PubMed

    Moses, Melanie E; Hou, Chen; Woodruff, William H; West, Geoffrey B; Nekola, Jeffery C; Zuo, Wenyun; Brown, James H

    2008-05-01

    The ontogenetic growth model (OGM) of West et al. provides a general description of how metabolic energy is allocated between production of new biomass and maintenance of existing biomass during ontogeny. Here, we reexamine the OGM, make some minor modifications and corrections, and further evaluate its ability to account for empirical variation in rates of metabolism and biomass in vertebrates, both during ontogeny and across species of varying adult body size. We show that the updated version of the model is internally consistent and is consistent with other predictions of metabolic scaling theory and empirical data. The OGM predicts not only the near-universal sigmoidal form of growth curves but also the M^{1/4} scaling of the characteristic times of ontogenetic stages, in addition to the curvilinear decline in growth efficiency described by Brody. Additionally, the OGM relates the M^{3/4} scaling across adults of different species to the scaling of metabolic rate across ontogeny within species. In providing a simple, quantitative description of how energy is allocated to growth, the OGM calls attention to unexplained variation, unanswered questions, and opportunities for future research.
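
    The core OGM growth equation, dm/dt = a m^{3/4} (1 - (m/M)^{1/4}), can be integrated directly and checked against its closed-form solution in the variable r = (m/M)^{1/4}, which satisfies 1 - r = (1 - r0) exp(-a t / (4 M^{1/4})). Parameter values below are illustrative rather than fitted.

```python
# Euler integration of the OGM and a check against the closed-form solution.
import numpy as np

a, M, m0 = 0.3, 1000.0, 1.0      # growth coefficient, asymptotic mass, birth mass
dt, steps = 0.1, 3000

m = np.empty(steps)
m[0] = m0
for k in range(steps - 1):
    m[k + 1] = m[k] + dt * a * m[k]**0.75 * (1 - (m[k] / M)**0.25)

# dimensionless mass ratio r = (m/M)^{1/4} relaxes exponentially toward 1
t = np.arange(steps) * dt
r = (m / M) ** 0.25
pred = 1 - (1 - (m0 / M) ** 0.25) * np.exp(-a * t / (4 * M ** 0.25))
print("max deviation from closed form:", np.abs(r - pred).max())
```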

  19. Animal models of polymicrobial pneumonia

    PubMed Central

    Hraiech, Sami; Papazian, Laurent; Rolain, Jean-Marc; Bregeon, Fabienne

    2015-01-01

    Pneumonia is one of the leading causes of severe and occasionally life-threatening infections. The physiopathology of pneumonia has been extensively studied, providing information for the development of new treatments for this condition. In addition to in vitro research, animal models have been largely used in the field of pneumonia. Several models have been described and have provided a better understanding of pneumonia under different settings and with various pathogens. However, the concept of one pathogen leading to one infection has been challenged, and recent flu epidemics suggest that some pathogens exhibit highly virulent potential. Although “two hits” animal models have been used to study infectious diseases, few of these models have been described in pneumonia. Therefore the aims of this review were to provide an overview of the available literature in this field, to describe well-studied and uncommon pathogen associations, and to summarize the major insights obtained from this information. PMID:26170617

  20. Functional Risk Modeling for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

    We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach thus yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effect of diverse backup, which often exists when two or more independent elements are connected together, is properly accounted for.
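
    For independent elements, the functional-diversity credit described above amounts to computing the availability of a function across all elements that provide it, rather than per element; a minimal sketch with invented element reliabilities follows.

```python
# Availability of a function provided redundantly by several elements.
from math import prod

# availabilities of each element currently able to provide the function
providers = {"habitat bus": 0.95, "rover power unit": 0.90}

# the function is lost only if every provider is down (independence assumed)
p_function = 1 - prod(1 - p for p in providers.values())
p_single = max(providers.values())

print(f"best single element : {p_single:.4f}")
print(f"functional diversity: {p_function:.4f}")
```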

  1. Stochastic and deterministic multiscale models for systems biology: an auxin-transport case study.

    PubMed

    Twycross, Jamie; Band, Leah R; Bennett, Malcolm J; King, John R; Krasnogor, Natalio

    2010-03-26

    Stochastic and asymptotic methods are powerful tools in developing multiscale systems biology models; however, little has been done in this context to compare the efficacy of these methods. The majority of current systems biology modelling research, including that of auxin transport, uses numerical simulations to study the behaviour of large systems of deterministic ordinary differential equations, with little consideration of alternative modelling frameworks. In this case study, we solve an auxin-transport model using analytical methods, deterministic numerical simulations and stochastic numerical simulations. Although the three approaches in general predict the same behaviour, the approaches provide different information that we use to gain distinct insights into the modelled biological system. We show in particular that the analytical approach readily provides straightforward mathematical expressions for the concentrations and transport speeds, while the stochastic simulations naturally provide information on the variability of the system. Our study provides a constructive comparison which highlights the advantages and disadvantages of each of the considered modelling approaches. This will prove helpful to researchers when weighing up which modelling approach to select. In addition, the paper goes some way to bridging the gap between these approaches, which in the future we hope will lead to integrative hybrid models.
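
    A minimal illustration of the kind of comparison made above, using a single birth-death process as a stand-in for one pool in a transport model: the deterministic ODE gives the mean behaviour, while a Gillespie stochastic simulation additionally exposes the variability. The rates are invented for demonstration.

```python
# Deterministic steady state vs. Gillespie simulation for a birth-death process
# (production at rate b, first-order decay at rate d): dx/dt = b - d*x.
import numpy as np

b, d, x0, T = 50.0, 0.5, 0, 20.0
rng = np.random.default_rng(7)

def gillespie():
    t, x = 0.0, x0
    while t < T:
        rates = np.array([b, d * x])          # birth and death propensities
        total = rates.sum()
        t += rng.exponential(1.0 / total)     # time to next reaction
        x += 1 if rng.random() < rates[0] / total else -1
    return x

final = np.array([gillespie() for _ in range(200)])
print("ODE steady state:", b / d)
print("SSA mean +/- std:", final.mean().round(1), "+/-", final.std().round(1))
# the stationary law here is Poisson(b/d), so std is about sqrt(b/d)
```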

  2. Modeling patients' acceptance of provider-delivered e-health.

    PubMed

    Wilson, E Vance; Lankton, Nancy K

    2004-01-01

    Health care providers are beginning to deliver a range of Internet-based services to patients; however, it is not clear which of these e-health services patients need or desire. The authors propose that patients' acceptance of provider-delivered e-health can be modeled in advance of application development by measuring the effects of several key antecedents to e-health use and applying models of acceptance developed in the information technology (IT) field. This study tested three theoretical models of IT acceptance among patients who had recently registered for access to provider-delivered e-health. An online questionnaire administered items measuring perceptual constructs from the IT acceptance models (intrinsic motivation, perceived ease of use, perceived usefulness/extrinsic motivation, and behavioral intention to use e-health) and five hypothesized antecedents (satisfaction with medical care, health care knowledge, Internet dependence, information-seeking preference, and health care need). Responses were collected and stored in a central database. All tested IT acceptance models performed well in predicting patients' behavioral intention to use e-health. Antecedent factors of satisfaction with provider, information-seeking preference, and Internet dependence uniquely predicted constructs in the models. Information technology acceptance models provide a means to understand which aspects of e-health are valued by patients and how this may affect future use. In addition, antecedents to the models can be used to predict e-health acceptance in advance of system development.

  3. Numerical Investigation of Plasma Detachment in Magnetic Nozzle Experiments

    NASA Technical Reports Server (NTRS)

    Sankaran, Kamesh; Polzin, Kurt A.

    2008-01-01

    At present there exists no generally accepted theoretical model that provides a consistent physical explanation of plasma detachment from an externally-imposed magnetic nozzle. To make progress towards that end, simulation of plasma flow in the magnetic nozzle of an arcjet experiment is performed using a multidimensional numerical simulation tool that includes theoretical models of the various dispersive and dissipative processes present in the plasma. This is an extension of the simulation tool employed in previous work by Sankaran et al. The aim is to compare the computational results with various proposed magnetic nozzle detachment theories to develop an understanding of the physical mechanisms that cause detachment. An applied magnetic field topology is obtained using a magnetostatic field solver, and this field is superimposed on the time-dependent magnetic field induced in the plasma to provide a self-consistent field description. The applied magnetic field and model geometry match those found in experiments by Kuriki and Okada. This geometry is modeled because there is a substantial amount of experimental data that can be compared to the computational results, allowing for validation of the model. In addition, comparison of the simulation results with the experimentally obtained plasma parameters will provide insight into the mechanisms that lead to plasma detachment, revealing how they scale with different input parameters. Further studies will focus on modeling literature experiments both for the purpose of additional code validation and to extract physical insight regarding the mechanisms driving detachment.

  4. A corrected formulation for marginal inference derived from two-part mixed models for longitudinal semi-continuous data.

    PubMed

    Tom, Brian Dm; Su, Li; Farewell, Vernon T

    2016-10-01

    For semi-continuous data which are a mixture of true zeros and continuously distributed positive values, the use of two-part mixed models provides a convenient modelling framework. However, deriving population-averaged (marginal) effects from such models is not always straightforward. Su et al. presented a model that provided convenient estimation of marginal effects for the logistic component of the two-part model but the specification of marginal effects for the continuous part of the model presented in that paper was based on an incorrect formulation. We present a corrected formulation and additionally explore the use of the two-part model for inferences on the overall marginal mean, which may be of more practical relevance in our application and more generally. © The Author(s) 2013.
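
    For intuition, a simplified random-effects-free version of the overall marginal mean combines the two parts as follows (assuming, for illustration, a log-normal model for the positive values; the paper's actual specification may differ):

    ```latex
    \mathrm{E}[Y_{ij}]
      = \Pr(Y_{ij} > 0)\,\mathrm{E}[Y_{ij} \mid Y_{ij} > 0]
      = \pi_{ij}\,\exp\!\left(\mu_{ij} + \tfrac{1}{2}\sigma^{2}\right),
    ```

    where \(\pi_{ij}\) comes from the logistic part and \(\mu_{ij}\), \(\sigma^2\) are the mean and variance of the positive part on the log scale. With random effects, the expectation must additionally be integrated over their distribution, which is the step where marginal formulations become delicate.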

  5. A Survey of Statistical Models for Reverse Engineering Gene Regulatory Networks

    PubMed Central

    Huang, Yufei; Tienda-Luna, Isabel M.; Wang, Yufeng

    2009-01-01

    Statistical models for reverse engineering gene regulatory networks are surveyed in this article. To provide readers with a system-level view of the modeling issues in this research, a graphical modeling framework is proposed. This framework serves as the scaffolding on which the review of different models can be systematically assembled. Based on the framework, we review many existing models for many aspects of gene regulation; the pros and cons of each model are discussed. In addition, network inference algorithms are also surveyed under the graphical modeling framework by the categories of point solutions and probabilistic solutions and the connections and differences among the algorithms are provided. This survey has the potential to elucidate the development and future of reverse engineering GRNs and bring statistical signal processing closer to the core of this research. PMID:20046885

  6. First- and Second-Line Bevacizumab in Addition to Chemotherapy for Metastatic Colorectal Cancer: A United States–Based Cost-Effectiveness Analysis

    PubMed Central

    Goldstein, Daniel A.; Chen, Qiushi; Ayer, Turgay; Howard, David H.; Lipscomb, Joseph; El-Rayes, Bassel F.; Flowers, Christopher R.

    2015-01-01

    Purpose The addition of bevacizumab to fluorouracil-based chemotherapy is a standard of care for previously untreated metastatic colorectal cancer. Continuation of bevacizumab beyond progression is an accepted standard of care based on a 1.4-month increase in median overall survival observed in a randomized trial. No United States–based cost-effectiveness modeling analyses are currently available addressing the use of bevacizumab in metastatic colorectal cancer. Our objective was to determine the cost effectiveness of bevacizumab in the first-line setting and when continued beyond progression from the perspective of US payers. Methods We developed two Markov models to compare the cost and effectiveness of fluorouracil, leucovorin, and oxaliplatin with or without bevacizumab in the first-line treatment and subsequent fluorouracil, leucovorin, and irinotecan with or without bevacizumab in the second-line treatment of metastatic colorectal cancer. Model robustness was addressed by univariable and probabilistic sensitivity analyses. Health outcomes were measured in life-years and quality-adjusted life-years (QALYs). Results Using bevacizumab in first-line therapy provided an additional 0.10 QALYs (0.14 life-years) at a cost of $59,361. The incremental cost-effectiveness ratio was $571,240 per QALY. Continuing bevacizumab beyond progression provided an additional 0.11 QALYs (0.16 life-years) at a cost of $39,209. The incremental cost-effectiveness ratio was $364,083 per QALY. In univariable sensitivity analyses, the variables with the greatest influence on the incremental cost-effectiveness ratio were bevacizumab cost, overall survival, and utility. Conclusion Bevacizumab provides minimal incremental benefit at high incremental cost per QALY in both the first- and second-line settings of metastatic colorectal cancer treatment. PMID:25691669
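
    As a quick arithmetic check on the reported ratios: the QALY gains are rounded to 0.10 and 0.11 in the abstract, which is why naive division does not exactly reproduce the published ICERs.

    ```python
    def icer(delta_cost, delta_qaly):
        """Incremental cost-effectiveness ratio in dollars per QALY."""
        return delta_cost / delta_qaly

    print(f"first line:  ${icer(59_361, 0.10):,.0f}/QALY (published: $571,240)")
    print(f"second line: ${icer(39_209, 0.11):,.0f}/QALY (published: $364,083)")
    ```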

  7. A Comprehensive, Model-Based Review of Vaccine and Repeat Infection Trials for Filariasis

    PubMed Central

    Morris, C. Paul; Evans, Holly; Larsen, Sasha E.

    2013-01-01

    SUMMARY Filarial worms cause highly morbid diseases such as elephantiasis and river blindness. Since the 1940s, researchers have conducted vaccine trials in 27 different animal models of filariasis. Although no vaccine trial in a permissive model of filariasis has provided sterilizing immunity, great strides have been made toward developing vaccines that could block transmission, decrease pathological sequelae, or decrease susceptibility to infection. In this review, we have organized, to the best of our ability, all published filaria vaccine trials and reviewed them in the context of the animal models used. Additionally, we provide information on the life cycle, disease phenotype, concomitant immunity, and natural immunity during primary and secondary infections for 24 different filaria models. PMID:23824365

  8. Wyoming greater sage-grouse habitat prioritization: A collection of multi-scale seasonal models and geographic information systems land management tools

    USGS Publications Warehouse

    O'Donnell, Michael S.; Aldridge, Cameron L.; Doherty, Kevin E.; Fedy, Bradley C.

    2015-01-01

    We deliver all products described herein as online geographic information system data for visualization and downloading. We outline the data properties for each model and their data inputs, describe the process of selecting appropriate data products for multifarious applications, describe all data products and software, provide newly derived model composites, and discuss how land managers may use the models to inform future sage-grouse studies and potentially refine conservation efforts. The models, software tools, and associated opportunities for novel applications of these products should provide a suite of additional, but not exclusive, tools for assessing Wyoming Greater Sage-grouse habitats, which land managers, conservationists, and scientists can apply to myriad applications.

  9. Simulation of Attacks for Security in Wireless Sensor Network.

    PubMed

    Diaz, Alvaro; Sanchez, Pablo

    2016-11-18

    The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node's software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work.
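
    A toy sketch of the kind of estimate such a simulator produces; the energy costs and the retransmission-based attack effect below are invented for illustration, not taken from the paper's virtual platform.

    ```python
    # Hypothetical node energy budget under a jamming-style attack that forces
    # retransmissions (all per-packet energy costs are assumed values).
    def node_energy_mj(n_packets, retransmit_prob, tx_mj=0.05, rx_mj=0.02, cpu_mj=0.01):
        expected_tx = n_packets / (1.0 - retransmit_prob)  # geometric retries
        return expected_tx * tx_mj + n_packets * (rx_mj + cpu_mj)

    baseline = node_energy_mj(1000, retransmit_prob=0.02)
    attacked = node_energy_mj(1000, retransmit_prob=0.40)  # attacker-induced losses
    print(f"energy overhead under attack: {attacked / baseline - 1:.0%}")
    ```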

  10. Morphogenesis in bat wings: linking development, evolution and ecology.

    PubMed

    Adams, Rick A

    2008-01-01

    The evolution of powered flight in mammals required specific developmental shifts from an ancestral limb morphology to one adapted for flight. Through studies of comparative morphogenesis, investigators have quantified points and rates of divergence, providing important insights into how wings evolved in mammals. Herein I compare growth, development and skeletogenesis of forelimbs between bats and the more ancestral state provided by the rat (Rattus norvegicus) and quantify growth trajectories that illustrate morphological divergence both developmentally and evolutionarily. In addition, I discuss how wing shape is controlled during morphogenesis by applying multivariate analyses of wing bones and wing membranes, and discuss how flight dynamics are stabilized during flight ontogeny. Further, I discuss the development of flight in bats in relation to the ontogenetic niche and how juveniles affect population foraging patterns. In addition, I provide a hypothetical ontogenetic landscape model that predicts how and when selection is most intense during juvenile morphogenesis and test this model with data from a population of the little brown bat, Myotis lucifugus. (c) 2007 S. Karger AG, Basel

  11. Impact of species delimitation and sampling on niche models and phylogeographical inference: A case study of the East African reed frog Hyperolius substriatus Ahl, 1931.

    PubMed

    Bittencourt-Silva, Gabriela B; Lawson, Lucinda P; Tolley, Krystal A; Portik, Daniel M; Barratt, Christopher D; Nagel, Peter; Loader, Simon P

    2017-09-01

    Ecological niche models (ENMs) have been used in a wide range of ecological and evolutionary studies. In biogeographic studies these models have, among other things, helped in the discovery of new allopatric populations, and even new species. However, small sample sizes and questionable taxonomic delimitation can challenge models, often decreasing their accuracy. Herein we examine the sensitivity of ENMs to the addition of new, geographically isolated populations, and the impact of applying different taxonomic delimitations. The East African reed frog Hyperolius substriatus Ahl, 1931 was selected as a case study because it has been the subject of previous ENM predictions. Our results suggest that addition of new data and reanalysis of species lineages of H. substriatus improved our understanding of the evolutionary history of this group of frogs. ENMs provided robust predictions, even when some populations were deliberately excluded from the models. Splitting the lineages based on genetic relationships and analysing the ENMs separately provided insights about the biogeographical processes that led to the current distribution of H. substriatus. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Simulating air temperature in an urban street canyon in all weather conditions using measured data at a reference meteorological station

    NASA Astrophysics Data System (ADS)

    Erell, E.; Williamson, T.

    2006-10-01

    A model is proposed that adapts data from a standard meteorological station to provide realistic site-specific air temperature in a city street exposed to the same meso-scale environment. In addition to a rudimentary description of the two sites, the canyon air temperature (CAT) model requires only inputs measured at standard weather stations; yet it is capable of accurately predicting the evolution of air temperature in all weather conditions for extended periods. It simulates the effect of urban geometry on radiant exchange; the effect of moisture availability on latent heat flux; energy stored in the ground and in building surfaces; air flow in the street based on wind above roof height; and the sensible heat flux from individual surfaces and from the street canyon as a whole. The CAT model has been tested on field data measured in a monitoring program carried out in Adelaide, Australia, in 2000-2001. After calibrating the model, predicted air temperature correlated well with measured data in all weather conditions over extended periods. The experimental validation provides additional evidence in support of a number of parameterisation schemes incorporated in the model to account for sensible heat and storage flux.

  13. Investigating melting induced mantle heterogeneities in plate driven mantle convection models

    NASA Astrophysics Data System (ADS)

    Price, M.; Davies, H.; Panton, J.

    2017-12-01

    Observations from geochemistry and seismology continue to suggest a range of complex heterogeneity in Earth's mantle. In the deep mantle, two large low velocity provinces (LLVPs) have been regularly observed in seismic studies, with their longevity, composition and density compared to the surrounding mantle debated. The cause of these observed LLVPs is equally uncertain, with previous studies advocating either thermal or thermo-chemical causes. There is also evidence that these structures could provide chemically distinct reservoirs within the mantle, with recent studies also suggesting there may be additional reservoirs in the mantle, such as bridgmanite-enriched ancient mantle structures (BEAMS). One way to test these hypotheses is using computational models of the mantle, with models that capture the full 3D system being both complex and computationally expensive. Here we present results from our global mantle model TERRA. Using our model, we can track compositional variations in the convecting mantle that are generated by self-consistent, evolving melting zones. Alongside the melting, we track trace elements and other volatiles which can be partitioned during melting events, and expelled and recycled at the surface. Utilising plate reconstruction models as a boundary condition, the models generate the tectonic features observed at Earth's surface, while also organising the lower mantle into recognisable degree-two structures. This results in our models generating basaltic 'oceanic' crusts which are then brought into the mantle at tectonic boundaries, providing additional chemical heterogeneity in the mantle volume. Finally, by utilising thermodynamic lookup tables to convert the final outputs from the model to seismic structures, together with resolution filters for global tomography models, we are able to make direct comparisons between our results and observations. By varying the parameters of the model, we investigate a range of current hypotheses for heterogeneity in the mantle. Our work attempts to reconcile the many proposed current ideas for the deep mantle, giving additional insight from modelling on the latest observations from other Deep Earth disciplines.

  14. Glioblastoma, a brief review of history, molecular genetics, animal models and novel therapeutic strategies.

    PubMed

    Agnihotri, Sameer; Burrell, Kelly E; Wolf, Amparo; Jalali, Sharzhad; Hawkins, Cynthia; Rutka, James T; Zadeh, Gelareh

    2013-02-01

    Glioblastoma (GBM) is the most common and lethal primary brain tumor. Over the past few years, tremendous genomic and proteomic characterization, along with robust animal models of GBM, has provided invaluable data showing that GBMs, although histologically indistinguishable from one another, comprise a group of molecularly heterogeneous diseases. In addition, robust pre-clinical models and a better understanding of the core pathways disrupted in GBM are providing renewed optimism for novel strategies targeting these devastating tumors. Here, we summarize a brief history of the disease, our current molecular knowledge, lessons from animal models, and emerging concepts of angiogenesis, invasion, and metabolism in GBM that may lend themselves to therapeutic targeting.

  15. Analysis of aeromedical retrieval coverage using elliptical isochrones: An evaluation of helicopter fleet size configurations in Scotland.

    PubMed

    Dodds, Naomi; Emerson, Philip; Phillips, Stephanie; Green, David R; Jansen, Jan O

    2017-03-01

    Trauma systems in remote and rural regions often rely on helicopter emergency medical services to facilitate access to definitive care. The siting of such resources is key, but often relies on simplistic modeling of coverage, using circular isochrones. Scotland is in the process of implementing a national trauma network, and there have been calls for an expansion of aeromedical retrieval capacity. The aim of this study was to analyze population and area coverage of the current retrieval service configuration, with three aircraft, and a configuration with an additional helicopter, in the North East of Scotland, using a novel methodology. Both overall coverage and coverage by physician-staffed aircraft, with enhanced clinical capability, were analyzed. This was a geographical analysis based on calculation of elliptical isochrones, which consider the "open-jaw" configuration of many retrieval flights. Helicopters are not always based at hospitals. We modeled coverage based on different outbound and inbound flights. Areally referenced population data were obtained from the Scottish Government. The current helicopter network configuration provides 94.2% population coverage and 59.0% area coverage. The addition of a fourth helicopter would marginally increase population coverage to 94.4% and area coverage to 59.1%. However, when considering only physician-manned aircraft, the current configuration provides only 71.7% population coverage and 29.4% area coverage, which would be increased to 91.1% and 51.2%, respectively, with a second aircraft. Scotland's current helicopter network configuration provides good population coverage for retrievals to major trauma centers, which would only be increased minimally by the addition of a fourth aircraft in the North East. The coverage provided by the single physician-staffed aircraft is more limited, however, and would be increased considerably by a second physician-staffed aircraft in the North East. Elliptical isochrones provide a useful means of modeling "open-jaw" retrieval missions and provide a more realistic estimate of coverage. Epidemiological study, level IV; therapeutic study, level IV.
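
    The elliptical-isochrone test reduces to a two-focus distance check: a point is coverable on an "open-jaw" mission if the outbound leg from the helicopter base plus the inbound leg to the receiving hospital fits within the aircraft's range. Coordinates and endurance in the sketch below are illustrative.

    ```python
    import math

    def covered(point, base, hospital, max_range_km):
        """Point lies inside the ellipse with the base and hospital as foci."""
        d = math.dist(base, point) + math.dist(point, hospital)
        return d <= max_range_km

    base, hospital = (0.0, 0.0), (120.0, 0.0)
    print(covered((60.0, 80.0), base, hospital, max_range_km=250.0))  # True
    ```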

  16. A systematic review of advance practice providers in acute care: options for a new model in a burn intensive care unit.

    PubMed

    Edkins, Renee E; Cairns, Bruce A; Hultman, C Scott

    2014-03-01

    Accreditation Council for Graduate Medical Education-mandated work-hour restrictions have negatively impacted many areas of clinical care, including management of burn patients, who require intensive monitoring, resuscitation, and procedural interventions. As surgery residents become less available to meet service needs, new models integrating advanced practice providers (APPs) into the burn team must emerge. We performed a systematic review of APPs in critical care, asking how best to use all providers to solve these workforce challenges. We searched PubMed, CINAHL, Ovid, and Google Scholar, from 2002 to 2012, using the key words: nurse practitioner, physician assistant, critical care, and burn care. After applying inclusion/exclusion criteria, 18 relevant articles were selected for review. In addition, throughput and financial models were developed to examine provider staffing patterns. Advanced practice providers in critical care settings function in various models, both with and without residents, reporting to either an intensivist or an attending physician. When APPs participated, patient outcomes were similar or improved compared across provider models. Several studies reported considerable cost savings due to decreased length of stay, decreased ventilator days, and fewer urinary tract infections when nurse practitioners were included in the provider mix. Restrictions in resident work-hours and changing health care environments require that new provider models be created for acute burn care. This article reviews current utilization of APPs in critical care units and proposes a new provider model for burn centers.

  17. Precise Modelling of Telluric Features in Astronomical Spectra

    NASA Astrophysics Data System (ADS)

    Seifahrt, A.; Käufl, H. U.; Zängl, G.; Bean, J.; Richter, M.; Siebenmorgen, R.

    2010-12-01

    Ground-based astronomical observations suffer from the disturbing effects of the Earth's atmosphere. Oxygen, water vapour and a number of atmospheric trace gases absorb and emit light at discrete frequencies, shaping observing bands in the near- and mid-infrared and leaving their fingerprints - telluric absorption and emission lines - in astronomical spectra. The standard approach of removing the absorption lines is to observe a telluric standard star: a time-consuming and often imperfect solution. Alternatively, the spectral features of the Earth's atmosphere can be modelled using a radiative transfer code, often delivering a satisfying solution that removes these features without additional observations. In addition the model also provides a precise wavelength solution and an instrumental profile.

  18. Dust Around Herbig Ae Stars: Additional Constraints from their Photometric and Polarimetric Variability

    NASA Technical Reports Server (NTRS)

    Krivova, N. A.; Ilin, V. B.; Fischer, O.

    1996-01-01

    For the Herbig Ae stars with Algol-like minima (UX Ori, WW Vul, etc.), the effects of circumstellar dust include excess infrared emission, anomalous ultraviolet extinction, and a 'blueing' of the stars in minima accompanied by an increase in intrinsic polarization. Using a Monte-Carlo code for polarized radiation transfer, we have simulated these effects and compared the results obtained for different models with the available observational data. We found that the photometric and polarimetric behavior of the stars provides essential additional constraints on the circumstellar dust models. Models with spheroidal shell geometry and compact (non-fluffy) dust grains do not appear able to explain all the data.

  19. Thermal Analysis of Step 2 GPHS for Next Generation Radioisotope Power Source Missions

    NASA Astrophysics Data System (ADS)

    Pantano, David R.; Hill, Dennis H.

    2005-02-01

    The Step 2 General Purpose Heat Source (GPHS) is a slightly larger and more robust version of the heritage GPHS modules flown on previous Radioisotope Thermoelectric Generator (RTG) missions like Galileo, Ulysses, and Cassini. The Step 2 GPHS is to be used in future small radioisotope power sources, such as the Stirling Radioisotope Generator (SRG110) and the Multi-Mission Radioisotope Thermoelectric Generator (MMRTG). New features include an additional central web of Fine Weave Pierced Fabric (FWPF) graphite in the aeroshell between the two Graphite Impact Shells (GIS) to improve accidental reentry and impact survivability and an additional 0.1-inch of thickness to the aeroshell broad faces to improve ablation protection. This paper details the creation of the thermal model using Thermal Desktop and AutoCAD interfaces and provides comparisons of the model to results of previous thermal analysis models of the heritage GPHS. The results of the analysis show an anticipated decrease in total thermal gradient from the aeroshell to the iridium clads compared to the heritage results. In addition, the Step 2 thermal model is investigated under typical SRG110 boundary conditions, with cover gas and gravity environments included where applicable, to provide preliminary guidance for design of the generator. Results show that the temperatures of the components inside the GPHS remain within accepted design limits during all envisioned mission phases.

  20. seawaveQ: an R package providing a model and utilities for analyzing trends in chemical concentrations in streams with a seasonal wave (seawave) and adjustment for streamflow (Q) and other ancillary variables

    USGS Publications Warehouse

    Ryberg, Karen R.; Vecchia, Aldo V.

    2013-01-01

    The seawaveQ R package fits a parametric regression model (seawaveQ) to pesticide concentration data from streamwater samples to assess variability and trends. The model incorporates the strong seasonality and high degree of censoring common in pesticide data and users can incorporate numerous ancillary variables, such as streamflow anomalies. The model is fitted to pesticide data using maximum likelihood methods for censored data and is robust in terms of pesticide, stream location, and degree of censoring of the concentration data. This R package standardizes this methodology for trend analysis, documents the code, and provides help and tutorial information, as well as providing additional utility functions for plotting pesticide and other chemical concentration data.
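
    A stripped-down analogue of this model class (not the seawaveQ code itself): censored maximum likelihood for log-concentration with a seasonal wave and a flow-anomaly term, where values under the reporting limit contribute through the normal CDF. All variable names and rates are invented.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def negloglik(params, t, q, y, censored):
        b0, b1, b2, b3, log_s = params
        mu = b0 + b1*np.sin(2*np.pi*t) + b2*np.cos(2*np.pi*t) + b3*q
        s = np.exp(log_s)
        ll = np.where(censored,
                      norm.logcdf((y - mu) / s),   # below reporting limit
                      norm.logpdf(y, mu, s))       # quantified value
        return -ll.sum()

    rng = np.random.default_rng(0)
    t = rng.uniform(0, 5, 200); q = rng.normal(size=200)
    y_true = (-1.0 + 0.8*np.sin(2*np.pi*t) + 0.3*np.cos(2*np.pi*t)
              + 0.5*q + rng.normal(0, 0.4, 200))
    rl = -1.2                                  # reporting limit on the log scale
    censored = y_true < rl
    y = np.where(censored, rl, y_true)
    fit = minimize(negloglik, x0=np.zeros(5), args=(t, q, y, censored))
    print(fit.x[:4].round(2))                  # recovered regression coefficients
    ```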

  1. Method of sound synthesis

    DOEpatents

    Miner, Nadine E.; Caudell, Thomas P.

    2004-06-08

    A sound synthesis method for modeling and synthesizing dynamic, parameterized sounds. The sound synthesis method yields perceptually convincing sounds and provides flexibility through model parameterization. By manipulating model parameters, a variety of related, but perceptually different sounds can be generated. The result is subtle changes in sounds, in addition to synthesis of a variety of sounds, all from a small set of models. The sound models can change dynamically according to changes in the simulation environment. The method is applicable to both stochastic (impulse-based) and non-stochastic (pitched) sounds.

  2. Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model

    NASA Technical Reports Server (NTRS)

    Vallejo, Jonathon; Hejduk, Matt; Stamey, James

    2015-01-01

    We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log10 transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.
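
    A sketch of the zero-inflated Beta log-likelihood for the scaled probabilities; pi is the probability of an exact zero, and a, b are Beta shape parameters (all values invented).

    ```python
    import numpy as np
    from scipy.stats import beta

    def zib_loglik(y, pi, a, b):
        """Zero-inflated Beta: point mass pi at 0, Beta(a, b) otherwise."""
        y = np.asarray(y)
        zero = (y == 0)
        y_pos = np.clip(y, 1e-12, 1 - 1e-12)   # guard the Beta density at 0
        ll = np.where(zero,
                      np.log(pi),
                      np.log1p(-pi) + beta.logpdf(y_pos, a, b))
        return ll.sum()

    y = np.array([0.0, 0.0, 0.12, 0.45, 0.03, 0.78])
    print(zib_loglik(y, pi=0.3, a=0.8, b=2.5))
    ```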

  3. Modeling of optical quadrature microscopy for imaging mouse embryos

    NASA Astrophysics Data System (ADS)

    Warger, William C., II; DiMarzio, Charles A.

    2008-02-01

    Optical quadrature microscopy (OQM) has been shown to provide the optical path difference through a mouse embryo, and has led to a novel method to count the total number of cells further into development than current non-toxic imaging techniques used in the clinic. The cell counting method has the potential to provide an additional quantitative viability marker for blastocyst transfer during in vitro fertilization. OQM uses a 633 nm laser within a modified Mach-Zehnder interferometer configuration to measure the amplitude and phase of the signal beam that travels through the embryo. Four cameras preceded by multiple beamsplitters record the four interferograms that are used within a reconstruction algorithm to produce an image of the complex electric field amplitude. Here we present a model for the electric field through the primary optical components in the imaging configuration and the reconstruction algorithm to calculate the signal to noise ratio when imaging mouse embryos. The model includes magnitude and phase errors in the individual reference and sample paths, fixed pattern noise, and noise within the laser and detectors. This analysis provides the foundation for determining the imaging limitations of OQM and the basis to optimize the cell counting method in order to introduce additional quantitative viability markers.
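
    The four-camera reconstruction is, in essence, four-step phase-shifting interferometry. The sketch below assumes ideal 0/90/180/270-degree reference phases and no noise, which is precisely the idealization the paper's model then perturbs with amplitude/phase errors and detector noise.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    E_s = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))  # unknown field
    E_r = 2.0                                                     # reference amplitude

    # Four interferograms with quadrature phase shifts of the reference beam.
    I = [np.abs(E_r * np.exp(1j * k * np.pi / 2) + E_s) ** 2 for k in range(4)]

    # (I0 - I180) + i(I90 - I270) = 4 * E_r * E_s for a real reference.
    E_rec = ((I[0] - I[2]) + 1j * (I[1] - I[3])) / (4 * E_r)
    print(np.allclose(E_rec, E_s))   # True
    ```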

  4. End-to-end modeling as part of an integrated research program in the Bering Sea

    NASA Astrophysics Data System (ADS)

    Punt, André E.; Ortiz, Ivonne; Aydin, Kerim Y.; Hunt, George L.; Wiese, Francis K.

    2016-12-01

    Traditionally, the advice provided to fishery managers has focused on the trade-offs between short- and long-term yields, and between future resource size and expected future catches. The harvest control rules that are used to provide management advice consequently relate catches to stock biomass levels expressed relative to reference biomass levels. There are, however, additional trade-offs. Ecosystem-based fisheries management (EBFM) aims to consider fish and fisheries in their ecological context, taking into account physical, biological, economic, and social factors. However, making EBFM operational remains challenging. It is generally recognized that end-to-end modeling should be a key part of implementing EBFM, along with harvest control rules that use information in addition to estimates of stock biomass to provide recommendations for management actions. Here we outline the process for selecting among alternative management strategies in an ecosystem context and summarize a Field-integrated End-To-End modeling program, or FETE, intended to implement this process as part of the Bering Sea Project. A key aspect of this project was that, from the start, the FETE included a management strategy evaluation component to compare management strategies. Effective use of end-to-end modeling requires that the models developed for a system are indeed integrated across climate drivers, lower trophic levels, fish population dynamics, and fisheries and their management. We summarize the steps taken by the program managers to promote integration of modeling efforts by multiple investigators and highlight the lessons learned during the project that can be used to guide future use and design of end-to-end models.
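
    For concreteness, here is a conventional "hockey-stick" harvest control rule of the sort such evaluations compare; the reference points are illustrative, not those used in the Bering Sea Project.

    ```python
    def harvest_rate(biomass, b_target=1.0, b_limit=0.2, f_max=0.3):
        """Fishing mortality as a function of biomass relative to target."""
        if biomass >= b_target:
            return f_max
        if biomass <= b_limit:
            return 0.0
        return f_max * (biomass - b_limit) / (b_target - b_limit)

    for b in (1.2, 0.6, 0.1):
        print(b, round(harvest_rate(b), 3))
    ```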

  5. The European ALMA Regional Centre: a model of user support

    NASA Astrophysics Data System (ADS)

    Andreani, P.; Stoehr, F.; Zwaan, M.; Hatziminaoglou, E.; Biggs, A.; Diaz-Trigo, M.; Humphreys, E.; Petry, D.; Randall, S.; Stanke, T.; van Kampen, E.; Bárta, M.; Brand, J.; Gueth, F.; Hogerheijde, M.; Bertoldi, F.; Muxlow, T.; Richards, A.; Vlemmings, W.

    2014-08-01

    The ALMA Regional Centres (ARCs) form the interface between the ALMA observatory and the user community from the proposal preparation stage to the delivery of data and their subsequent analysis. The ARCs provide critical services to both the ALMA operations in Chile and to the user community. These services were split by the ALMA project into core and additional services. The core services are financed by the ALMA operations budget and are critical to the successful operation of ALMA. They are contractual obligations and must be delivered to the ALMA project. The additional services are not funded by the ALMA project and are not contractual obligations, but are critical to achieve ALMA full scientific potential. A distributed network of ARC nodes (with ESO being the central ARC) has been set up throughout Europe at the following seven locations: Bologna, Bonn-Cologne, Grenoble, Leiden, Manchester, Ondrejov, Onsala. These ARC nodes are working together with the central node at ESO and provide both core and additional services to the ALMA user community. This paper presents the European ARC, and how it operates in Europe to support the ALMA community. This model, although complex in nature, is turning into a very successful one, providing a service to the scientific community that has been so far highly appreciated. The ARC could become a reference support model in an age where very large collaborations are required to build large facilities, and support is needed for geographically and culturally diverse communities.

  6. LDEF data correlation to existing NASA debris environment models

    NASA Technical Reports Server (NTRS)

    Atkinson, Dale R.; Allbrooks, Martha K.; Watts, Alan J.

    1992-01-01

    The Long Duration Exposure Facility (LDEF) was recovered in January 1990, following 5.75 years of exposure of about 130 sq. m to low-Earth orbit. About 25 sq. m of this surface area was aluminum 6061-T6 exposed in every direction. In addition, about 17 sq. m of Scheldahl G411500 silver-Teflon thermal control blankets were exposed in 9 of the 12 directions. Since the LDEF was gravity-gradient stabilized and did not rotate, the directional dependence of the flux can be easily distinguished. During the post-flight disassembly of the LDEF, all impact features larger than 0.5 mm in aluminum were documented for diameter and location. In addition, the diameters and locations of all impact features larger than 0.3 mm in the Scheldahl G411500 thermal control blankets were also documented. These data, along with additional information collected from LDEF materials, will be compared with current meteoroid and debris models. This comparison will provide a validation of the models and will identify discrepancies between the models and the data.

  7. Genomewide association study for susceptibility genes contributing to familial Parkinson disease

    PubMed Central

    Pankratz, Nathan; Wilk, Jemma B.; Latourelle, Jeanne C.; DeStefano, Anita L.; Halter, Cheryl; Pugh, Elizabeth W.; Doheny, Kimberly F.; Gusella, James F.; Nichols, William C.

    2009-01-01

    Five genes have been identified that contribute to Mendelian forms of Parkinson disease (PD); however, mutations have been found in fewer than 5% of patients, suggesting that additional genes contribute to disease risk. Unlike previous studies that focused primarily on sporadic PD, we have performed the first genomewide association study (GWAS) in familial PD. Genotyping was performed with the Illumina HumanCNV370Duo array in 857 familial PD cases and 867 controls. A logistic model was employed to test for association under additive and recessive modes of inheritance after adjusting for gender and age. No result met genomewide significance based on a conservative Bonferroni correction. The strongest association result was with SNPs in the GAK/DGKQ region on chromosome 4 (additive model: p = 3.4 × 10^-6; OR = 1.69). Consistent evidence of association was also observed to the chromosomal regions containing SNCA (additive model: p = 5.5 × 10^-5; OR = 1.35) and MAPT (recessive model: p = 2.0 × 10^-5; OR = 0.56). Both of these genes have been implicated previously in PD susceptibility; however, neither was identified in previous GWAS studies of PD. Meta-analysis was performed using data from a previous case–control GWAS, and yielded improved p values for several regions, including GAK/DGKQ (additive model: p = 2.5 × 10^-7) and the MAPT region (recessive model: p = 9.8 × 10^-6; additive model: p = 4.8 × 10^-5). These data suggest the identification of new susceptibility alleles for PD in the GAK/DGKQ region, and also provide further support for the role of SNCA and MAPT in PD susceptibility. PMID:18985386
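
    The additive/recessive distinction is simply a recoding of the genotype before the logistic fit. A simulated sketch using statsmodels (effect sizes and sample size invented, not the study's data):

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    g = rng.integers(0, 3, n)               # minor-allele counts: 0, 1, or 2
    age = rng.normal(60, 10, n)
    sex = rng.integers(0, 2, n)
    logit = -2 + 0.5 * g + 0.02 * (age - 60)
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    additive = g                             # dose of the minor allele
    recessive = (g == 2).astype(int)         # homozygous carriers only

    for name, code in [("additive", additive), ("recessive", recessive)]:
        X = sm.add_constant(np.column_stack([code, sex, age]))
        fit = sm.Logit(y, X).fit(disp=0)
        print(name, "OR =", round(float(np.exp(fit.params[1])), 2))
    ```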

  8. Phaser.MRage: automated molecular replacement

    PubMed Central

    Bunkóczi, Gábor; Echols, Nathaniel; McCoy, Airlie J.; Oeffner, Robert D.; Adams, Paul D.; Read, Randy J.

    2013-01-01

    Phaser.MRage is a molecular-replacement automation framework that implements a full model-generation workflow and provides several layers of model exploration to the user. It is designed to handle a large number of models and can distribute calculations efficiently onto parallel hardware. In addition, phaser.MRage can identify correct solutions and use this information to accelerate the search. Firstly, it can quickly score all alternative models of a component once a correct solution has been found. Secondly, it can perform extensive analysis of identified solutions to find protein assemblies and can employ assembled models for subsequent searches. Thirdly, it is able to use a priori assembly information (derived from, for example, homologues) to speculatively place and score molecules, thereby customizing the search procedure to a certain class of protein molecule (for example, antibodies) and incorporating additional biological information into molecular replacement. PMID:24189240

  9. Phaser.MRage: automated molecular replacement.

    PubMed

    Bunkóczi, Gábor; Echols, Nathaniel; McCoy, Airlie J; Oeffner, Robert D; Adams, Paul D; Read, Randy J

    2013-11-01

    Phaser.MRage is a molecular-replacement automation framework that implements a full model-generation workflow and provides several layers of model exploration to the user. It is designed to handle a large number of models and can distribute calculations efficiently onto parallel hardware. In addition, phaser.MRage can identify correct solutions and use this information to accelerate the search. Firstly, it can quickly score all alternative models of a component once a correct solution has been found. Secondly, it can perform extensive analysis of identified solutions to find protein assemblies and can employ assembled models for subsequent searches. Thirdly, it is able to use a priori assembly information (derived from, for example, homologues) to speculatively place and score molecules, thereby customizing the search procedure to a certain class of protein molecule (for example, antibodies) and incorporating additional biological information into molecular replacement.

  10. Using Petri nets for experimental design in a multi-organ elimination pathway.

    PubMed

    Reshetova, Polina; Smilde, Age K; Westerhuis, Johan A; van Kampen, Antoine H C

    2015-08-01

    Genistein is a soy metabolite with estrogenic activity that may result in (un)favorable effects on human health. Elucidation of the mechanisms through which food additives such as genistein exert their beneficiary effects is a major challenge for the food industry. A better understanding of the genistein elimination pathway could shed light on such mechanisms. We developed a Petri net model that represents this multi-organ elimination pathway and which assists in the design of future experiments. Using this model we show that metabolic profiles solely measured in venous blood are not sufficient to uniquely parameterize the model. Based on simulations we suggest two solutions that provide better results: parameterize the model using gut epithelium profiles or add additional biological constrains in the model. Copyright © 2015 Elsevier Ltd. All rights reserved.
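
    A minimal Petri net executor conveys the formalism; the places and transitions below form a toy gut-to-blood chain, not the published genistein network.

    ```python
    import random

    # Places hold tokens; a transition may fire when all of its input places
    # are sufficiently marked, moving a metabolite between compartments.
    marking = {"gut": 5, "liver": 0, "blood": 0}
    transitions = [
        {"in": {"gut": 1}, "out": {"liver": 1}},    # absorption
        {"in": {"liver": 1}, "out": {"blood": 1}},  # release into circulation
    ]

    def enabled(t, m):
        return all(m[p] >= k for p, k in t["in"].items())

    def fire(t, m):
        for p, k in t["in"].items():
            m[p] -= k
        for p, k in t["out"].items():
            m[p] += k

    random.seed(1)
    while any(enabled(t, marking) for t in transitions):
        fire(random.choice([t for t in transitions if enabled(t, marking)]), marking)
    print(marking)   # all five tokens end in "blood"
    ```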

  11. Dengue forecasting in São Paulo city with generalized additive models, artificial neural networks and seasonal autoregressive integrated moving average models.

    PubMed

    Baquero, Oswaldo Santos; Santana, Lidia Maria Reis; Chiaravalloti-Neto, Francisco

    2018-01-01

    Globally, the number of dengue cases has been on the increase since 1990, and this trend has also been found in Brazil and its most populated city, São Paulo. Surveillance systems based on predictions allow for timely decision-making processes and, in turn, timely and efficient interventions to reduce the burden of the disease. We conducted a comparative study of dengue predictions in São Paulo city to test the performance of trained seasonal autoregressive integrated moving average models, generalized additive models and artificial neural networks. We also used a naïve model as a benchmark. A generalized additive model with lags of the number of cases and meteorological variables had the best performance, predicted epidemics of unprecedented magnitude, and its performance was 3.16 times higher than the benchmark and 1.47 times higher than the next best performing model. The predictive models captured the seasonal patterns but differed in their capacity to anticipate large epidemics, and all outperformed the benchmark. In addition to being able to predict epidemics of unprecedented magnitude, the best model had computational advantages, since its training and tuning were straightforward and required seconds or at most a few minutes. These are desired characteristics for providing timely results to decision makers. However, it should be noted that predictions are made just one month ahead, and this is a limitation that future studies could try to reduce.
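
    A sketch of the winning model family: a count regression with a smooth (spline) term in a lagged meteorological variable plus an autoregressive case term. Here it is approximated with statsmodels and patsy's bs() on simulated data; the lags and covariates are illustrative, not the paper's specification.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 120                                   # months of simulated surveillance
    temp = 25 + 5 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 1, n)
    cases = rng.poisson(np.exp(2 + 0.08 * temp))
    df = pd.DataFrame({"cases": cases, "temp": temp})
    df["cases_lag1"] = df["cases"].shift(1)   # autoregressive term
    df["temp_lag2"] = df["temp"].shift(2)     # delayed climate effect
    df = df.dropna()

    fit = smf.glm("cases ~ bs(temp_lag2, df=4) + np.log1p(cases_lag1)",
                  data=df, family=sm.families.Poisson()).fit()
    print(round(fit.aic, 1))
    ```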

  12. Strengthen forensic entomology in court--the need for data exploration and the validation of a generalised additive mixed model.

    PubMed

    Baqué, Michèle; Amendt, Jens

    2013-01-01

    Developmental data of juvenile blow flies (Diptera: Calliphoridae) are typically used to calculate the age of immature stages found on or around a corpse and thus to estimate a minimum post-mortem interval (PMI(min)). However, many of those data sets do not take into account that immature blow flies grow in a non-linear fashion. Linear models do not supply sufficient reliability for age estimates and may even lead to an erroneous determination of the PMI(min). In line with the Daubert standard and the need for improvements in forensic science, new statistical tools like smoothing methods and mixed models allow the modelling of non-linear relationships and expand the field of statistical analyses. The present study introduces the background and application of these statistical techniques by analysing a model which describes the development of the forensically important blow fly Calliphora vicina at different temperatures. The comparison of three statistical methods (linear regression, generalised additive modelling and generalised additive mixed modelling) clearly demonstrates that only the latter provided regression parameters that reflect the data adequately. We focus explicitly on both the exploration of the data--to assure their quality and to show the importance of checking it carefully prior to conducting the statistical tests--and the validation of the resulting models. Hence, we present a common method for evaluating and testing forensic entomological data sets by using, for the first time, generalised additive mixed models.

  13. ACCELERATED FAILURE TIME MODELS PROVIDE A USEFUL STATISTICAL FRAMEWORK FOR AGING RESEARCH

    PubMed Central

    Swindell, William R.

    2009-01-01

    Survivorship experiments play a central role in aging research and are performed to evaluate whether interventions alter the rate of aging and increase lifespan. The accelerated failure time (AFT) model is seldom used to analyze survivorship data, but offers a potentially useful statistical approach that is based upon the survival curve rather than the hazard function. In this study, AFT models were used to analyze data from 16 survivorship experiments that evaluated the effects of one or more genetic manipulations on mouse lifespan. Most genetic manipulations were found to have a multiplicative effect on survivorship that is independent of age and well-characterized by the AFT model “deceleration factor”. AFT model deceleration factors also provided a more intuitive measure of treatment effect than the hazard ratio, and were robust to departures from modeling assumptions. Age-dependent treatment effects, when present, were investigated using quantile regression modeling. These results provide an informative and quantitative summary of survivorship data associated with currently known long-lived mouse models. In addition, from the standpoint of aging research, these statistical approaches have appealing properties and provide valuable tools for the analysis of survivorship data. PMID:19007875

  14. Accelerated failure time models provide a useful statistical framework for aging research.

    PubMed

    Swindell, William R

    2009-03-01

    Survivorship experiments play a central role in aging research and are performed to evaluate whether interventions alter the rate of aging and increase lifespan. The accelerated failure time (AFT) model is seldom used to analyze survivorship data, but offers a potentially useful statistical approach that is based upon the survival curve rather than the hazard function. In this study, AFT models were used to analyze data from 16 survivorship experiments that evaluated the effects of one or more genetic manipulations on mouse lifespan. Most genetic manipulations were found to have a multiplicative effect on survivorship that is independent of age and well-characterized by the AFT model "deceleration factor". AFT model deceleration factors also provided a more intuitive measure of treatment effect than the hazard ratio, and were robust to departures from modeling assumptions. Age-dependent treatment effects, when present, were investigated using quantile regression modeling. These results provide an informative and quantitative summary of survivorship data associated with currently known long-lived mouse models. In addition, from the standpoint of aging research, these statistical approaches have appealing properties and provide valuable tools for the analysis of survivorship data.
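
    In code, the "deceleration factor" is the exponentiated AFT coefficient, i.e. the multiplicative stretch of the survival-time axis. A sketch with the lifelines package on simulated mouse lifespans (the 1.3-fold stretch and all other numbers are invented):

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import WeibullAFTFitter

    rng = np.random.default_rng(0)
    n = 200
    group = rng.integers(0, 2, n)                  # 0 = control, 1 = manipulated
    stretch = np.where(group == 1, 1.3, 1.0)       # assumed 30% life extension
    lifespan = stretch * rng.weibull(3.0, n) * 900 # days
    df = pd.DataFrame({"lifespan": lifespan, "group": group, "event": 1})

    aft = WeibullAFTFitter().fit(df, duration_col="lifespan", event_col="event")
    # exp(coef) on the time-scale parameter is the deceleration factor (~1.3).
    print(np.exp(aft.params_.loc[("lambda_", "group")]))
    ```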

  15. Quantitative validation of carbon-fiber laminate low velocity impact simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.

    Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed in conjunction and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.

  16. Quantitative validation of carbon-fiber laminate low velocity impact simulations

    DOE PAGES

    English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.

    2015-09-26

    Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed in conjunction and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.
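
    The quantitative comparison amounts to testing whether the experimental replicates are plausible draws from the simulated response distribution. A generic sketch (all numbers invented, and the paper's actual statistical method may differ):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    sim = rng.normal(11.8, 0.6, 200)   # ensemble of absorbed-energy values, J
    exp = rng.normal(12.1, 0.5, 8)     # experimental replicates, J
    print(stats.ks_2samp(sim, exp))    # distributional agreement check
    ```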

  17. Tigers on trails: occupancy modeling for cluster sampling.

    PubMed

    Hines, J E; Nichols, J D; Royle, J A; MacKenzie, D I; Gopalaswamy, A M; Kumar, N Samba; Karanth, K U

    2010-07-01

    Occupancy modeling focuses on inference about the distribution of organisms over space, using temporal or spatial replication to allow inference about the detection process. Inference based on spatial replication strictly requires that replicates be selected randomly and with replacement, but the importance of these design requirements is not well understood. This paper focuses on an increasingly popular sampling design based on spatial replicates that are not selected randomly and that are expected to exhibit Markovian dependence. We develop two new occupancy models for data collected under this sort of design, one based on an underlying Markov model for spatial dependence and the other based on a trap response model with Markovian detections. We then simulated data under the model for Markovian spatial dependence and fit the data to standard occupancy models and to the two new models. Bias of occupancy estimates was substantial for the standard models, smaller for the new trap response model, and negligible for the new spatial process model. We also fit these models to data from a large-scale tiger occupancy survey recently conducted in Karnataka State, southwestern India. In addition to providing evidence of a positive relationship between tiger occupancy and habitat, model selection statistics and estimates strongly supported the use of the model with Markovian spatial dependence. This new model provides another tool for the decomposition of the detection process, which is sometimes needed for proper estimation and which may also permit interesting biological inferences. In addition to designs employing spatial replication, we note the likely existence of temporal Markovian dependence in many designs using temporal replication. The models developed here will be useful either directly, or with minor extensions, for these designs as well. We believe that these new models represent important additions to the suite of modeling tools now available for occupancy estimation in conservation monitoring. More generally, this work represents a contribution to the topic of cluster sampling for situations in which there is a need for specific modeling (e.g., reflecting dependence) for the distribution of the variable(s) of interest among subunits.
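
    For reference, a sketch of the standard single-season occupancy likelihood that the new models extend: sites with no detections contribute a mixture of "occupied but missed" and "truly unoccupied" (simulated data, no Markovian dependence).

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def negloglik(params, Y):
        psi = 1 / (1 + np.exp(-params[0]))   # occupancy probability
        p = 1 / (1 + np.exp(-params[1]))     # per-replicate detection probability
        det = Y.sum(axis=1)
        k = Y.shape[1]
        lik = psi * p**det * (1 - p)**(k - det)
        lik = np.where(det == 0, lik + (1 - psi), lik)  # never-detected sites
        return -np.log(lik).sum()

    rng = np.random.default_rng(0)
    z = rng.random(100) < 0.6                        # true occupancy states
    Y = ((rng.random((100, 5)) < 0.3) & z[:, None]).astype(int)  # 5 replicates
    fit = minimize(negloglik, x0=[0.0, 0.0], args=(Y,))
    print(1 / (1 + np.exp(-fit.x)))                  # psi-hat, p-hat
    ```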

  18. Description of the National Hydrologic Model for use with the Precipitation-Runoff Modeling System (PRMS)

    USGS Publications Warehouse

    Regan, R. Steven; Markstrom, Steven L.; Hay, Lauren E.; Viger, Roland J.; Norton, Parker A.; Driscoll, Jessica M.; LaFontaine, Jacob H.

    2018-01-08

    This report documents several components of the U.S. Geological Survey National Hydrologic Model of the conterminous United States for use with the Precipitation-Runoff Modeling System (PRMS). It provides descriptions of the (1) National Hydrologic Model, (2) Geospatial Fabric for National Hydrologic Modeling, (3) PRMS hydrologic simulation code, (4) parameters and estimation methods used to compute spatially and temporally distributed default values as required by PRMS, (5) National Hydrologic Model Parameter Database, and (6) model extraction tool named Bandit. The National Hydrologic Model Parameter Database contains values for all PRMS parameters used in the National Hydrologic Model. The methods and national datasets used to estimate all the PRMS parameters are described. Some parameter values are derived from characteristics of topography, land cover, soils, geology, and hydrography using traditional Geographic Information System methods. Other parameters are set to long-established default values or computed as initial values. Additionally, methods (statistical, sensitivity, calibration, and algebraic) were developed to compute parameter values on the basis of a variety of nationally consistent datasets. Values in the National Hydrologic Model Parameter Database can periodically be updated on the basis of new parameter estimation methods and as additional national datasets become available. A companion ScienceBase resource provides a set of static parameter values as well as images of spatially distributed parameters associated with PRMS states and fluxes for each Hydrologic Response Unit across the conterminous United States.

  19. Antimicrobial combinations: Bliss independence and Loewe additivity derived from mechanistic multi-hit models

    PubMed Central

    Yu, Guozhi; Hozé, Nathanaël; Rolff, Jens

    2016-01-01

    Antimicrobial peptides (AMPs) and antibiotics reduce the net growth rate of bacterial populations they target. It is relevant to understand if effects of multiple antimicrobials are synergistic or antagonistic, in particular for AMP responses, because naturally occurring responses involve multiple AMPs. There are several competing proposals describing how multiple types of antimicrobials add up when applied in combination, such as Loewe additivity or Bliss independence. These additivity terms are defined ad hoc from abstract principles explaining the supposed interaction between the antimicrobials. Here, we link these ad hoc combination terms to a mathematical model that represents the dynamics of antimicrobial molecules hitting targets on bacterial cells. In this multi-hit model, bacteria are killed when a certain number of targets are hit by antimicrobials. Using this bottom-up approach reveals that Bliss independence should be the model of choice if no interaction between antimicrobial molecules is expected. Loewe additivity, on the other hand, describes scenarios in which antimicrobials affect the same components of the cell, i.e. are not acting independently. While our approach idealizes the dynamics of antimicrobials, it provides a conceptual underpinning of the additivity terms. The choice of the additivity term is essential to determine synergy or antagonism of antimicrobials. This article is part of the themed issue ‘Evolutionary ecology of arthropod antimicrobial peptides’. PMID:27160596
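
    The two ad hoc reference models can be stated compactly; these are the standard definitions the paper derives from its multi-hit mechanism.

    ```latex
    % Bliss independence for fractional effects E in [0, 1]:
    E_{AB} = E_A + E_B - E_A E_B
    % Loewe additivity (isobole equation), where d_X is the dose of drug X in
    % the mixture and D_X(E) the dose of X alone producing the same effect E:
    \frac{d_A}{D_A(E)} + \frac{d_B}{D_B(E)} = 1
    ```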

  20. Choosing a model to predict hospital admission: an observational study of new variants of predictive models for case finding

    PubMed Central

    Billings, John; Georghiou, Theo; Blunt, Ian; Bardsley, Martin

    2013-01-01

    Objectives To test the performance of new variants of models to identify people at risk of an emergency hospital admission. We compared (1) the impact of using alternative data sources (hospital inpatient, A&E, outpatient and general practitioner (GP) electronic medical records) (2) the effects of local calibration on the performance of the models and (3) the choice of population denominators. Design Multivariate logistic regressions using person-level data adding each data set sequentially to test value of additional variables and denominators. Setting 5 Primary Care Trusts within England. Participants 1 836 099 people aged 18–95 registered with GPs on 31 July 2009. Main outcome measures Models to predict hospital admission and readmission were compared in terms of the positive predictive value and sensitivity for various risk strata and with the receiver operating curve C statistic. Results The addition of each data set showed moderate improvement in the number of patients identified with little or no loss of positive predictive value. However, even with inclusion of GP electronic medical record information, the algorithms identified only a small number of patients with no emergency hospital admissions in the previous 2 years. The model pooled across all sites performed almost as well as the models calibrated to local data from just one site. Using population denominators from GP registers led to better case finding. Conclusions These models provide a basis for wider application in the National Health Service. Each of the models examined produces reasonably robust performance and offers some predictive value. The addition of more complex data adds some value, but we were unable to conclude that pooled models performed less well than those in individual sites. Choices about model should be linked to the intervention design. Characteristics of patients identified by the algorithms provide useful information in the design/costing of intervention strategies to improve care coordination/outcomes for these patients. PMID:23980068
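
    The headline metrics are simple to compute once a risk score exists. A sketch on simulated scores (the threshold and the admission process are invented):

    ```python
    import numpy as np

    def ppv_sensitivity(risk, admitted, threshold):
        """Positive predictive value and sensitivity for one risk stratum."""
        flagged = risk >= threshold
        tp = np.sum(flagged & admitted)
        return tp / flagged.sum(), tp / admitted.sum()

    rng = np.random.default_rng(0)
    risk = rng.random(10_000)
    admitted = rng.random(10_000) < risk * 0.2   # admissions more likely at high risk
    print(ppv_sensitivity(risk, admitted, threshold=0.5))
    ```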

  1. Posterior Predictive Bayesian Phylogenetic Model Selection

    PubMed Central

    Lewis, Paul O.; Xie, Wangang; Chen, Ming-Hui; Fan, Yu; Kuo, Lynn

    2014-01-01

    We present two distinctly different posterior predictive approaches to Bayesian phylogenetic model selection and illustrate these methods using examples from green algal protein-coding cpDNA sequences and flowering plant rDNA sequences. The Gelfand–Ghosh (GG) approach allows dissection of an overall measure of model fit into components due to posterior predictive variance (GGp) and goodness-of-fit (GGg), which distinguishes this method from the posterior predictive P-value approach. The conditional predictive ordinate (CPO) method provides a site-specific measure of model fit useful for exploratory analyses and can be combined over sites yielding the log pseudomarginal likelihood (LPML) which is useful as an overall measure of model fit. CPO provides a useful cross-validation approach that is computationally efficient, requiring only a sample from the posterior distribution (no additional simulation is required). Both GG and CPO add new perspectives to Bayesian phylogenetic model selection based on the predictive abilities of models and complement the perspective provided by the marginal likelihood (including Bayes Factor comparisons) based solely on the fit of competing models to observed data. [Bayesian; conditional predictive ordinate; CPO; L-measure; LPML; model selection; phylogenetics; posterior predictive.] PMID:24193892
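
    The CPO and LPML quantities mentioned above have a simple Monte Carlo form: the CPO for site i is the harmonic mean of that site's likelihood over posterior samples, and LPML is the sum of the log CPO values. A minimal, numerically stable sketch (assuming a precomputed log-likelihood matrix; not code from the paper):

    ```python
    import numpy as np
    from scipy.special import logsumexp

    def lpml(loglik):
        """loglik[s, i] = log p(y_i | theta_s) for S posterior samples.
        CPO_i = ((1/S) * sum_s 1/p(y_i | theta_s))**-1, so
        log CPO_i = log S - logsumexp_s(-loglik[s, i])."""
        S = loglik.shape[0]
        log_cpo = np.log(S) - logsumexp(-loglik, axis=0)
        return log_cpo.sum(), log_cpo  # LPML and per-site log CPO
    ```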

  2. The Department of the Navy Systems Engineering Career Competency Model

    DTIC Science & Technology

    2015-04-30

    competencies (Delgado, 2014). The SECCM has enhanced the current ENG model through the addition of extensive sets of KSAs mapped to each of the...SECCM then added KSA details from several other existing systems engineering competency models, many provided to the original NDIA SE WG, from a...to generate as complete a scope of SE KSAs as possible. The ENG (formerly SPRDE) Career Field Competency Model was used as a basis for the set of

  3. Using a logic model to evaluate the Kids Together early education inclusion program for children with disabilities and additional needs.

    PubMed

    Clapham, Kathleen; Manning, Claire; Williams, Kathryn; O'Brien, Ginger; Sutherland, Margaret

    2017-04-01

    Despite clear evidence that learning and social opportunities for children with disabilities and special needs are more effective in inclusive not segregated settings, there are few known effective inclusion programs available to children with disabilities, their families or teachers in the early years within Australia. The Kids Together program was developed to support children with disabilities/additional needs aged 0-8 years attending mainstream early learning environments. Using a key worker transdisciplinary team model, the program aligns with the individualised package approach of the National Disability Insurance Scheme (NDIS). This paper reports on the use of a logic model to underpin the process, outcomes and impact evaluation of the Kids Together program. The research team worked across 15 Early Childhood Education and Care (ECEC) centres and in home and community settings. A realist evaluation using mixed methods was undertaken to understand what works, for whom and in what contexts. The development of a logic model provided a structured way to explore how the program was implemented and achieved short, medium and long term outcomes within a complex community setting. Kids Together was shown to be a highly effective and innovative model for supporting the inclusion of children with disabilities/additional needs in a range of environments central for early childhood learning and development. The use of a logic model provided a visual representation of the Kids Together model and its component parts and enabled a theory of change to be inferred, showing how a coordinated and collaborative approached can work across multiple environments. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Using waveform information in nonlinear data assimilation

    NASA Astrophysics Data System (ADS)

    Rey, Daniel; Eldridge, Michael; Morone, Uriel; Abarbanel, Henry D. I.; Parlitz, Ulrich; Schumann-Bischoff, Jan

    2014-12-01

    Information in measurements of a nonlinear dynamical system can be transferred to a quantitative model of the observed system to establish its fixed parameters and unobserved state variables. After this learning period is complete, one may predict the model response to new forces and, when successful, these predictions will match additional observations. This adjustment process encounters problems when the model is nonlinear and chaotic because dynamical instability impedes the transfer of information from the data to the model when the number of measurements at each observation time is insufficient. We discuss the use of information in the waveform of the data, realized through a time delayed collection of measurements, to provide additional stability and accuracy to this search procedure. Several examples are explored, including a few familiar nonlinear dynamical systems and small networks of Colpitts oscillators.
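
    The time-delayed collection of measurements described here amounts to augmenting each scalar observation with delayed copies of itself, increasing the effective number of measurements presented to the assimilation at each observation time. A minimal sketch of constructing such delay vectors (illustrative only; the dimension and delay are arbitrary choices):

    ```python
    import numpy as np

    def delay_vectors(s, dim=4, tau=5):
        """Return rows [s(n), s(n - tau), ..., s(n - (dim-1)*tau)]
        for a scalar time series s sampled at integer steps."""
        n_start = (dim - 1) * tau
        return np.column_stack([s[n_start - k * tau : len(s) - k * tau]
                                for k in range(dim)])
    ```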

  5. Inclusive practices for children and youths with communication disorders. Ad Hoc Committee on Inclusion for students with Communication Disorders.

    PubMed

    1996-01-01

    An array of inclusive service delivery models is recommended for the implementation of services to children and youths with communication disorders. Inclusive practices are intervention services that are based on the unique and specific needs of the individual, and provided in a context that is least restrictive. There are a variety of models through which inclusive practices can be provided, including direct (pull-out) programs, classroom-based service delivery, community-based models, and consultative interventions. These models should be seen as flexible options that may change depending on student needs. The speech-language pathologist, in collaboration with parents, the student, teachers, support personnel, and administrators, is in the ideal position to decide the model or combination of models that best serves each individual student's communication needs. Implementation of inclusive practices requires consideration of multiple issues, including general education reform, cost effectiveness, and program efficacy. In addition, administrative and school system support, personnel qualifications, staff development, flexible scheduling, and the effects of inclusive practices on all learners need to be considered. At present, available research suggests guarded optimism for the effectiveness of inclusive practices. However, many critical questions have not yet been addressed and additional research is needed to assess the full impact of inclusive practices for students with communication disorders.

  6. Client satisfaction with reproductive health-care quality: integrating business approaches to modeling and measurement.

    PubMed

    Alden, Dana L; Do, Mai Hoa; Bhawuk, Dharm

    2004-12-01

    Health-care managers are increasingly interested in client perceptions of clinic service quality and satisfaction. While tremendous progress has occurred, additional perspectives on the conceptualization, modeling and measurement of these constructs may further assist health-care managers seeking to provide high-quality care. To that end, this study draws on theories from business and health to develop an integrated model featuring antecedents to and consequences of reproductive health-care client satisfaction. In addition to developing a new model, this study contributes by testing how well Western-based theories of client satisfaction hold in a developing, Asian country. Applied to urban, reproductive health clinic users in Hanoi, Vietnam, test results suggest that hypothesized antecedents such as pre-visit expectations, perceived clinic performance and how much performance exceeds expectations impact client satisfaction. However, the relative importance of these predictors appears to vary depending on a client's level of service-related experience. Finally, higher levels of client satisfaction are positively related to future clinic use intentions. This study demonstrates the value of: (1) incorporating theoretical perspectives from multiple disciplines to model processes underlying health-care satisfaction and (2) field testing those models before implementation. It also furthers research designed to provide health-care managers with actionable measures of the complex processes related to their clients' satisfaction.

  7. Towards Additive Manufacture of Functional, Spline-Based Morphometric Models of Healthy and Diseased Coronary Arteries: In Vitro Proof-of-Concept Using a Porcine Template.

    PubMed

    Jewkes, Rachel; Burton, Hanna E; Espino, Daniel M

    2018-02-02

    The aim of this study is to assess the additive manufacture of morphometric models of healthy and diseased coronary arteries. Using a dissected porcine coronary artery, a model was developed with the use of computer aided engineering, with splines used to design arteries in health and disease. The model was altered to demonstrate four cases of stenosis displaying varying severity, based on published morphometric data available. Both an Objet Eden 250 printer and a Solidscape 3Z Pro printer were used in this analysis. A wax printed model was set into a flexible thermoplastic and was valuable for experimental testing with helical flow patterns observed in healthy models, dominating the distal LAD (left anterior descending) and left circumflex arteries. Recirculation zones were detected in all models, but were visibly larger in the stenosed cases. Resin models provide useful analytical tools for understanding the spatial relationships of blood vessels, and could be applied to preoperative planning techniques, but were not suitable for physical testing. In conclusion, it is feasible to develop blood vessel models enabling experimental work; further, through additive manufacture of bio-compatible materials, there is the possibility of manufacturing customized replacement arteries.

  8. Towards Additive Manufacture of Functional, Spline-Based Morphometric Models of Healthy and Diseased Coronary Arteries: In Vitro Proof-of-Concept Using a Porcine Template

    PubMed Central

    Jewkes, Rachel; Burton, Hanna E.; Espino, Daniel M.

    2018-01-01

    The aim of this study is to assess the additive manufacture of morphometric models of healthy and diseased coronary arteries. Using a dissected porcine coronary artery, a model was developed with the use of computer aided engineering, with splines used to design arteries in health and disease. The model was altered to demonstrate four cases of stenosis displaying varying severity, based on published morphometric data available. Both an Objet Eden 250 printer and a Solidscape 3Z Pro printer were used in this analysis. A wax printed model was set into a flexible thermoplastic and was valuable for experimental testing with helical flow patterns observed in healthy models, dominating the distal LAD (left anterior descending) and left circumflex arteries. Recirculation zones were detected in all models, but were visibly larger in the stenosed cases. Resin models provide useful analytical tools for understanding the spatial relationships of blood vessels, and could be applied to preoperative planning techniques, but were not suitable for physical testing. In conclusion, it is feasible to develop blood vessel models enabling experimental work; further, through additive manufacture of bio-compatible materials, there is the possibility of manufacturing customized replacement arteries. PMID:29393899

  9. Small-molecule ligand docking into comparative models with Rosetta

    PubMed Central

    Combs, Steven A; DeLuca, Samuel L; DeLuca, Stephanie H; Lemmon, Gordon H; Nannemann, David P; Nguyen, Elizabeth D; Willis, Jordan R; Sheehan, Jonathan H; Meiler, Jens

    2017-01-01

    Structure-based drug design is frequently used to accelerate the development of small-molecule therapeutics. Although substantial progress has been made in X-ray crystallography and nuclear magnetic resonance (NMR) spectroscopy, the availability of high-resolution structures is limited owing to the frequent inability to crystallize or obtain sufficient NMR restraints for large or flexible proteins. Computational methods can be used to both predict unknown protein structures and model ligand interactions when experimental data are unavailable. This paper describes a comprehensive and detailed protocol using the Rosetta modeling suite to dock small-molecule ligands into comparative models. In the protocol presented here, we review the comparative modeling process, including sequence alignment, threading and loop building. Next, we cover docking a small-molecule ligand into the protein comparative model. In addition, we discuss criteria that can improve ligand docking into comparative models. Finally, and importantly, we present a strategy for assessing model quality. The entire protocol is illustrated with a single example selected solely for didactic purposes. The results are therefore not representative and do not replace benchmarks published elsewhere. We also provide an additional tutorial so that the user can gain hands-on experience in using Rosetta. The protocol should take 5–7 h, with additional time allocated for computer generation of models. PMID:23744289

  10. Turbofan forced mixer lobe flow modeling. Part 3: Application to augment engines

    NASA Technical Reports Server (NTRS)

    Barber, T.; Moore, G. C.; Blatt, J. R.

    1988-01-01

    Military engines frequently need large quantities of thrust for short periods of time. The addition of an augmentor can provide such thrust increases but with a penalty of increased duct length and engine weight. The addition of a forced mixer to the augmentor improves performance and reduces the penalty, as well as providing a method for siting the required flame holders. In this report two augmentor concepts are investigated: a swirl-mixer augmentor and a mixer-flameholder augmentor. Several designs for each concept are included and an experimental assessment of one of the swirl-mixer augmentors is presented.

  11. The Elaboration Likelihood Model and Proxemic Violations as Peripheral Cues to Information Processing.

    ERIC Educational Resources Information Center

    Eaves, Michael

    This paper provides a literature review of the elaboration likelihood model (ELM) as applied in persuasion. Specifically, the paper addresses distraction with regard to effects on persuasion. In addition, the application of proxemic violations as peripheral cues in message processing is discussed. Finally, the paper proposes to shed new light on…

  12. Models of Technology Management at the Community College: The Role of the Chief Information Officer

    ERIC Educational Resources Information Center

    Armstrong, Scott; Simer, Lauren; Spaniol, Lee

    2011-01-01

    Community colleges provide a wide range of educational services to very diverse groups of students. For that reason, the variety and flexibility of services provided can be critical. In addition, quickly changing needs result in quickly changing system requirements. In this chapter, community college CIOs speak to their roles, focusing on the…

  13. A Nurse-Led Innovation in Education: Implementing a Collaborative Multidisciplinary Grand Rounds.

    PubMed

    Matamoros, Lisa; Cook, Michelle

    2017-08-01

    Multidisciplinary grand rounds provides an opportunity to promote excellence in patient care through scholarly presentations and interdisciplinary collaboration with an innovative approach. In addition, multidisciplinary grand rounds serves to recognize expertise of staff, mentor and support professional development, and provide a collaborative environment across all clinical disciplines and support services. This article describes a process model developed by nurse educators for implementing a multidisciplinary grand rounds program. The components of the process model include topic submissions, coaching presenters, presentations, evaluations, and spreading the work. This model can be easily implemented at any organization. J Contin Educ Nurs. 2017;48(8):353-357. Copyright 2017, SLACK Incorporated.

  14. Computational social network modeling of terrorist recruitment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Nina M.; Turnley, Jessica Glicken; Smrcka, Julianne D.

    2004-10-01

    The Seldon terrorist model represents a multi-disciplinary approach to developing organization software for the study of terrorist recruitment and group formation. The need to incorporate aspects of social science added a significant contribution to the vision of the resulting Seldon toolkit. The unique addition of an abstract agent category provided a means for capturing social concepts like cliques, mosques, etc. in a manner that represents their social conceptualization and not simply as a physical or economic institution. This paper provides an overview of the Seldon terrorist model developed to study the formation of cliques, which are used as the major recruitment entity for terrorist organizations.

  15. A Note About HARP's State Trimming Method

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Hayhurst, Kelly J.; Johnson, Sally C.

    1998-01-01

    This short note provides some additional insight into how the HARP program works. In some cases, it is possible for HARP to trim away too many states and obtain an optimistic result. The HARP Version 7.0 manual warns the user that 'Unlike the ALL model, the SAME model can automatically drop failure modes for certain system models. The user is cautioned to insure that no important failure modes are dropped; otherwise, a non-conservative result can be given.' This note provides an example of where this occurs and a pointer to further documentation that gives a means of bounding the error associated with trimming these states.

  16. CRAC2 model description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritchie, L.T.; Alpert, D.J.; Burke, R.P.

    1984-03-01

    The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions.

  17. Detailed Characterization of Nearshore Processes During NCEX

    NASA Astrophysics Data System (ADS)

    Holland, K.; Kaihatu, J. M.; Plant, N.

    2004-12-01

    Recent technology advances have allowed the coupling of remote sensing methods with advanced wave and circulation models to yield detailed characterizations of nearshore processes. This methodology was demonstrated as part of the Nearshore Canyon EXperiment (NCEX) in La Jolla, CA during Fall 2003. An array of high-resolution, color digital cameras was installed to monitor an alongshore distance of nearly 2 km out to depths of 25 m. This digital imagery was analyzed over the three-month period through an automated process to produce hourly estimates of wave period, wave direction, breaker height, shoreline position, sandbar location, and bathymetry at numerous locations during daylight hours. Interesting wave propagation patterns in the vicinity of the canyons were observed. In addition, directional wave spectra and swash / surf flow velocities were estimated using more computationally intensive methods. These measurements were used to provide forcing and boundary conditions for the Delft3D wave and circulation model, giving additional estimates of nearshore processes such as dissipation and rip currents. An optimal approach for coupling these remotely sensed observations to the numerical model was selected to yield accurate, but also timely characterizations. This involved assimilation of directional spectral estimates near the offshore boundary to mimic forcing conditions achieved under traditional approaches involving nested domains. Measurements of breaker heights and flow speeds were also used to adaptively tune model parameters to provide enhanced accuracy. Comparisons of model predictions and video observations show significant correlation. As compared to nesting within larger-scale and coarser resolution models, the advantage of providing boundary condition data using remote sensing is much improved resolution and fidelity. For example, rip current development was both modeled and observed. These results indicate that this approach to data-model coupling is tenable and may be useful in near-real-time characterizations required by many applied scenarios.

  18. Constraining convective regions with asteroseismic linear structural inversions

    NASA Astrophysics Data System (ADS)

    Buldgen, G.; Reese, D. R.; Dupret, M. A.

    2018-01-01

    Context. Convective regions in stellar models are always associated with uncertainties, for example, due to extra-mixing or the possible inaccurate position of the transition from convective to radiative transport of energy. Such inaccuracies have a strong impact on stellar models and the fundamental parameters we derive from them. The most promising method to reduce these uncertainties is to use asteroseismology to derive appropriate diagnostics probing the structural characteristics of these regions. Aims: We wish to use custom-made integrated quantities to improve the capabilities of seismology to probe convective regions in stellar interiors. By doing so, we hope to increase the number of indicators obtained with structural seismic inversions to provide additional constraints on stellar models and the fundamental parameters we determine from theoretical modeling. Methods: First, we present new kernels associated with a proxy of the entropy in stellar interiors. We then show how these kernels can be used to build custom-made integrated quantities probing convective regions inside stellar models. We present two indicators suited to probe convective cores and envelopes, respectively, and test them on artificial data. Results: We show that it is possible to probe both convective cores and envelopes using appropriate indicators obtained with structural inversion techniques. These indicators provide direct constraints on a proxy of the entropy of the stellar plasma, sensitive to the characteristics of convective regions. These constraints can then be used to improve the modeling of solar-like stars by providing an additional degree of selection of models obtained from classical forward modeling approaches. We also show that in order to obtain very accurate indicators, we need ℓ = 3 modes for the envelope but that the core-conditions indicator is more flexible in terms of the seismic data required for its use.

  19. Government Contracting Under the Javits-Wagner-O’Day Act

    DTIC Science & Technology

    1991-12-01

    to manufacture its commodities, or provide its services. Likewise, a qualified work center for the severely disabled must employ personnel with... manufacture (CAD/CAM) systems. Additionally, it studies a unique and innovative business arrangement that serves as a model of contract efficiency and...Additional research areas include the types of commodities currently manufactured in the workshops, the barriers to enlarging the commodity base, the

  20. Two-Dimensional Neutronic and Fuel Cycle Analysis of the Transatomic Power Molten Salt Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Betzler, Benjamin R.; Powers, Jeffrey J.; Worrall, Andrew

    2017-01-15

    This status report presents the results from the first phase of the collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear, Nuclear Energy Voucher program. The TAP design is a molten salt reactor using movable moderator rods to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this design. Additional analyses of time step sizes, mass feed rates and enrichments, and isotopic removals provide additional information to make informed design decisions. This work further demonstrates capabilities of ORNL modeling and simulation tools for analysis of molten salt reactor designs and strongly positions this effort for the upcoming three-dimensional core analysis.

  1. Modeling process-structure-property relationships for additive manufacturing

    NASA Astrophysics Data System (ADS)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Yu, Cheng; Liu, Zeliang; Lian, Yanping; Wolff, Sarah; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-02-01

    This paper presents our latest work on comprehensive modeling of process-structure-property relationships for additive manufacturing (AM) materials, including using data-mining techniques to close the cycle of design-predict-optimize. To illustrate the process-structure relationship, the multi-scale multi-physics process modeling starts from the micro-scale to establish a mechanistic heat source model, moves to the meso-scale models of individual powder particle evolution, and finally to the macro-scale model to simulate the fabrication process of a complex product. To link structure and properties, a high-efficiency mechanistic model, self-consistent clustering analysis, is developed to capture a variety of material responses. The model incorporates factors such as voids, phase composition, inclusions, and grain structures, which are the differentiating features of AM metals. Furthermore, we propose data-mining as an effective solution for novel rapid design and optimization, which is motivated by the numerous influencing factors in the AM process. We believe this paper will provide a roadmap to advance AM fundamental understanding and guide the monitoring and advanced diagnostics of AM processing.

  2. Nursing Approach Based on Roy Adaptation Model in a Patient Undergoing Breast Conserving Surgery for Breast Cancer.

    PubMed

    Ursavaş, Figen Erol; Karayurt, Özgül; İşeri, Özge

    2014-07-01

    The use of models in nursing enables nurses to focus on the role of nursing and its applications rather than on medical practice. In addition, it helps patient care to be systematic, purposeful, controlled and effective. One of the commonly used models in nursing is the Roy Adaptation Model. According to the Roy Adaptation Model, the aim of nursing is to increase compliance and life expectancy. The Roy Adaptation Model evaluates the patient in the physiologic, self-concept, role function and interdependence modes, aiming to provide holistic care. This article describes the use of the Roy Adaptation Model in the care of a patient who was diagnosed with breast cancer and had breast-conserving surgery. Patient data were evaluated within the four modes of the Roy Adaptation Model (physiologic, self-concept, role function, and interdependence) and the nursing process was applied.

  3. libNeuroML and PyLEMS: using Python to combine procedural and declarative modeling approaches in computational neuroscience.

    PubMed

    Vella, Michael; Cannon, Robert C; Crook, Sharon; Davison, Andrew P; Ganapathy, Gautham; Robinson, Hugh P C; Silver, R Angus; Gleeson, Padraig

    2014-01-01

    NeuroML is an XML-based model description language, which provides a powerful common data format for defining and exchanging models of neurons and neuronal networks. In the latest version of NeuroML, the structure and behavior of ion channel, synapse, cell, and network model descriptions are based on underlying definitions provided in LEMS, a domain-independent language for expressing hierarchical mathematical models of physical entities. While declarative approaches for describing models have led to greater exchange of model elements among software tools in computational neuroscience, a frequent criticism of XML-based languages is that they are difficult to work with directly. Here we describe two Application Programming Interfaces (APIs) written in Python (http://www.python.org), which simplify the process of developing and modifying models expressed in NeuroML and LEMS. The libNeuroML API provides a Python object model with a direct mapping to all NeuroML concepts defined by the NeuroML Schema, which facilitates reading and writing the XML equivalents. In addition, it offers a memory-efficient, array-based internal representation, which is useful for handling large-scale connectomics data. The libNeuroML API also includes support for performing common operations that are required when working with NeuroML documents. Access to the LEMS data model is provided by the PyLEMS API, which provides a Python implementation of the LEMS language, including the ability to simulate most models expressed in LEMS. Together, libNeuroML and PyLEMS provide a comprehensive solution for interacting with NeuroML models in a Python environment.
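
    As a brief illustration of the workflow libNeuroML supports, the following sketch builds a small NeuroML document in Python and serialises it to XML. It is adapted from the published libNeuroML examples; exact class and attribute names may differ between library versions:

    ```python
    # Build a NeuroML document containing one integrate-and-fire cell and
    # write it out as XML (adapted from the published libNeuroML examples;
    # attribute names may vary across library versions).
    from neuroml import NeuroMLDocument, IafCell
    import neuroml.writers as writers

    doc = NeuroMLDocument(id="iaf_example")
    cell = IafCell(id="iaf0",
                   C="1.0 nF",
                   thresh="-50mV",
                   reset="-65mV",
                   leak_conductance="10 nS",
                   leak_reversal="-65mV")
    doc.iaf_cells.append(cell)

    writers.NeuroMLWriter.write(doc, "iaf_example.nml")  # XML serialisation
    ```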

  4. libNeuroML and PyLEMS: using Python to combine procedural and declarative modeling approaches in computational neuroscience

    PubMed Central

    Vella, Michael; Cannon, Robert C.; Crook, Sharon; Davison, Andrew P.; Ganapathy, Gautham; Robinson, Hugh P. C.; Silver, R. Angus; Gleeson, Padraig

    2014-01-01

    NeuroML is an XML-based model description language, which provides a powerful common data format for defining and exchanging models of neurons and neuronal networks. In the latest version of NeuroML, the structure and behavior of ion channel, synapse, cell, and network model descriptions are based on underlying definitions provided in LEMS, a domain-independent language for expressing hierarchical mathematical models of physical entities. While declarative approaches for describing models have led to greater exchange of model elements among software tools in computational neuroscience, a frequent criticism of XML-based languages is that they are difficult to work with directly. Here we describe two Application Programming Interfaces (APIs) written in Python (http://www.python.org), which simplify the process of developing and modifying models expressed in NeuroML and LEMS. The libNeuroML API provides a Python object model with a direct mapping to all NeuroML concepts defined by the NeuroML Schema, which facilitates reading and writing the XML equivalents. In addition, it offers a memory-efficient, array-based internal representation, which is useful for handling large-scale connectomics data. The libNeuroML API also includes support for performing common operations that are required when working with NeuroML documents. Access to the LEMS data model is provided by the PyLEMS API, which provides a Python implementation of the LEMS language, including the ability to simulate most models expressed in LEMS. Together, libNeuroML and PyLEMS provide a comprehensive solution for interacting with NeuroML models in a Python environment. PMID:24795618

  5. Embedded-explicit emergent literacy intervention I: Background and description of approach.

    PubMed

    Justice, Laura M; Kaderavek, Joan N

    2004-07-01

    This article, the first of a two-part series, provides background information and a general description of an emergent literacy intervention model for at-risk preschoolers and kindergartners. The embedded-explicit intervention model emphasizes the dual importance of providing young children with socially embedded opportunities for meaningful, naturalistic literacy experiences throughout the day, in addition to regular structured therapeutic interactions that explicitly target critical emergent literacy goals. The role of the speech-language pathologist (SLP) in the embedded-explicit model encompasses both indirect and direct service delivery: The SLP consults and collaborates with teachers and parents to ensure the highest quality and quantity of socially embedded literacy-focused experiences and serves as a direct provider of explicit interventions using structured curricula and/or lesson plans. The goal of this integrated model is to provide comprehensive emergent literacy interventions across a spectrum of early literacy skills to ensure the successful transition of at-risk children from prereaders to readers.

  6. A study of wind effects on collector performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onur, N.; Hewitt, J.C. Jr.

    1980-08-01

    Convective heat transfer experiments have been run on flat-plate collectors for tilt angles ranging from the horizontal to the vertical and for five different flow velocities. Experimental data are used to evaluate the currently used models, namely, those of Jurges (1924), Drake (1948), and Sparrow et al (1970-79), and it is shown that although none of these models provides an exact fit, they do represent bounds for the present data. It is also shown that the effect of flow from the northern quadrants provides an additional heat loss reduction of 10 to 20%.

  7. Monitoring Coastal Marshes for Persistent Saltwater Intrusion

    NASA Technical Reports Server (NTRS)

    Kalcic, Maria; Hall, Callie; Fletcher, Rose; Russell, Jeff

    2009-01-01

    Primary goal: Provide resource managers with remote sensing products that support ecosystem forecasting models requiring salinity and inundation data. Work supports the habitat-switching modules in the Coastal Louisiana Ecosystem Assessment and Restoration (CLEAR) model, which provides scientific evaluation for restoration management (Visser et al., 2008). Ongoing work to validate flooding with radar (NWRC/USGS) and enhance persistence estimates through "fusion" of MODIS and Landsat time series (ROSES A.28 Gulf of Mexico). Additional work will also investigate relationship between saltwater dielectric constant and radar returns (Radarsat) (ROSES A.28 Gulf of Mexico).

  8. A review of nuclear thermal propulsion carbide fuel corrosion and key issues

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; El-Genk, Mohamed S.

    1994-01-01

    Corrosion (mass loss) of carbide nuclear fuels due to their exposure to hot hydrogen in nuclear thermal propulsion engine systems greatly impacts the performance, thrust-to-weight and life of such systems. This report provides an overview of key issues and processes associated with the corrosion of carbide materials. Additionally, past pertinent development reactor test observations, as well as related experimental work and analysis modeling efforts are reviewed. At the conclusion, recommendations are presented, which provide the foundation for future corrosion modeling and verification efforts.

  9. SO(10) supersymmetric grand unified theories

    NASA Astrophysics Data System (ADS)

    Dermisek, Radovan

    The origin of the fermion mass hierarchy is one of the most challenging problems in elementary particle physics. In the standard model fermion masses and mixing angles are free parameters. Supersymmetric grand unified theories provide a beautiful framework for physics beyond the standard model. In addition to gauge coupling unification these theories provide relations between quark and lepton masses within families, and with additional family symmetry the hierarchy between families can be generated. We present a predictive SO(10) supersymmetric grand unified model with D3 × U(1) family symmetry. The hierarchy in fermion masses is generated by the family symmetry breaking D3 × U(1) → Z_N → nothing. This model fits the low energy data in the charged fermion sector quite well. We discuss the prediction of this model for the proton lifetime in light of recent SuperKamiokande results and present a clear picture of the allowed spectra of supersymmetric particles. Finally, the detailed discussion of the Yukawa coupling unification of the third generation particles is provided. We find a narrow region is consistent with t, b, tau Yukawa unification for mu > 0 (suggested by b → s gamma and the anomalous magnetic moment of the muon) with A0 ~ -1.9 m16, m10 ~ 1.4 m16, m16 ≳ 1200 GeV and mu, M1/2 ~ 100-500 GeV. Demanding Yukawa unification thus makes definite predictions for Higgs and sparticle masses.

  10. Virulence as a model for interplanetary and interstellar colonization - parasitism or mutualism?

    NASA Astrophysics Data System (ADS)

    Starling, Jonathan; Forgan, Duncan H.

    2014-01-01

    In the light of current scientific assessments of human-induced climate change, we investigate an experimental model to inform how resource-use strategies may influence interplanetary and interstellar colonization by intelligent civilizations. In doing so, we seek to provide an additional aspect for refining the famed Fermi Paradox. The model described is necessarily simplistic, and the intent is to simply obtain some general insights to inform and inspire additional models. We model the relationship between an intelligent civilization and its host planet as symbiotic, where the relationship between the symbiont and the host species (the civilization and the planet's ecology, respectively) determines the fitness and ultimate survival of both organisms. We perform a series of Monte Carlo Realization simulations, where civilizations pursue a variety of different relationships/strategies with their host planet, from mutualism to parasitism, and can consequently `infect' other planets/hosts. We find that parasitic civilizations are generally less effective at survival than mutualist civilizations, provided that interstellar colonization is inefficient (the maximum velocity of colonization/infection is low). However, as the colonization velocity is increased, the strategy of parasitism becomes more successful, until they dominate the `population'. This is in accordance with predictions based on island biogeography and r/K selection theory. While heavily assumption dependent, we contend that this provides a fertile approach for further application of insights from theoretical ecology for extraterrestrial colonization - while also potentially offering insights for understanding the human-Earth relationship and the potential for extraterrestrial human colonization.

  11. Automata learning algorithms and processes for providing more complete systems requirements specification by scenario generation, CSP-based syntax-oriented model construction, and R2D2C system requirements transformation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Inventor); Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Steffen, Bernard (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, automata learning algorithms and techniques are implemented to generate a more complete set of scenarios for requirements-based programming. More specifically, a CSP-based, syntax-oriented model construction, which requires the support of a theorem prover, is complemented by model extrapolation via automata learning. Because requirements are by nature partial, this may support their systematic completion while keeping the focus on the most prominent scenarios. It may also generalize requirement skeletons by extrapolation and may indicate, by way of automatically generated traces, where the requirement specification is too loose and additional information is required.

  12. CEOS SEO and GISS Meeting

    NASA Technical Reports Server (NTRS)

    Killough, Brian; Stover, Shelley

    2008-01-01

    The Committee on Earth Observation Satellites (CEOS) provides a brief to the Goddard Institute for Space Studies (GISS) regarding the CEOS Systems Engineering Office (SEO) and current work on climate requirements and analysis. A "system framework" is provided for the Global Earth Observation System of Systems (GEOSS). SEO climate-related tasks are outlined including the assessment of essential climate variable (ECV) parameters, use of the "systems framework" to determine relevant informational products and science models and the performance of assessments and gap analyses of measurements and missions for each ECV. Climate requirements, including instruments and missions, measurements, knowledge and models, and decision makers, are also outlined. These requirements would establish traceability from instruments to products and services allowing for benefit evaluation of instruments and measurements. Additionally, traceable climate requirements would provide a better understanding of global climate models.

  13. Telecom Modeling with ChatterBell.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jrad, Ahmad M.; Kelic, Andjelka

    This document provides a description and user manual for the ChatterBell voice telecom modeling and simulation capability. The intended audience consists of network planners and practitioners who wish to use the tool to model a particular voice network and analyze its behavior under varying assumptions and possible failure conditions. ChatterBell is built on top of the N-SMART voice simulation and visualization suite that was developed through collaboration between Sandia National Laboratories and Bell Laboratories of Lucent Technologies. The new and improved modeling and simulation tool has been modified and modernized to incorporate the latest developments in the telecom world, including the widespread use of VoIP technology. In addition, ChatterBell provides new commands and modeling capabilities that were not available in the N-SMART application.

  14. Modelling the physics in iterative reconstruction for transmission computed tomography

    PubMed Central

    Nuyts, Johan; De Man, Bruno; Fessler, Jeffrey A.; Zbijewski, Wojciech; Beekman, Freek J.

    2013-01-01

    There is an increasing interest in iterative reconstruction (IR) as a key tool to improve quality and increase applicability of X-ray CT imaging. IR has the ability to significantly reduce patient dose; it provides the flexibility to reconstruct images from arbitrary X-ray system geometries; and it allows the inclusion of detailed models of photon transport and detection physics to accurately correct for a wide variety of image degrading effects. This paper reviews discretisation issues and modelling of finite spatial resolution, Compton scatter in the scanned object, data noise and the energy spectrum. Widespread implementation of IR with highly accurate model-based correction, however, still requires significant effort. In addition, new hardware will provide new opportunities and challenges to improve CT with new modelling. PMID:23739261

  15. Mars Global Reference Atmospheric Model (Mars-GRAM 3.34): Programmer's Guide

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; James, Bonnie F.; Johnson, Dale L.

    1996-01-01

    This is a programmer's guide for the Mars Global Reference Atmospheric Model (Mars-GRAM 3.34). Included are a brief history and review of the model since its origin in 1988 and a technical discussion of recent additions and modifications. Examples of how to run both the interactive and batch (subroutine) forms are presented. Instructions are provided on how to customize output of the model for various parameters of the Mars atmosphere. Detailed descriptions are given of the main driver programs, subroutines, and associated computational methods. Lists and descriptions include input, output, and local variables in the programs. These descriptions give a summary of program steps and 'map' of calling relationships among the subroutines. Definitions are provided for the variables passed between subroutines through common lists. Explanations are provided for all diagnostic and progress messages generated during execution of the program. A brief outline of future plans for Mars-GRAM is also presented.

  16. A New Formulation for Hybrid LES-RANS Computations

    NASA Technical Reports Server (NTRS)

    Woodruff, Stephen L.

    2013-01-01

    Ideally, a hybrid LES-RANS computation would employ LES only where necessary to make up for the failure of the RANS model to provide sufficient accuracy or to provide time-dependent information. Current approaches are fairly restrictive in the placement of LES and RANS regions; an LES-RANS transition in a boundary layer, for example, yields an unphysical log-layer shift. A hybrid computation is formulated here to allow greater control over the placement of LES and RANS regions and the transitions between them. The concept of model invariance is introduced, which provides a basis for interpreting hybrid results within an LES-RANS transition zone. Consequences of imposing model invariance include the addition of terms to the governing equations that compensate for unphysical gradients created as the model changes between RANS and LES. Computational results illustrate the increased accuracy of the approach and its insensitivity to the location of the transition and to the blending function employed.

  17. The Swarm Initial Field Model for the 2014 Geomagnetic Field

    NASA Technical Reports Server (NTRS)

    Olsen, Nils; Hulot, Gauthier; Lesur, Vincent; Finlay, Christopher C.; Beggan, Ciaran; Chulliat, Arnaud; Sabaka, Terence J.; Floberghagen, Rune; Friis-Christensen, Eigil; Haagmans, Roger

    2015-01-01

    Data from the first year of ESA's Swarm constellation mission are used to derive the Swarm Initial Field Model (SIFM), a new model of the Earth's magnetic field and its time variation. In addition to the conventional magnetic field observations provided by each of the three Swarm satellites, explicit advantage is taken of the constellation aspect by including east-west magnetic intensity gradient information from the lower satellite pair. Along-track differences in magnetic intensity provide further information concerning the north-south gradient. The SIFM static field shows excellent agreement (up to at least degree 60) with recent field models derived from CHAMP data, providing an initial validation of the quality of the Swarm magnetic measurements. Use of gradient data improves the determination of both the static field and its secular variation, with the mean misfit for east-west intensity differences between the lower satellite pair being only 0.12 nT.

  18. Sediment sorting along tidal sand waves: A comparison between field observations and theoretical predictions

    NASA Astrophysics Data System (ADS)

    Van Oyen, Tomas; Blondeaux, Paolo; Van den Eynde, Dries

    2013-07-01

    A site-by-site comparison between field observations and theoretical predictions of sediment sorting patterns along tidal sand waves is performed for ten locations in the North Sea. At each site, the observed grain size distribution along the bottom topography and the geometry of the bed forms is described in detail and the procedure used to obtain the model parameters is summarized. The model appears to accurately describe the wavelength of the observed sand waves for the majority of the locations; still providing a reliable estimate for the other sites. In addition, it is found that for seven out of the ten locations, the qualitative sorting process provided by the model agrees with the observed grain size distribution. A discussion of the site-by-site comparison is provided which, taking into account uncertainties in the field data, indicates that the model grasps the major part of the key processes controlling the phenomenon.

  19. Modeling of Non-Isothermal Cryogenic Fluid Sloshing

    NASA Technical Reports Server (NTRS)

    Agui, Juan H.; Moder, Jeffrey P.

    2015-01-01

    A computational fluid dynamic model was used to simulate the thermal destratification in an upright self-pressurized cryostat approximately half-filled with liquid nitrogen and subjected to forced sinusoidal lateral shaking. A full three-dimensional computational grid was used to model the tank dynamics, fluid flow and thermodynamics using the ANSYS Fluent code. A non-inertial grid was used which required the addition of momentum and energy source terms to account for the inertial forces, energy transfer and wall reaction forces produced by the shaken tank. The kinetics-based Schrage mass transfer model provided the interfacial mass transfer due to evaporation and condensation at the sloshing interface. The dynamic behavior of the sloshing interface, its amplitude and transition to different wave modes, provided insight into the fluid process at the interface. The tank pressure evolution and temperature profiles compared relatively well with the shaken cryostat experimental test data provided by the Centre National D'Etudes Spatiales.
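
    For context, the kinetics-based Schrage relation referred to above is commonly written in the following form (a standard formulation with accommodation coefficient σ̂, not quoted from the paper):

    ```latex
    % Interfacial mass flux (evaporation positive), with molar mass M,
    % universal gas constant R, liquid/vapour temperatures T_l, T_v,
    % vapour pressure P_v and saturation pressure P_sat:
    \dot{m}'' = \frac{2\hat{\sigma}}{2-\hat{\sigma}}
                \sqrt{\frac{M}{2\pi R}}
                \left( \frac{P_{\mathrm{sat}}(T_l)}{\sqrt{T_l}}
                     - \frac{P_v}{\sqrt{T_v}} \right)
    ```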

  20. Variational learning and bits-back coding: an information-theoretic view to Bayesian learning.

    PubMed

    Honkela, Antti; Valpola, Harri

    2004-07-01

    The bits-back coding first introduced by Wallace in 1990 and later by Hinton and van Camp in 1993 provides an interesting link between Bayesian learning and information-theoretic minimum-description-length (MDL) learning approaches. The bits-back coding allows interpreting the cost function used in the variational Bayesian method called ensemble learning as a code length in addition to the Bayesian view of misfit of the posterior approximation and a lower bound of model evidence. Combining these two viewpoints provides interesting insights to the learning process and the functions of different parts of the model. In this paper, the problem of variational Bayesian learning of hierarchical latent variable models is used to demonstrate the benefits of the two views. The code-length interpretation provides new views to many parts of the problem such as model comparison and pruning and helps explain many phenomena occurring in learning.
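
    The link between the two views can be stated compactly. The identity below is standard variational Bayes algebra for the ensemble-learning cost of an approximation q(θ) to the posterior (a sketch, not quoted from the paper):

    ```latex
    % Ensemble-learning (variational) cost:
    \mathcal{C}(q) = \int q(\theta)\,\ln\frac{q(\theta)}{p(D,\theta)}\,d\theta
                   = D_{\mathrm{KL}}\big(q(\theta)\,\|\,p(\theta \mid D)\big)
                     - \ln p(D)
    % Bayesian view: C >= -ln p(D), so -C lower-bounds the log evidence,
    % with the KL term measuring the misfit of the posterior approximation.
    % Bits-back view: C is the expected message length for encoding the
    % data, after the bits used to pick theta ~ q(theta) are refunded.
    ```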

  1. The Lindsay Leg Club: supporting the NHS to provide leg ulcer care.

    PubMed

    McKenzie, Morag

    2013-06-01

    Public health services will need to cope with additional demands due to an ageing society and the increasing prevalence of chronic conditions. Lower-limb ulceration is a long-term, life-changing condition and leg ulcer management can be challenging for nursing staff. The Lindsay Leg Club model is a unique partnership between community nurses, members and the local community, which provides quality of care and empowerment for patients with leg ulcers, while also supporting and educating nursing staff. The Leg Club model works in accord with core themes of Government and NHS policy. Patient feedback on the Leg Club model is positive and the Leg Clubs provide a service to members which is well accepted by patients, yet is more economically efficient than the traditional district nursing practice of home visits. Lindsay Leg Clubs provide a valuable support service to the NHS in delivering improved quality of care while improving efficiency.

  2. Digital Hydrologic Networks Supporting Applications Related to Spatially Referenced Regression Modeling

    USGS Publications Warehouse

    Brakebill, J.W.; Wolock, D.M.; Terziotti, S.E.

    2011-01-01

    Digital hydrologic networks depicting surface-water pathways and their associated drainage catchments provide a key component to hydrologic analysis and modeling. Collectively, they form common spatial units that can be used to frame the descriptions of aquatic and watershed processes. In addition, they provide the ability to simulate and route the movement of water and associated constituents throughout the landscape. Digital hydrologic networks have evolved from derivatives of mapping products to detailed, interconnected, spatially referenced networks of water pathways, drainage areas, and stream and watershed characteristics. These properties are important because they enhance the ability to spatially evaluate factors that affect the sources and transport of water-quality constituents at various scales. SPAtially Referenced Regressions On Watershed attributes (SPARROW), a process-based/statistical model, relies on a digital hydrologic network in order to establish relations between quantities of monitored contaminant flux, contaminant sources, and the associated physical characteristics affecting contaminant transport. Digital hydrologic networks modified from the River Reach File (RF1) and National Hydrography Dataset (NHD) geospatial datasets provided frameworks for SPARROW in six regions of the conterminous United States. In addition, characteristics of the modified RF1 were used to update estimates of mean-annual streamflow. This produced more current flow estimates for use in SPARROW modeling. © 2011 American Water Resources Association. This article is a U.S. Government work and is in the public domain in the USA.

  3. Curcumin inhibits cancer stem cell phenotypes in ex vivo models of colorectal liver metastases, and is clinically safe and tolerable in combination with FOLFOX chemotherapy

    PubMed Central

    James, Mark I.; Iwuji, Chinenye; Irving, Glen; Karmokar, Ankur; Higgins, Jennifer A.; Griffin-Teal, Nicola; Thomas, Anne; Greaves, Peter; Cai, Hong; Patel, Samita R.; Morgan, Bruno; Dennison, Ashley; Metcalfe, Matthew; Garcea, Giuseppe; Lloyd, David M.; Berry, David P.; Steward, William P.; Howells, Lynne M.; Brown, Karen

    2015-01-01

    In vitro and pre-clinical studies have suggested that addition of the diet-derived agent curcumin may provide a suitable adjunct to enhance efficacy of chemotherapy in models of colorectal cancer. However, the majority of evidence for this currently derives from established cell lines. Here, we utilised patient-derived colorectal liver metastases (CRLM) to assess whether curcumin may provide added benefit over 5-fluorouracil (5-FU) and oxaliplatin (FOLFOX) in cancer stem cell (CSC) models. Combination of curcumin with FOLFOX chemotherapy was then assessed clinically in a phase I dose escalation study. Curcumin alone and in combination significantly reduced spheroid number in CRLM CSC models, and decreased the number of cells with high aldehyde dehydrogenase activity (ALDH^high/CD133^−). Addition of curcumin to oxaliplatin/5-FU enhanced anti-proliferative and pro-apoptotic effects in a proportion of patient-derived explants, whilst reducing expression of stem cell-associated markers ALDH and CD133. The phase I dose escalation study revealed curcumin to be a safe and tolerable adjunct to FOLFOX chemotherapy in patients with CRLM (n = 12) at doses up to 2 grams daily. Curcumin may provide added benefit in subsets of patients when administered with FOLFOX, and is a well-tolerated chemotherapy adjunct. PMID:25979230

  4. Theoretical foundation for measuring the groundwater age distribution.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, William Payton; Arnold, Bill Walter

    2014-01-01

    In this study, we use PFLOTRAN, a highly scalable, parallel, flow and reactive transport code, to simulate the concentrations of 3H, 3He, CFC-11, CFC-12, CFC-113, SF6, 39Ar, 81Kr, 4He and the mean groundwater age in heterogeneous fields on grids with an excess of 10 million nodes. We utilize this computational platform to simulate the concentration of multiple tracers in high-resolution, heterogeneous 2-D and 3-D domains, and calculate tracer-derived ages. Tracer-derived ages show systematic biases toward younger ages when the groundwater age distribution contains water older than the maximum tracer age. The deviation of the tracer-derived age distribution from the true groundwater age distribution increases with increasing heterogeneity of the system. However, the effect of heterogeneity is diminished as the mean travel time gets closer to the tracer age limit. Age distributions in 3-D domains differ significantly from those in 2-D domains. 3-D simulations show decreased mean age, and less variance in age distribution for identical heterogeneity statistics. High-performance computing allows for investigation of tracer and groundwater age systematics in high-resolution domains, providing a platform for understanding and utilizing environmental tracer and groundwater age information in heterogeneous 3-D systems. Groundwater environmental tracers can provide important constraints for the calibration of groundwater flow models. Direct simulation of environmental tracer concentrations in models has the additional advantage of avoiding assumptions associated with using calculated groundwater age values. This study quantifies model uncertainty reduction resulting from the addition of environmental tracer concentration data. The analysis uses a synthetic heterogeneous aquifer and the calibration of a flow and transport model using the pilot point method. Results indicate a significant reduction in the uncertainty in permeability with the addition of environmental tracer data, relative to the use of hydraulic measurements alone. Anthropogenic tracers and their decay products, such as CFC11, 3H, and 3He, provide significant constraint on input permeability values in the model. Tracer data for 39Ar provide even more complete information on the heterogeneity of permeability and variability in the flow system than the anthropogenic tracers, leading to greater parameter uncertainty reduction.
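
    As one concrete example of the tracer-derived ages compared above, the standard tritium/helium-3 apparent age follows directly from radioactive decay. A sketch with illustrative constants (not code from the study):

    ```python
    import numpy as np

    T_HALF_3H = 12.32                    # tritium half-life, years
    LAM = np.log(2.0) / T_HALF_3H        # decay constant, 1/years

    def tritium_helium_age(tritium, tritiogenic_he3):
        """Apparent age t = (1/lambda) * ln(1 + [3He_trit]/[3H]),
        with both concentrations in the same units (e.g. TU)."""
        return np.log(1.0 + tritiogenic_he3 / tritium) / LAM
    ```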

  5. Generation of High Frequency Response in a Dynamically Loaded, Nonlinear Soil Column

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spears, Robert Edward; Coleman, Justin Leigh

    2015-08-01

    Detailed guidance on linear seismic analysis of soil columns is provided in “Seismic Analysis of Safety-Related Nuclear Structures and Commentary (ASCE 4, 1998),” which is currently under revision. A new Appendix in ASCE 4-2014 (draft) is being added to provide guidance for nonlinear time-domain analysis, which includes evaluation of soil columns. When performing linear analysis, a given soil column is typically evaluated with a linear, viscous damped constitutive model. When submitted to a sine wave motion, this constitutive model produces a smooth hysteresis loop. For nonlinear analysis, the soil column can be modelled with an appropriate nonlinear hysteretic soil model. For the model in this paper, the stiffness and energy absorption result from a defined post-yielding shear stress versus shear strain curve. This curve is input with tabular data points. When submitted to a sine wave motion, this constitutive model produces a hysteresis loop whose sides follow the shape of the input tabular data points and whose ends are discontinuous and pointed. This paper compares linear and nonlinear soil column results. The results show that the nonlinear analysis produces additional high frequency response. The paper provides additional study to establish what portion of the high frequency response is due to numerical noise associated with the tabular input curve and what portion is accurately caused by the pointed ends of the hysteresis loop. Finally, the paper shows how the results are changed when a significant structural mass is added to the top of the soil column.
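
    For illustration, a minimal sketch of such a tabular constitutive model (assuming Masing-style unload/reload rules, common in hysteretic soil models; the paper's exact rules are not specified here). Piecewise-linear interpolation of the backbone points is what produces the pointed loop ends discussed above.

    import numpy as np

    gam_tab = np.array([0.0, 1e-4, 5e-4, 2e-3, 1e-2])   # shear strain points
    tau_tab = np.array([0.0, 8.0, 25.0, 45.0, 60.0])    # shear stress (kPa)

    def backbone(gamma):
        """Piecewise-linear backbone interpolated from the tabular points."""
        return np.sign(gamma) * np.interp(np.abs(gamma), gam_tab, tau_tab)

    def masing_response(strain_history):
        """Virgin loading follows the backbone; after each reversal the branch
        is the backbone scaled by 2 from the reversal point (Masing rule)."""
        out, tau, g_prev = [], 0.0, 0.0
        g_rev, t_rev, direction, virgin = 0.0, 0.0, 0.0, True
        for g in strain_history:
            d = np.sign(g - g_prev)
            if d != 0 and direction != 0 and d != direction:   # load reversal
                g_rev, t_rev, virgin = g_prev, tau, False
            if d != 0:
                direction = d
            tau = backbone(g) if virgin else t_rev + 2.0 * backbone((g - g_rev) / 2.0)
            out.append(tau)
            g_prev = g
        return np.array(out)

    t = np.linspace(0.0, 2.0, 400)
    loop = masing_response(0.005 * np.sin(2 * np.pi * t))   # two strain cycles
    print(f"peak stress ~{loop.max():.1f} kPa")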

  6. Modeling Gas Exchange in a Closed Plant Growth Chamber

    NASA Technical Reports Server (NTRS)

    Cornett, J. D.; Hendrix, J. E.; Wheeler, R. M.; Ross, C. W.; Sadeh, W. Z.

    1994-01-01

    Fluid transport models for fluxes of water vapor and CO2 have been developed for one crop of wheat and three crops of soybean grown in a closed plant growth chamber. Correspondence among these fluxes is discussed. Maximum fluxes of gases are provided for engineering design requirements of fluid recycling equipment in growth chambers. Furthermore, to investigate the feasibility of generalized crop models, dimensionless representations of water vapor fluxes are presented. The feasibility of such generalized models and the need for additional data are discussed.

  7. Modeling gas exchange in a closed plant growth chamber

    NASA Technical Reports Server (NTRS)

    Cornett, J. D.; Hendrix, J. E.; Wheeler, R. M.; Ross, C. W.; Sadeh, W. Z.

    1994-01-01

    Fluid transport models for fluxes of water vapor and CO2 have been developed for one crop of wheat and three crops of soybean grown in a closed plant growth chamber. Correspondence among these fluxes is discussed. Maximum fluxes of gases are provided for engineering design requirements of fluid recycling equipment in growth chambers. Furthermore, to investigate the feasibility of generalized crop models, dimensionless representations of water vapor fluxes are presented. The feasibility of such generalized models and the need for additional data are discussed.

  8. Preclinical QSP Modeling in the Pharmaceutical Industry: An IQ Consortium Survey Examining the Current Landscape

    PubMed Central

    Wu, Fan; Bansal, Loveleena; Bradshaw‐Pierce, Erica; Chan, Jason R.; Liederer, Bianca M.; Mettetal, Jerome T.; Schroeder, Patricia; Schuck, Edgar; Tsai, Alice; Xu, Christine; Chimalakonda, Anjaneya; Le, Kha; Penney, Mark; Topp, Brian; Yamada, Akihiro

    2018-01-01

    A cross‐industry survey was conducted to assess the landscape of preclinical quantitative systems pharmacology (QSP) modeling within pharmaceutical companies. This article presents the survey results, which provide insights on the current state of preclinical QSP modeling in addition to future opportunities. Our results call attention to the need for an aligned definition and consistent terminology around QSP, yet highlight the broad applicability and benefits preclinical QSP modeling is currently delivering. PMID:29349875

  9. The NASA/MSFC global reference atmospheric model: 1990 version (GRAM-90). Part 1: Technical/users manual

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Alyea, F. N.; Cunnold, D. M.; Jeffries, W. R., III; Johnson, D. L.

    1991-01-01

    A technical description of the NASA/MSFC Global Reference Atmospheric Model 1990 version (GRAM-90) is presented with emphasis on the additions and new user's manual descriptions of the program operation aspects of the revised model. Some sample results for the new middle atmosphere section and comparisons with results from a three dimensional circulation model are provided. A programmer's manual with more details for those wishing to make their own GRAM program adaptations is also presented.

  10. Percolation Theory and Modern Hydraulic Fracturing

    NASA Astrophysics Data System (ADS)

    Norris, J. Q.; Turcotte, D. L.; Rundle, J. B.

    2015-12-01

    During the past few years, we have been developing a percolation model for fracking. This model provides a powerful tool for understanding the growth and properties of the complex fracture networks generated during modern high-volume hydraulic fracture stimulations of tight shale reservoirs. The model can also be used to understand the interaction between the growing fracture network and natural reservoir features such as joint sets and faults. Additionally, the model produces a power-law distribution of bursts which can easily be compared to observed microseismicity.

  11. Modeling Magnetic Flux-Ropes Structures

    NASA Astrophysics Data System (ADS)

    Nieves-Chinchilla, T.; Linton, M.; Hidalgo, M. A. U.; Vourlidas, A.; Savani, N.; Szabo, A.; Farrugia, C. J.; Yu, W.

    2015-12-01

    Flux-ropes are usually associated with magnetic structures embedded in interplanetary Coronal Mass Ejections (ICMEs) with a depressed proton temperature (called Magnetic Clouds, MCs). However, small-scale flux-ropes are also identified in the solar wind, with different formation, evolution, and dynamics involved. We present an analytical model to describe magnetic flux-rope topologies. The model is generalized to different degrees of complexity. It extends the circular-cylindrical concept of Hidalgo et al. (2002) by introducing a general form for the radial dependence of the current density. This generalization provides information on the force distribution inside the flux rope in addition to the usual parameters of flux-rope geometrical information and orientation. The generalized model provides flexibility for implementation in 3-D MHD simulations.

  12. Modeling and Bayesian parameter estimation for shape memory alloy bending actuators

    NASA Astrophysics Data System (ADS)

    Crews, John H.; Smith, Ralph C.

    2012-04-01

    In this paper, we employ a homogenized energy model (HEM) for shape memory alloy (SMA) bending actuators. Additionally, we utilize a Bayesian method for quantifying parameter uncertainty. The system consists of a SMA wire attached to a flexible beam. As the actuator is heated, the beam bends, providing endoscopic motion. The model parameters are fit to experimental data using an ordinary least-squares approach. The uncertainty in the fit model parameters is then quantified using Markov Chain Monte Carlo (MCMC) methods. The MCMC algorithm provides bounds on the parameters, which will ultimately be used in robust control algorithms. One purpose of the paper is to test the feasibility of the Random Walk Metropolis algorithm, the MCMC method used here.
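
    A minimal sketch of the Random Walk Metropolis step (with a stand-in exponential forward model; the paper's forward model is the homogenized energy model, and all names and values here are illustrative):

    import numpy as np

    rng = np.random.default_rng(1)

    def forward(theta, t):
        # Stand-in forward model; the paper's is the homogenized energy model.
        a, b = theta
        return a * (1.0 - np.exp(-b * t))

    t_obs = np.linspace(0.0, 5.0, 40)
    y_obs = forward((2.0, 0.8), t_obs) + rng.normal(0.0, 0.05, t_obs.size)

    def log_post(theta, sigma=0.05):
        if np.any(np.asarray(theta) <= 0):         # flat prior on positive values
            return -np.inf
        r = y_obs - forward(theta, t_obs)
        return -0.5 * np.sum((r / sigma) ** 2)

    theta = np.array([1.0, 1.0])
    chain, lp = [], log_post(theta)
    for _ in range(20_000):
        prop = theta + rng.normal(0.0, 0.02, 2)    # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta.copy())

    chain = np.array(chain[5_000:])                # discard burn-in
    print("posterior means:", chain.mean(axis=0))
    print("95% bounds:", np.percentile(chain, [2.5, 97.5], axis=0))

    The percentile bounds on the retained samples are the kind of parameter bounds the paper feeds into robust control.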

  13. Plasticity models of material variability based on uncertainty quantification techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Reese E.; Rizzi, Francesco; Boyce, Brad

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments, using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.

  14. Couple resilience to economic pressure.

    PubMed

    Conger, R D; Rueter, M A; Elder, G H

    1999-01-01

    Over 400 married couples participated in a 3-year prospective study of economic pressure and marital relations. The research (a) empirically evaluated the family stress model of economic stress influences on marital distress and (b) extended the model to include specific interactional characteristics of spouses hypothesized to protect against economic pressure. Findings provided support for the basic mediational model, which proposes that economic pressure increases risk for emotional distress, which, in turn, increases risk for marital conflict and subsequent marital distress. Regarding resilience to economic stress, high marital support reduced the association between economic pressure and emotional distress. In addition, effective couple problem solving reduced the adverse influence of marital conflict on marital distress. Overall, the findings provided substantial support for the extended family stress model.

  15. Emerging In Vitro Liver Technologies for Drug Metabolism and Inter-Organ Interactions

    PubMed Central

    Bale, Shyam Sundhar; Moore, Laura

    2016-01-01

    In vitro liver models provide essential information for evaluating drug metabolism, metabolite formation, and hepatotoxicity. Interfacing liver models with other organ models could provide insights into the desirable as well as unintended systemic side effects of therapeutic agents and their metabolites. Such information is invaluable for drug screening processes particularly in the context of secondary organ toxicity. While interfacing of liver models with other organ models has been achieved, platforms that effectively provide human-relevant precise information are needed. In this concise review, we discuss the current state-of-the-art of liver-based multiorgan cell culture platforms primarily from a drug and metabolite perspective, and highlight the importance of media-to-cell ratio in interfacing liver models with other organ models. In addition, we briefly discuss issues related to development of optimal liver models that include recent advances in hepatic cell lines, stem cells, and challenges associated with primary hepatocyte-based liver models. Liver-based multiorgan models that achieve physiologically relevant coupling of different organ models can have a broad impact in evaluating drug efficacy and toxicity, as well as mechanistic investigation of human-relevant disease conditions. PMID:27049038

  16. Silkworm: A Promising Model Organism in Life Science.

    PubMed

    Meng, Xu; Zhu, Feifei; Chen, Keping

    2017-09-01

    As an important economic insect, the silkworm Bombyx mori (L.) (Lepidoptera: Bombycidae) has numerous advantages in life science, such as low breeding cost, large progeny size, short generation time, and clear genetic background. Additionally, there are rich genetic resources associated with silkworms. The completion of the silkworm genome has further accelerated its adoption as a modern model organism in life science. Genomic studies showed that some silkworm genes are highly homologous to certain genes related to human hereditary diseases, making the silkworm a candidate model for studying human disease. In this article, we provide a review of the silkworm as an important model in various research areas, including human disease, screening of antimicrobial agents, environmental safety monitoring, and antitumor studies. In addition, the potential applications of the silkworm model in the life sciences are discussed. © The Author 2017. Published by Oxford University Press on behalf of Entomological Society of America.

  17. Cannabinoids inhibit neurodegeneration in models of multiple sclerosis.

    PubMed

    Pryce, Gareth; Ahmed, Zubair; Hankey, Deborah J R; Jackson, Samuel J; Croxford, J Ludovic; Pocock, Jennifer M; Ledent, Catherine; Petzold, Axel; Thompson, Alan J; Giovannoni, Gavin; Cuzner, M Louise; Baker, David

    2003-10-01

    Multiple sclerosis is increasingly being recognized as a neurodegenerative disease that is triggered by inflammatory attack of the CNS. As yet there is no satisfactory treatment. Using experimental allergic encephalomyelitis (EAE), an animal model of multiple sclerosis, we demonstrate that the cannabinoid system is neuroprotective during EAE. Mice deficient in the cannabinoid receptor CB1 tolerate inflammatory and excitotoxic insults poorly and develop substantial neurodegeneration following immune attack in EAE. In addition, exogenous CB1 agonists can provide significant neuroprotection from the consequences of inflammatory CNS disease in an experimental allergic uveitis model. Therefore, in addition to symptom management, cannabis may also slow the neurodegenerative processes that ultimately lead to chronic disability in multiple sclerosis and probably other diseases.

  18. Scheduling Algorithm for Mission Planning and Logistics Evaluation (SAMPLE). Volume 1: User's guide

    NASA Technical Reports Server (NTRS)

    Dupnick, E.; Wiggins, D.

    1980-01-01

    An interactive computer program for automatically generating traffic models for the Space Transportation System (STS) is presented. Information concerning run stream construction, input data, and output data is provided. The flow of the interactive data stream is described. Error messages are specified, along with suggestions for remedial action. In addition, formats and parameter definitions for the payload data set (payload model), feasible combination file, and traffic model are documented.

  19. The disconnected values (intervention) model for promoting healthy habits in religious institutions.

    PubMed

    Anshel, Mark H

    2010-03-01

    The purpose of this article is to provide an intervention model that can be used by religious leaders for changing health behavior among practicing members of religious communities. The intervention does not require extensive training or licensure in counseling psychology. At the heart of this model is the acknowledgement that a person's negative habits (e.g., lack of exercise, poor nutrition) and his or her deepest values and beliefs (e.g., faith, health, family) are often misaligned, or disconnected. In addition, the unhealthy outcomes from these habits are contrary to the scriptural traditions of the world religions and thus are especially relevant to individuals who practice their religious beliefs. The Sacred Scriptures of Judaism and Christianity, for example, are replete with teachings that extol the virtues of practicing habits that promote good health and energy. In addition, evidence is mounting in the health intervention literature that attempts to make permanent, desirable changes in health behavior have not been successful, and that adherence to desirable habits such as exercise and proper nutrition is short-lived. The Disconnected Values Model (DVM) provides a novel approach for enhancing health behavior change within the context of the mission of most religious institutions. The model is compatible with skills presented by religious leaders, who possess more credibility and influence in changing the behavior of members and service attendees of their respective religious institutions. The religious leader's role is to provide the client with faith-based incentives to initiate and maintain changes in their health behaviors, and perhaps to provide resources for the individual to pursue an action plan. A case study is described in which the DVM intervention was used successfully with an individual of strong faith.

  20. C-5M Super Galaxy Utilization with Joint Precision Airdrop System

    DTIC Science & Technology

    2012-03-22

    System notes: FireFly, 900-2,200, steerable parafoil; Screamer, 500-2,200, steerable parafoil with additional chutes to slow touchdown; Dragonfly... setting. This initial feasible solution provides the Nonlinear Program (NLP) algorithm a starting point to continue its calculations. The model continues... provides the NLP with a starting point of 1. This provides the NLP algorithm a point within the feasible region to begin its calculations in an attempt

  1. 49 CFR 580.14 - Power of attorney to review title documents and acknowledge disclosure.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... transferee's name and current address; and (5) The identity of the vehicle, including its make, model year, body type and vehicle identification number. (c) In addition to the information provided under...

  2. 49 CFR 580.14 - Power of attorney to review title documents and acknowledge disclosure.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... transferee's name and current address; and (5) The identity of the vehicle, including its make, model year, body type and vehicle identification number. (c) In addition to the information provided under...

  3. 49 CFR 580.14 - Power of attorney to review title documents and acknowledge disclosure.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... transferee's name and current address; and (5) The identity of the vehicle, including its make, model year, body type and vehicle identification number. (c) In addition to the information provided under...

  4. 49 CFR 580.14 - Power of attorney to review title documents and acknowledge disclosure.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... transferee's name and current address; and (5) The identity of the vehicle, including its make, model year, body type and vehicle identification number. (c) In addition to the information provided under...

  5. 77 FR 29645 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-18

    ... determine incentive payment levels to participating physician group practices participating in the PGP-TD. In addition, this data will be used to evaluate the effectiveness of these payment models and provide...

  6. Angular Distribution of the X-ray Reflection in Accretion Disks

    NASA Astrophysics Data System (ADS)

    Garcia, Javier; Dauser, T.; Lohfink, A. M.; Kallman, T. R.; McClintock, J. E.; Steiner, J. F.; Brenneman, L.; Wilms, J.; Reynolds, C. S.; Tombesi, F.

    2014-01-01

    For the study of black holes, it is essential to have an accurate disk-reflection model with a proper treatment of the relativistic effects that occur near strong gravitational fields. These models are used to constrain the properties of the disk, including its inner radius, the degree of ionization of the gas, and the elemental abundances. Importantly, reflection models are the key to measuring black hole spin via the Fe-line method. However, most current reflection models only provide an angle-averaged solution for the flux reflected at the surface of the disk, which can systematically affect the inferred disk emission. We overcome this limitation by exploiting the full capabilities of our reflection code XILLVER. The solution of the reflected intensity of the radiation field is calculated for each photon energy, position in the slab, and viewing angle. We use this information to construct a grid of reflection models in which the inclination of the system is included as a free fitting parameter. Additionally, we directly connect the angle-resolved XILLVER model with the relativistic blurring code RELLINE to produce a self-consistent numerical model for the angular distribution of the reflected X-ray spectra from ionized accretion disks around black holes. The new model, RELCONV_XILL, is provided in the appropriate format to be used in combination with the commonly used fitting packages. An additional version of the new model, RELCONV_LP_XILL, which simulates the reflected spectra in a lamppost scenario, is also supplied.
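
    As a rough illustration of how inclination becomes a free fitting parameter (hypothetical grid axes and stand-in spectra; this is not XILLVER's actual table format), a precomputed angle-resolved grid can be interpolated at arbitrary parameter values:

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    log_xi = np.linspace(0, 4, 9)            # ionization parameter axis
    incl = np.linspace(5, 85, 17)            # inclination axis (degrees)
    energy = np.logspace(-1, 2, 300)         # energy axis (keV)

    # Stand-in tabulated spectra, shape (n_xi, n_incl, n_energy).
    spectra = (np.exp(-energy / 30.0)[None, None, :]
               * (1 + 0.1 * np.cos(np.radians(incl)))[None, :, None]
               * (1 + 0.05 * log_xi)[:, None, None])

    interp = RegularGridInterpolator((log_xi, incl, energy), spectra)

    def model_spectrum(xi, inclination):
        """Reflected spectrum at an arbitrary (log xi, inclination) point."""
        pts = np.column_stack([np.full_like(energy, xi),
                               np.full_like(energy, inclination), energy])
        return interp(pts)

    print(model_spectrum(2.3, 41.0)[:5])     # spectrum a fit engine would evaluate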

  7. An automated data management/analysis system for space shuttle orbiter tiles. [stress analysis

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Ballas, M.

    1982-01-01

    An engineering data management system was combined with a nonlinear stress analysis program to provide a capability for analyzing a large number of tiles on the space shuttle orbiter. Tile geometry data and all data necessary to define the tile loads environment are accessed automatically as needed for the analysis of a particular tile or a set of tiles. User documentation provided includes: (1) description of computer programs and data files contained in the system; (2) definitions of all engineering data stored in the data base; (3) characteristics of the tile analytical model; (4) instructions for preparation of user input; and (5) a sample problem to illustrate use of the system. Descriptions of the data, computer programs, and analytical models of the tile are sufficiently detailed to guide extension of the system to include additional zones of tiles and/or additional types of analyses.

  8. Broadening perspectives on pediatric oral health care provision: social determinants of health and behavioral management.

    PubMed

    Fisher-Owens, Susan

    2014-01-01

    Dental caries is not just the most common chronic childhood disease, carrying a substantial burden of disease during childhood, but one with lifelong impact. Traditional models that focus on the "mouth in the chair" have been helpful but insufficient to identify structural root causes for its high incidence, thus having a limited ability to prevent the disease. The addition of social and behavioral determinants to strictly biologic models provides the full context of care, enabling providers to better tailor their guidance and improve health outcomes. In-office behavioral management involves understanding these determinants and applying appropriate techniques; these not only can help reset family and patient expectations but can actually increase compliance. Lastly, children with multiple medical issues require additional focus, as they can carry a greater burden of disease, making it even more critical during office visits to offer multifactorial compliance strategies for these patients and their parents.

  9. Imperfection and radiation damage in protein crystals studied with coherent radiation

    PubMed Central

    Nave, Colin; Sutton, Geoff; Evans, Gwyndaf; Owen, Robin; Rau, Christoph; Robinson, Ian; Stuart, David Ian

    2016-01-01

    Fringes and speckles occur within diffraction spots when a crystal is illuminated with coherent radiation during X-ray diffraction. The additional information in these features provides insight into the imperfections in the crystal at the sub-micrometre scale. In addition, these features can provide more accurate intensity measurements (e.g. by model-based profile fitting), detwinning (by distinguishing the various components), phasing (by exploiting sampling of the molecular transform) and refinement (by distinguishing regions with different unit-cell parameters). In order to exploit these potential benefits, the features due to coherent diffraction have to be recorded and any change due to radiation damage properly modelled. Initial results from recording coherent diffraction at cryotemperatures from polyhedrin crystals of approximately 2 µm in size are described. These measurements allowed information about the type of crystal imperfections to be obtained at the sub-micrometre level, together with the changes due to radiation damage. PMID:26698068

  10. Simulation of Attacks for Security in Wireless Sensor Network

    PubMed Central

    Diaz, Alvaro; Sanchez, Pablo

    2016-01-01

    The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node’s software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work. PMID:27869710

  11. Secluded and putative flipped dark matter and Stueckelberg extensions of the standard model

    NASA Astrophysics Data System (ADS)

    Fortes, E. C. F. S.; Pleitez, V.; Stecker, F. W.

    2018-02-01

    We consider here three dark matter models with the gauge symmetry of the standard model plus an additional local U(1)D factor. One model is truly secluded and the other two models begin flipped, but end up secluded. All of these models include one dark fermion and one vector boson that gains mass via the Stueckelberg mechanism. We show that the would-be flipped models provide an example of dark matter composed of "almost least interacting particles" (ALIPs). Such particles are therefore compatible with the constraints obtained from both laboratory measurements and astrophysical observations.

  12. Secluded and Putative Flipped Dark Matter and Stueckelberg Extensions of the Standard Model

    NASA Technical Reports Server (NTRS)

    Fortes, E. C. F. S.; Pleitez, V.; Stecker, F. W.

    2018-01-01

    We consider here three dark matter models with the gauge symmetry of the standard model plus an additional local U(1)D factor. One model is truly secluded and the other two models begin flipped, but end up secluded. All of these models include one dark fermion and one vector boson that gains mass via the Stueckelberg mechanism. We show that the would-be flipped models provide an example of dark matter composed of "almost least interacting particles" (ALIPs). Such particles are therefore compatible with the constraints obtained from both laboratory measurements and astrophysical observations.

  13. Formal Analysis of Self-Efficacy in Job Interviewee’s Mental State Model

    NASA Astrophysics Data System (ADS)

    Ajoge, N. S.; Aziz, A. A.; Yusof, S. A. Mohd

    2017-08-01

    This paper presents a formal analysis approach for a self-efficacy model of an interviewee's mental state during a job interview session. Self-efficacy is a construct that has been hypothesised to combine with motivation and interviewee anxiety to define the state influences on interviewees. The conceptual model was built based on psychological theories and models related to self-efficacy. A number of well-known relations between events and the course of self-efficacy are summarized from the literature, and it is shown that the proposed model exhibits those patterns. In addition, this formal model has been mathematically analysed to find out which stable situations exist. Finally, it is pointed out how this model can be used in a software agent or robot-based platform. Such a platform can provide an interview coaching approach in which support is provided to users based on their individual mental state during interview sessions.

  14. Generalised additive modelling approach to the fermentation process of glutamate.

    PubMed

    Liu, Chun-Bo; Li, Yun; Pan, Feng; Shi, Zhong-Ping

    2011-03-01

    In this work, generalised additive models (GAMs) were used for the first time to model the fermentation of glutamate (Glu). It was found that three fermentation parameters, fermentation time (T), dissolved oxygen (DO) and oxygen uptake rate (OUR), could capture 97% of the variance in the production of Glu during the fermentation process through a GAM model calibrated using online data from 15 fermentation experiments. This model was applied to investigate the individual and combined effects of T, DO and OUR on the production of Glu. The conditions to optimize the fermentation process were proposed based on the simulation study from this model. Results suggested that the production of Glu can reach a high level by controlling concentration levels of DO and OUR to the proposed optimization conditions during the fermentation process. The GAM approach therefore provides an alternative way to model and optimize the fermentation process of Glu. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.
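
    A sketch of the approach using the Python pygam package on synthetic data (the paper's GAM was calibrated on online fermentation records; all variable ranges and the response function here are invented for illustration):

    import numpy as np
    from pygam import LinearGAM, s   # pip install pygam

    rng = np.random.default_rng(0)
    n = 500
    T = rng.uniform(0, 40, n)            # fermentation time, h
    DO = rng.uniform(5, 50, n)           # dissolved oxygen, %
    OUR = rng.uniform(10, 80, n)         # oxygen uptake rate, mmol/(L*h)
    glu = (0.8 * T + 10 * np.sin(DO / 8.0)
           + 0.02 * (OUR - 45) ** 2 + rng.normal(0, 2, n))   # stand-in response

    X = np.column_stack([T, DO, OUR])
    gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, glu)   # one smooth term per input
    print(gam.statistics_['pseudo_r2']['explained_deviance'])

    The additive structure (one smooth per predictor) is what lets each parameter's individual effect on Glu production be inspected and optimized separately.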

  15. Initial Verification of GEOS-4 Aerosols Using CALIPSO and MODIS: Scene Classification

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; Colarco, Peter R.; Hlavka, Dennis; Levy, Robert C.; Vaughan, Mark A.; daSilva, Arlindo

    2007-01-01

    A-train sensors such as MODIS and MISR provide column aerosol properties, and in the process a means of estimating aerosol type (e.g. smoke vs. dust). Correct classification of aerosol type is important because retrievals are often dependent upon selection of the right aerosol model. In addition, aerosol scene classification helps place the retrieved products in context for comparisons and analysis with aerosol transport models. The recent addition of CALIPSO to the A-train now provides a means of classifying aerosol distribution with altitude. CALIPSO level 1 products include profiles of attenuated backscatter at 532 and 1064 nm, and depolarization at 532 nm. Backscatter intensity, wavelength ratio, and depolarization provide information on the vertical profile of aerosol concentration, size, and shape. Thus similar estimates of aerosol type using MODIS or MISR are possible with CALIPSO, and the combination of data from all sensors provides a means of 3D aerosol scene classification. The NASA Goddard Earth Observing System general circulation model and data assimilation system (GEOS-4) provides global 3D aerosol mass for sulfate, sea salt, dust, and black and organic carbon. A GEOS-4 aerosol scene classification algorithm has been developed to provide estimates of aerosol mixtures along the flight track for NASA's Geoscience Laser Altimeter System (GLAS) satellite lidar. GLAS launched in 2003 and did not have the benefit of depolarization measurements or other sensors from the A-train. Aerosol typing from GLAS data alone was not possible, and the GEOS-4 aerosol classifier has been used to identify aerosol type and improve the retrieval of GLAS products. Here we compare 3D aerosol scene classification using CALIPSO and MODIS with the GEOS-4 aerosol classifier. Dust, smoke, and pollution examples will be discussed in the context of providing an initial verification of the 3D GEOS-4 aerosol products. Prior model verification has only been attempted with surface mass comparisons and column optical depth from AERONET and MODIS.
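
    A toy sketch of the typing logic such classifiers implement (all thresholds are invented for illustration, not the operational GEOS-4 or CALIPSO values):

    def classify(backscatter, color_ratio, depol):
        """Toy per-sample aerosol typing from attenuated backscatter (1/(m sr)),
        1064/532 color ratio, and 532 nm particulate depolarization."""
        if backscatter < 1e-4:
            return "clear"
        if depol > 0.20:                # strongly nonspherical particles
            return "dust"
        if color_ratio > 0.8:           # large particles
            return "sea salt / polluted dust"
        if depol > 0.07:
            return "polluted dust"
        return "smoke / pollution"

    samples = [(5e-4, 0.4, 0.30), (8e-4, 0.9, 0.05), (3e-4, 0.3, 0.03)]
    for b, cr, d in samples:
        print((b, cr, d), "->", classify(b, cr, d))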

  16. Documentation of the Goddard Laboratory for atmospheres fourth-order two-layer shallow water model

    NASA Technical Reports Server (NTRS)

    Takacs, L. L. (Compiler)

    1986-01-01

    The theory and numerical treatment used in the 2-level GLA fourth-order shallow water model are described. This model was designed to emulate the horizontal finite differences used by the GLA Fourth-Order General Circulation Model (Kalnay et al., 1983) in addition to its grid structure, form of high-latitude and global filtering, and time-integration schemes. A user's guide is also provided instructing the user on how to create initial conditions, execute the model, and post-process the data history.

  17. A Generalized Model of E-trading for GSR Fair Exchange Protocol

    NASA Astrophysics Data System (ADS)

    Konar, Debajyoti; Mazumdar, Chandan

    In this paper we propose a generalized model of E-trading for the development of GSR Fair Exchange Protocols. Based on the model, a method is described for implementing E-trading protocols that ensure fairness in the true sense without using an additional trusted third party for which either party has to pay. The model provides the scope to include the correctness of the product, money atomicity, and customer's anonymity properties within the E-trading protocol. We conclude this paper by indicating the area of applicability for our model.

  18. Using a cloud to replenish parched groundwater modeling efforts.

    PubMed

    Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.
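
    The create/launch/terminate pattern described here maps directly onto cloud provider APIs; a minimal sketch with the AWS boto3 client (region, AMI ID, and instance type are placeholders):

    import boto3  # pip install boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")          # placeholder region

    def launch_workers(n, image_id="ami-0123456789abcdef0"):    # placeholder AMI
        resp = ec2.run_instances(ImageId=image_id, InstanceType="c5.xlarge",
                                 MinCount=n, MaxCount=n)
        return [inst["InstanceId"] for inst in resp["Instances"]]

    def terminate_workers(instance_ids):
        ec2.terminate_instances(InstanceIds=instance_ids)

    ids = launch_workers(8)        # e.g., one worker per calibration batch
    try:
        pass                       # distribute model runs to the workers here
    finally:
        terminate_workers(ids)     # stop paying the moment the job finishes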

  19. More-Realistic Digital Modeling of a Human Body

    NASA Technical Reports Server (NTRS)

    Rogge, Renee

    2010-01-01

    A MATLAB computer program has been written to enable improved (relative to an older program) modeling of a human body for purposes of designing space suits and other hardware with which an astronaut must interact. The older program implements a kinematic model based on traditional anthropometric measurements that do provide important volume and surface information. The present program generates a three-dimensional (3D) whole-body model from 3D body-scan data. The program utilizes thin-plate spline theory to reposition the model without need for additional scans.
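
    A minimal sketch of the thin-plate-spline repositioning idea (in Python rather than the program's MATLAB, with synthetic points standing in for scan vertices and landmarks):

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(0)
    vertices = rng.uniform(-1.0, 1.0, (10_000, 3))    # stand-in scan vertices

    landmarks_src = rng.uniform(-1.0, 1.0, (12, 3))   # e.g., joint centers
    landmarks_dst = landmarks_src + rng.normal(0.0, 0.1, (12, 3))  # posed joints

    # One thin-plate spline per output coordinate, fit on landmark motion only.
    warp = RBFInterpolator(landmarks_src, landmarks_dst,
                           kernel="thin_plate_spline")
    posed = warp(vertices)                            # repositioned body model
    print(posed.shape)

    Because the warp is defined by a handful of moved landmarks, the whole scanned surface can be reposed without collecting an additional scan.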

  20. Using a cloud to replenish parched groundwater modeling efforts

    USGS Publications Warehouse

    Hunt, Randall J.; Luchette, Joseph; Schreuder, Willem A.; Rumbaugh, James O.; Doherty, John; Tonkin, Matthew J.; Rumbaugh, Douglas B.

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate “virtual” computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  1. Stationarity is undead: Uncertainty dominates the distribution of extremes

    NASA Astrophysics Data System (ADS)

    Serinaldi, Francesco; Kilsby, Chris G.

    2015-03-01

    The increasing effort to develop and apply nonstationary models in hydrologic frequency analyses under changing environmental conditions can be frustrated when the additional uncertainty related to the model complexity is accounted for along with the sampling uncertainty. In order to show the practical implications and possible problems of using nonstationary models and provide critical guidelines, in this study we review the main tools developed in this field (such as nonstationary distribution functions, return periods, and risk of failure) highlighting advantages and disadvantages. The discussion is supported by three case studies that revise three illustrative examples reported in the scientific and technical literature referring to the Little Sugar Creek (at Charlotte, North Carolina), Red River of the North (North Dakota/Minnesota), and the Assunpink Creek (at Trenton, New Jersey). The uncertainty of the results is assessed by complementing point estimates with confidence intervals (CIs) and emphasizing critical aspects such as the subjectivity affecting the choice of the models' structure. Our results show that (1) nonstationary frequency analyses should not only be based on at-site time series but require additional information and detailed exploratory data analyses (EDA); (2) as nonstationary models imply that the time-varying model structure holds true for the entire future design life period, an appropriate modeling strategy requires that EDA identifies a well-defined deterministic mechanism leading the examined process; (3) when the model structure cannot be inferred in a deductive manner and nonstationary models are fitted by inductive inference, model structure introduces an additional source of uncertainty so that the resulting nonstationary models can provide no practical enhancement of the credibility and accuracy of the predicted extreme quantiles, whereas possible model misspecification can easily lead to physically inconsistent results; (4) when the model structure is uncertain, stationary models and a suitable assessment of the uncertainty accounting for possible temporal persistence should be retained as more theoretically coherent and reliable options for practical applications in real-world design and management problems; (5) a clear understanding of the actual probabilistic meaning of stationary and nonstationary return periods and risk of failure is required for a correct risk assessment and communication.
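
    As a concrete illustration of point (3), a stationary GEV can be compared with a linear-trend-in-location GEV on synthetic annual maxima (a minimal sketch; the extra trend parameter must earn back the uncertainty it introduces):

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import genextreme

    rng = np.random.default_rng(0)
    years = np.arange(60)
    x = genextreme.rvs(c=-0.1, loc=100 + 0.2 * years, scale=15,
                       size=years.size, random_state=rng)

    def nll(params):
        """Negative log-likelihood; 4 params adds a linear location trend."""
        if len(params) == 4:
            c, loc0, b, scale = params
            loc = loc0 + b * years
        else:
            c, loc0, scale = params
            loc = loc0
        if scale <= 0:
            return np.inf
        return -genextreme.logpdf(x, c=c, loc=loc, scale=scale).sum()

    fit_s = minimize(nll, x0=[-0.1, x.mean(), x.std()], method="Nelder-Mead")
    fit_ns = minimize(nll, x0=[-0.1, x.mean(), 0.0, x.std()], method="Nelder-Mead")

    aic_s = 2 * 3 + 2 * fit_s.fun     # stationary: shape, location, scale
    aic_ns = 2 * 4 + 2 * fit_ns.fun   # nonstationary adds the trend parameter
    print(f"AIC stationary {aic_s:.1f} vs nonstationary {aic_ns:.1f}")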

  2. Bayesian Hierarchical Air-Sea Interaction Modeling: Application to the Labrador Sea

    NASA Technical Reports Server (NTRS)

    Niiler, Pearn P.

    2002-01-01

    The objectives are to: 1) Organize data from 26 MINIMET drifters in the Labrador Sea, including sensor calibration and error checking of ARGOS transmissions. 2) Produce wind direction, barometric pressure, and sea surface temperature time series. In addition, provide data from a historical file of 150 SHARP drifters in the Labrador Sea. 3) Work with data interpretation and data-modeling assimilation issues.

  3. Flight dynamics analysis and simulation of heavy lift airships. Volume 2: Technical manual

    NASA Technical Reports Server (NTRS)

    Ringland, R. F.; Tischler, M. B.; Jex, H. R.; Emmen, R. D.; Ashkenas, I. L.

    1982-01-01

    The mathematical models embodied in the simulation are described in considerable detail and with supporting evidence for the model forms chosen. In addition the trimming and linearization algorithms used in the simulation are described. Appendices to the manual identify reference material for estimating the needed coefficients for the input data and provide example simulation results.

  4. Characterization of structural connections for multicomponent systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Huckelbridge, Arthur A.

    1988-01-01

    This study explores combining Component Mode Synthesis methods for coupling structural components with Parameter Identification procedures for improving the analytical modeling of the connections. Improvements in the connection stiffness and damping properties are computed in terms of physical parameters so that the physical characteristics of the connections can be better understood, in addition to providing improved input for the system model.

  5. Thermal Vegetation Canopy Model Studies.

    DTIC Science & Technology

    1981-08-01

    ...optical and thermal canopy radiation models, and the interpretation of these measurements. Previous technical reports in this series have described... The initial guess is taken to be air temperature; thus, the solution approach may be interpreted as determining the modification to the air... provided assistance for interpreting the micrometeorological data. In addition, Dr. L. W. Gay of the School of Renewable Natural Resources, Arizona

  6. Predictive Models of Acute Mountain Sickness after Rapid Ascent to Various Altitudes

    DTIC Science & Technology

    2013-01-01

    ...unclassified relational mountain medicine database containing individual ascent profiles, demographic and physiologic subject descriptors, and... course of AMS, and define the baseline demographics and physiologic descriptors that increase the risk of AMS. In addition, these models provide... substantiated this finding in unacclimatized women (24). Other physiologic differences between men and women (i.e., differences in endothelial

  7. NREL Leads Wind Farm Modeling Research - Continuum Magazine | NREL

    Science.gov Websites

    ...ten 2-MW Bonus wind turbines (photo provided by HC Sorensen, Middelgrunden Wind Turbine Cooperative)... has created complex computer modeling tools to improve wind turbine design and overall wind farm... activity surrounding a multi-megawatt wind turbine. In addition to its work with Doppler LIDAR, the

  8. FASCODE - Fast Atmospheric Signature Code (Spectral Transmittance and Radiance)

    DTIC Science & Technology

    1978-01-16

    ...the Voigt line shape profile model discussed above. In addition, the Voigt model provides a proper treatment of the transition region at those

  9. From the Model Minority to the Invisible Minority: Asian & Pacific American Students in Higher Education Research. AIR 1998 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Rohrlick, Jeffrey; Alvarado, Diana; Zaruba, Karen; Kallio, Ruth

    The paper provides an overview of research on Asian and Pacific American (APA) undergraduates at U.S. institutions, focusing on the origins, assumptions, and fallacies of the "model minority" image. In addition, it offers highlights from a recent campus survey that suggests that APA students perceive their university experience…

  10. Fuzzy Logic as a Tool for Assessing Students' Knowledge and Skills

    ERIC Educational Resources Information Center

    Voskoglou, Michael Gr.

    2013-01-01

    Fuzzy logic, which is based on fuzzy sets theory introduced by Zadeh in 1965, provides a rich and meaningful addition to standard logic. The applications which may be generated from or adapted to fuzzy logic are wide-ranging and provide the opportunity for modeling under conditions which are imprecisely defined. In this article we develop a fuzzy…

  11. Collaboration using roles. [in computer network security

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1990-01-01

    Segregation of roles into alternative accounts is a model which provides not only the ability to collaborate but also enables accurate accounting of resources consumed by collaborative projects, protects the resources and objects of such a project, and does not introduce new security vulnerabilities. The implementation presented here does not require users to remember additional passwords and provides a very simple consistent interface.

  12. Basin-fill Aquifer Modeling with Terrestrial Gravity: Assessing Static Offsets in Bulk Datasets using MATLAB; Case Study of Bridgeport, CA

    NASA Astrophysics Data System (ADS)

    Mlawsky, E. T.; Louie, J. N.; Pohll, G.; Carlson, C. W.; Blakely, R. J.

    2015-12-01

    Understanding the potential availability of water resources in Eastern California aquifers is of critical importance to making water management policy decisions and determining best-use practices for California, as well as for downstream use in Nevada. Hydrologic well log data can provide valuable information on aquifer capacity, but are often proprietary or economically unfeasible to obtain in sufficient quantity. In the case of basin-fill aquifers, it is possible to make estimates of aquifer geometry and volume using geophysical surveys of gravity, constrained by additional geophysical and geological observations. We use terrestrial gravity data to model depth-to-basement about the Bridgeport, CA basin for application in preserving the Walker Lake biome. In constructing the model, we assess several hundred gravity observations, existing and newly collected. We regard these datasets as "bulk," as the data are compiled from multiple sources. Inconsistencies among datasets can result in "static offsets," or artificial bull's-eye contours, within the gradient. Amending suspect offsets requires the attention of the modeler; picking these offsets by hand can be a time-consuming process when modeling large-scale basin features. We develop a MATLAB script for interpolating the residual Bouguer anomaly about the basin using sparse observation points, and leveling offset points with a user-defined sensitivity. The script is also capable of plotting gravity profiles between any two endpoints within the map extent. The resulting anomaly map provides an efficient means of locating and removing static offsets in the data, while also providing a fast visual representation of a bulk dataset. Additionally, we obtain gridded basin gravity models with an open-source alternative to proprietary modeling tools.
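
    A minimal sketch of the offset-leveling idea (in Python rather than the authors' MATLAB, and with an assumed median-shift rule rather than their exact script):

    import numpy as np
    from scipy.interpolate import griddata

    def level_offset(ref_xy, ref_g, new_xy, new_g, sensitivity=0.5):
        """Shift new_g onto the reference datum (values in mGal) when the
        median misfit against interpolated reference values exceeds the
        user-defined sensitivity."""
        g_interp = griddata(ref_xy, ref_g, new_xy, method="linear")
        offset = np.nanmedian(new_g - g_interp)
        if abs(offset) > sensitivity:    # treat as a static offset, not signal
            return new_g - offset, offset
        return new_g, 0.0

    rng = np.random.default_rng(0)
    ref_xy = rng.uniform(0.0, 10.0, (200, 2))
    ref_g = 0.5 * ref_xy[:, 0] + rng.normal(0.0, 0.05, 200)   # regional gradient
    new_xy = rng.uniform(0.0, 10.0, (50, 2))
    new_g = 0.5 * new_xy[:, 0] + 2.0 + rng.normal(0.0, 0.05, 50)  # 2 mGal shift

    leveled, off = level_offset(ref_xy, ref_g, new_xy, new_g)
    print(f"estimated static offset: {off:.2f} mGal")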

  13. Exploratory structural equation modeling of personality data.

    PubMed

    Booth, Tom; Hughes, David J

    2014-06-01

    The current article compares the use of exploratory structural equation modeling (ESEM) as an alternative to confirmatory factor analytic (CFA) models in personality research. We compare model fit, factor distinctiveness, and criterion associations of factors derived from ESEM and CFA models. In Sample 1 (n = 336) participants completed the NEO-FFI, the Trait Emotional Intelligence Questionnaire-Short Form, and the Creative Domains Questionnaire. In Sample 2 (n = 425) participants completed the Big Five Inventory and the depression and anxiety scales of the General Health Questionnaire. ESEM models provided better fit than CFA models, but ESEM solutions did not uniformly meet cutoff criteria for model fit. Factor scores derived from ESEM and CFA models correlated highly (.91 to .99), suggesting the additional factor loadings within the ESEM model add little in defining latent factor content. Lastly, criterion associations of each personality factor in CFA and ESEM models were near identical in both inventories. We provide an example of how ESEM and CFA might be used together in improving personality assessment. © The Author(s) 2014.
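
    For illustration, an exploratory factor model with rotation on synthetic items (ESEM proper is usually fit in dedicated SEM software; this sketch only shows the cross-loadings that exploratory solutions estimate and CFA fixes to zero):

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    n = 1000
    f = rng.normal(size=(n, 2))                             # two latent traits
    load = np.array([[0.8, 0.1], [0.7, 0.2], [0.75, 0.0],   # factor 1 items
                     [0.1, 0.8], [0.0, 0.7], [0.2, 0.75]])  # factor 2 items
    items = f @ load.T + rng.normal(0.0, 0.5, (n, 6))

    fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
    print(np.round(fa.components_.T, 2))   # loadings, cross-loadings included

    The small off-factor loadings recovered here are exactly the extra parameters that, per the article, improve fit yet add little to the definition of latent factor content.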

  14. Evolution of solidification texture during additive manufacturing.

    PubMed

    Wei, H L; Mazumder, J; DebRoy, T

    2015-11-10

    Striking differences in the solidification textures of a nickel based alloy owing to changes in laser scanning pattern during additive manufacturing are examined based on theory and experimental data. Understanding and controlling texture are important because texture affects mechanical and chemical properties. Solidification texture depends on the local heat flow directions and competitive grain growth in one of the six <100> preferred growth directions in face centered cubic alloys. Therefore, the heat flow directions are examined for various laser beam scanning patterns based on numerical modeling of heat transfer and fluid flow in three dimensions. Here we show that numerical modeling not only provides a deeper understanding of the solidification growth patterns during additive manufacturing, but also serves as a basis for customizing solidification textures, which are important for the properties and performance of components.

  15. Multiple imaging mode X-ray computed tomography for distinguishing active and inactive phases in lithium-ion battery cathodes

    NASA Astrophysics Data System (ADS)

    Komini Babu, Siddharth; Mohamed, Alexander I.; Whitacre, Jay F.; Litster, Shawn

    2015-06-01

    This paper presents the use of nanometer scale resolution X-ray computed tomography (nano-CT) in the three-dimensional (3D) imaging of a Li-ion battery cathode, including the separate volumes of active material, binder plus conductive additive, and pore. The different high and low atomic number (Z) materials are distinguished by sequentially imaging the lithium cobalt oxide electrode in absorption and then Zernike phase contrast modes. Morphological parameters of the active material and the additives are extracted from the 3D reconstructions, including the distribution of contact areas between the additives and the active material. This method could provide a better understanding of the electric current distribution and structural integrity of battery electrodes, as well as provide detailed geometries for computational models.

  16. Animal Models of Lymphangioleiomyomatosis (LAM) and Tuberous Sclerosis Complex (TSC)

    PubMed Central

    2010-01-01

    Abstract Animal models of lymphangioleiomyomatosis (LAM) and tuberous sclerosis complex (TSC) are highly desired to enable detailed investigation of the pathogenesis of these diseases. Multiple rats and mice have been generated in which a mutation similar to that occurring in TSC patients is present in an allele of Tsc1 or Tsc2. Unfortunately, these mice do not develop pathologic lesions that match those seen in LAM or TSC. However, these Tsc rodent models have been useful in confirming the two-hit model of tumor development in TSC, and in providing systems in which therapeutic trials (e.g., rapamycin) can be performed. In addition, conditional alleles of both Tsc1 and Tsc2 have provided the opportunity to target loss of these genes to specific tissues and organs, to probe the in vivo function of these genes, and attempt to generate better models. Efforts to generate an authentic LAM model are impeded by a lack of understanding of the cell of origin of this process. However, ongoing studies provide hope that such a model will be generated in the coming years. PMID:20235887

  17. A Model-Driven Approach to Customize the Vocabulary of Communication Boards: Towards More Humanization of Health Care.

    PubMed

    Franco, Natália M; Medeiros, Gabriel F; Silva, Edson A; Murta, Angela S; Machado, Aydano P; Fidalgo, Robson N

    2015-01-01

    This work presents a Modeling Language and its technological infrastructure to customize the vocabulary of Communication Boards (CB), which are important tools to provide more humanization of health care. Using a technological infrastructure based on the Model-Driven Development (MDD) approach, our Modeling Language (ML) creates an abstraction layer between users (e.g., health professionals such as an audiologist or speech therapist) and application code. Moreover, the use of a metamodel enables a syntactic corrector that prevents the creation of invalid models. Our ML and metamodel enable more autonomy for health professionals in creating customized CBs because they abstract complexities and permit the professionals to deal only with the domain concepts (e.g., vocabulary and patient needs). Additionally, our infrastructure provides a configuration file that can be used to share and reuse models. This way, the vocabulary modeling effort will decrease over time as people share vocabulary models. Our study provides an infrastructure that aims to abstract the complexity of CB vocabulary customization, giving more autonomy to health professionals when they need to customize, share, and reuse vocabularies for CBs.

  18. What are the Starting Points? Evaluating Base-Year Assumptions in the Asian Modeling Exercise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chaturvedi, Vaibhav; Waldhoff, Stephanie; Clarke, Leon E.

    2012-12-01

    A common feature of model inter-comparison efforts is that the base year numbers for important parameters such as population and GDP can differ substantially across models. This paper explores the sources and implications of this variation in Asian countries across the models participating in the Asian Modeling Exercise (AME). Because the models do not all have a common base year, each team was required to provide data for 2005 for comparison purposes. This paper compares the year 2005 information for different models, noting the degree of variation in important parameters, including population, GDP, primary energy, electricity, and CO2 emissions. It then explores the difference in these key parameters across different sources of base-year information. The analysis confirms that the sources provide different values for many key parameters. This variation across data sources and additional reasons why models might provide different base-year numbers, including differences in regional definitions, differences in model base year, and differences in GDP transformation methodologies, are then discussed in the context of the AME scenarios. Finally, the paper explores the implications of base-year variation on long-term model results.

  19. Global Coupled Carbon and Nitrogen Models: Successes, Failures and What next?

    NASA Astrophysics Data System (ADS)

    Holland, E. A.

    2011-12-01

    Over the last few years, there has been a great deal of progress in modeling coupled terrestrial global carbon and nitrogen cycles and their roles in Earth System models. The collection of recent models provides some surprising results and insights. A critical question for Earth system models is: How do the coupled C/N model results impact atmospheric carbon dioxide concentrations compared to carbon-only models? Some coupled models predict increased atmospheric carbon dioxide concentrations, the result expected from nitrogen-limited photosynthetic uptake of carbon dioxide, while others predict little change or decreased carbon dioxide uptake with a coupled carbon and nitrogen cycle. With this range of impacts for climate-critical atmospheric carbon dioxide concentrations, there is clearly a need for additional comparison of measurements and models. Randerson et al.'s CLAMP study provided important constraints and comparisons, primarily for aboveground carbon uptake. However, nitrogen supply is largely determined by decomposition and soil processes. I will present comparisons of NCAR's CESM results with soil and litter carbon and nitrogen fluxes and standing stocks. These belowground data sets of both carbon and nitrogen provide important benchmarks for coupled C/N models.

  20. Heliophysics Data and Modeling Research Using VSPO

    NASA Technical Reports Server (NTRS)

    Roberts, D. Aaron; Hesse, Michael; Cornwell, Carl

    2007-01-01

    The primary advantage of Virtual Observatories in scientific research is efficiency: rapid searches for and access to data in convenient forms make it possible to explore scientific questions without spending days or weeks on ancillary tasks. The Virtual Space Physics Observatory provides a general portal to Heliophysics data for this task. Here we will illustrate the advantages of the VO approach by examining specific geomagnetically active times and tracing the activity through the Sun-Earth system. In addition to the existing data sources, we will demonstrate an extension of the capabilities to allow searching for model run results from the range of CCMC models. This approach allows the user to quickly compare models and observations at a qualitative level; considerably more work will be needed to develop more seamless connections to data streams and the equivalent numerical output from simulations.

  1. Application of the two-stage clonal expansion model in characterizing the joint effect of exposure to two carcinogens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zielinski, J.M.; Krewski, D.

    1992-12-31

    In this paper, we describe application of the two-stage clonal expansion model to characterize the joint effect of exposure to two carcinogens. This biologically based model of carcinogenesis provides a useful framework for the quantitative description of carcinogenic risks and for defining agents that act as initiators, promoters, and completers. Depending on the mechanism of action, the agent-specific relative risk following exposure to two carcinogens can be additive, multiplicative, or supramultiplicative, with supra-additive relative risk indicating a synergistic effect between the two agents. Maximum-likelihood methods for fitting the two-stage clonal expansion model with intermittent exposure to two carcinogens are described and illustrated, using data on lung-cancer mortality among Colorado uranium miners exposed to both radon and tobacco smoke.
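
    For reference, the additive and multiplicative scales mentioned above can be written as follows (notation ours, not quoted from the paper); a joint relative risk above the additive value indicates synergy between the agents, and one above the multiplicative value is supramultiplicative:

        RR_add(x, y)  = RR(x) + RR(y) - 1
        RR_mult(x, y) = RR(x) × RR(y)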

  2. Sport commitment among competitive female athletes: test of an expanded model.

    PubMed

    Weiss, Windee M; Weiss, Maureen R; Amorose, Anthony J

    2010-02-01

    In the present study, we examined an expanded model of sport commitment by adding two determinants (perceived costs and perceived competence) and behavioural commitment as a consequence of psychological commitment, as well as identifying psychological commitment as a mediator of relationships between determinants and behavioural commitment. Competitive female gymnasts (N = 304, age 8-18 years) completed relevant measures while coaches rated each gymnast's training behaviours as an indicator of behavioural commitment. Path analysis revealed that the best fitting model was one in which original determinants (enjoyment, involvement opportunities, investments, attractive alternatives) and an added determinant (perceived costs) predicted psychological commitment, in addition to investments and perceived costs directly predicting behavioural commitment. These results provide further, but partial, support for the sport commitment model and also suggest that additional determinants and behavioural consequences be considered in future research.

  3. Space Shuttle propulsion parameter estimation using optimal estimation techniques

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This fourth monthly progress report again contains corrections and additions to the previously submitted reports. The additions include a simplified SRB model that is directly incorporated into the estimation algorithm and provides the required partial derivatives. The resulting partial derivatives are analytical rather than numerical as would be the case using the SOBER routines. The filter and smoother routine developments have continued. These routines are being checked out.

  4. The utility of atmospheric analyses for the mitigation of artifacts in InSAR

    USGS Publications Warehouse

    Foster, James; Kealy, John; Cherubini, Tiziana; Businger, S.; Lu, Zhong; Murphy, Michael

    2013-01-01

    The numerical weather models (NWMs) developed by the meteorological community are able to provide accurate analyses of the current state of the atmosphere in addition to predictions of its future state. To date, most attempts to apply NWMs to estimate the refractivity of the atmosphere at the time of satellite synthetic aperture radar (SAR) data acquisitions have relied on predictive models. We test the hypothesis that performing a final assimilative routine, ingesting all available meteorological observations for the times of SAR acquisitions, and generating customized analyses of the atmosphere at those times will better mitigate atmospheric artifacts in differential interferograms. We find that, for our study area around Mount St. Helens (Amboy, Washington, USA), this approach is unable to model the refractive changes and provides no mean benefit for interferogram analysis. The performance is improved slightly by ingesting atmospheric delay estimates derived from the limited local GPS network; however, the addition of water vapor products from the GOES satellites reduces the quality of the corrections. We interpret our results to indicate that, even with this advanced approach, NWMs are not a reliable mitigation technique for regions such as Mount St. Helens with highly variable moisture fields and complex topography and atmospheric dynamics. It is possible, however, that the addition of more spatially dense meteorological data to constrain the analyses might significantly improve the performance of weather modeling of atmospheric artifacts in satellite radar interferograms.

  5. Building an Efficient Model for Afterburn Energy Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alves, S; Kuhl, A; Najjar, F

    2012-02-03

    Many explosives will release additional energy after detonation as the detonation products mix with the ambient environment. This additional energy release, referred to as afterburn, is due to combustion of undetonated fuel with ambient oxygen. While the detonation energy release occurs on a time scale of microseconds, the afterburn energy release occurs on a time scale of milliseconds with a potentially varying energy release rate depending upon the local temperature and pressure. This afterburn energy release is not accounted for in typical equations of state, such as the Jones-Wilkins-Lee (JWL) model, used for modeling the detonation of explosives. Here we construct a straightforward and efficient approach, based on experiments and theory, to account for this additional energy release in a way that is tractable for large finite element fluid-structure problems. Barometric calorimeter experiments have been executed in both nitrogen and air environments to investigate the characteristics of afterburn for C-4 and other materials. These tests, which provide pressure time histories, along with theoretical and analytical solutions provide an engineering basis for modeling afterburn with numerical hydrocodes. It is toward this end that we have constructed a modified JWL equation of state to account for afterburn effects on the response of structures to blast. The modified equation of state includes a two-phase afterburn energy release to represent variations in the energy release rate and an afterburn energy cutoff to account for partial reaction of the undetonated fuel.

  6. Value for money? Array genomic hybridization for diagnostic testing for genetic causes of intellectual disability.

    PubMed

    Regier, Dean A; Friedman, Jan M; Marra, Carlo A

    2010-05-14

    Array genomic hybridization (AGH) provides a higher detection rate than does conventional cytogenetic testing when searching for chromosomal imbalance causing intellectual disability (ID). AGH is more costly than conventional cytogenetic testing, and it remains unclear whether AGH provides good value for money. Decision analytic modeling was used to evaluate the trade-off between costs, clinical effectiveness, and benefit of an AGH testing strategy compared to a conventional testing strategy. The trade-off between cost and effectiveness was expressed via the incremental cost-effectiveness ratio. Probabilistic sensitivity analysis was performed via Monte Carlo simulation. The baseline AGH testing strategy led to an average cost increase of $217 (95% CI $172-$261) per patient and an additional 8.2 diagnoses in every 100 tested (0.082; 95% CI 0.044-0.119). The mean incremental cost per additional diagnosis was $2646 (95% CI $1619-$5296). Probabilistic sensitivity analysis demonstrated that there was a 95% probability that AGH would be cost effective if decision makers were willing to pay $4550 for an additional diagnosis. Our model suggests that using AGH instead of conventional karyotyping for most ID patients provides good value for money. Deterministic sensitivity analysis found that employing AGH after first-line cytogenetic testing had proven uninformative did not provide good value for money when compared to using AGH as first-line testing. Copyright (c) 2010 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
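
    The headline ratio follows directly from the reported increments (a worked restatement of the abstract's own numbers, not a new result):

        ICER = Δcost / Δeffect = $217 / 0.082 ≈ $2646 per additional diagnosis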

  7. The population benefit of evidence-based radiotherapy: 5-Year local control and overall survival benefits.

    PubMed

    Hanna, T P; Shafiq, J; Delaney, G P; Vinod, S K; Thompson, S R; Barton, M B

    2018-02-01

    To describe the population benefit of radiotherapy in a high-income setting if evidence-based guidelines were routinely followed. Australian decision tree models were utilized. Radiotherapy alone (RT) benefit was defined as the absolute proportional benefit of radiotherapy compared with no treatment for radical indications, and of radiotherapy over surgery alone for adjuvant indications. Chemoradiotherapy (CRT) benefit was the absolute incremental benefit of concurrent chemoradiotherapy over RT. Five-year local control (LC) and overall survival (OS) benefits were measured. Citation databases were systematically queried for benefit data. Meta-analysis and sensitivity analysis were performed. 48% of all cancer patients have indications for radiotherapy, 34% curative and 14% palliative. RT provides 5-year LC benefit in 10.4% of all cancer patients (95% Confidence Interval 9.3, 11.8) and 5-year OS benefit in 2.4% (2.1, 2.7). CRT provides 5-year LC benefit in an additional 0.6% of all cancer patients (0.5, 0.6), and 5-year OS benefit for an additional 0.3% (0.2, 0.4). RT benefit was greatest for head and neck (LC 32%, OS 16%), and cervix (LC 33%, OS 18%). CRT LC benefit was greatest for rectum (6%) and OS for cervix (3%) and brain (3%). Sensitivity analysis confirmed a robust model. Radiotherapy provides significant 5-year LC and OS benefits as part of evidence-based cancer care. CRT provides modest additional benefits. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  8. A poroelastic model describing nutrient transport and cell stresses within a cyclically strained collagen hydrogel.

    PubMed

    Vaughan, Benjamin L; Galie, Peter A; Stegemann, Jan P; Grotberg, James B

    2013-11-05

    In the creation of engineered tissue constructs, the successful transport of nutrients and oxygen to the contained cells is a significant challenge. In highly porous scaffolds subject to cyclic strain, the mechanical deformations can induce substantial fluid pressure gradients, which affect the transport of solutes. In this article, we describe a poroelastic model to predict the solid and fluid mechanics of a highly porous hydrogel subject to cyclic strain. The model was validated by matching its predicted penetration of a bead into the hydrogel against experimental observations, and it provides insight into nutrient transport. Additionally, the model provides estimates of the wall-shear stresses experienced by the cells embedded within the scaffold. These results provide insight into the mechanics of and convective nutrient transport within a cyclically strained hydrogel, which could lead to the improved design of engineered tissues. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  9. BPS sectors of the Skyrme model and their non-BPS extensions

    NASA Astrophysics Data System (ADS)

    Adam, C.; Foster, D.; Krusch, S.; Wereszczynski, A.

    2018-02-01

    Two recently found coupled Bogomol'nyi-Prasad-Sommerfield (BPS) submodels of the Skyrme model are further analyzed. First, we provide a geometrical formulation of the submodels in terms of the eigenvalues of the strain tensor. Second, we study their thermodynamical properties and show that the mean-field equations of state coincide at high pressure and read p = ρ̄/3. We also provide evidence that matter described by the first BPS submodel has some similarity with a Bose-Einstein condensate. Moreover, we show that extending the second submodel to a non-BPS model by including certain additional terms of the full Skyrme model does not spoil the respective ansatz, leading to an ordinary differential equation for the profile of the Skyrmion, for any value of the topological charge. This allows for an almost analytical description of the properties of Skyrmions in this model. In particular, we analytically study the breaking and restoration of the BPS property. Finally, we provide an explanation of the success of the rational map ansatz.

  10. A marked correlation function for constraining modified gravity models

    NASA Astrophysics Data System (ADS)

    White, Martin

    2016-11-01

    Future large scale structure surveys will provide increasingly tight constraints on our cosmological model. These surveys will report results on the distance scale and growth rate of perturbations through measurements of Baryon Acoustic Oscillations and Redshift-Space Distortions. It is interesting to ask: what further analyses should become routine, so as to test as-yet-unknown models of cosmic acceleration? Models which aim to explain the accelerated expansion rate of the Universe by modifications to General Relativity often invoke screening mechanisms which can imprint a non-standard density dependence on their predictions. This suggests density-dependent clustering as a `generic' constraint. This paper argues that a density-marked correlation function provides a density-dependent statistic which is easy to compute and report and requires minimal additional infrastructure beyond what is routinely available to such survey analyses. We give one realization of this idea and study it using low order perturbation theory. We encourage groups developing modified gravity theories to see whether such statistics provide discriminatory power for their models.
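
    For context, the marked correlation function referred to above is conventionally defined as (our summary of the standard definition from the clustering literature, not a quotation from this paper)

        M(r) = [1 + W(r)] / [1 + ξ(r)]

    where ξ(r) is the ordinary two-point correlation function and W(r) is its analogue computed with each object weighted by its mark, here a function of local density; M(r) tends to 1 when the marks carry no environmental information.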

  11. The Quantitative-MFG Test: A Linear Mixed Effect Model to Detect Maternal-Offspring Gene Interactions.

    PubMed

    Clark, Michelle M; Blangero, John; Dyer, Thomas D; Sobel, Eric M; Sinsheimer, Janet S

    2016-01-01

    Maternal-offspring gene interactions, aka maternal-fetal genotype (MFG) incompatibilities, are neglected in complex diseases and quantitative trait studies. They are implicated in birth to adult onset diseases but there are limited ways to investigate their influence on quantitative traits. We present the quantitative-MFG (QMFG) test, a linear mixed model where maternal and offspring genotypes are fixed effects and residual correlations between family members are random effects. The QMFG handles families of any size, common or general scenarios of MFG incompatibility, and additional covariates. We develop likelihood ratio tests (LRTs) and rapid score tests and show they provide correct inference. In addition, the LRT's alternative model provides unbiased parameter estimates. We show that testing the association of SNPs by fitting a standard model, which only considers the offspring genotypes, has very low power or can lead to incorrect conclusions. We also show that offspring genetic effects are missed if the MFG modeling assumptions are too restrictive. With genome-wide association study data from the San Antonio Family Heart Study, we demonstrate that the QMFG score test is an effective and rapid screening tool. The QMFG test therefore has important potential to identify pathways of complex diseases for which the genetic etiology remains to be discovered. © 2015 John Wiley & Sons Ltd/University College London.

  12. Genetic parameters for direct and maternal calving ease in Walloon dairy cattle based on linear and threshold models.

    PubMed

    Vanderick, S; Troch, T; Gillon, A; Glorieux, G; Gengler, N

    2014-12-01

    Calving ease scores from Holstein dairy cattle in the Walloon Region of Belgium were analysed using univariate linear and threshold animal models. Variance components and derived genetic parameters were estimated from a data set including 33,155 calving records. Included in the models were season, herd and sex of calf × age of dam classes × group of calvings interaction as fixed effects, and herd × year of calving, maternal permanent environment and animal direct and maternal additive genetic effects as random effects. Models were fitted with the genetic correlation between direct and maternal additive genetic effects either estimated or constrained to zero. Direct heritability for calving ease was approximately 8% with linear models and approximately 12% with threshold models. Maternal heritabilities were approximately 2 and 4%, respectively. The genetic correlation between direct and maternal additive effects was found to be not significantly different from zero. Models were compared in terms of goodness of fit and predictive ability. Criteria of comparison such as mean squared error, correlation between observed and predicted calving ease scores, as well as correlation between estimated breeding values, were estimated from 85,118 calving records. The results showed few differences between linear and threshold models, even though correlations between estimated breeding values from subsets of data for sires with progeny were 17 and 23% greater for direct and maternal genetic effects, respectively, with the linear model than with the threshold model. For the purpose of genetic evaluation for calving ease in Walloon Holstein dairy cattle, the linear animal model without covariance between direct and maternal additive effects was found to be the best choice. © 2014 Blackwell Verlag GmbH.

  13. Stoichiometry and kinetics of the anaerobic ammonium oxidation (Anammox) with trace hydrazine addition.

    PubMed

    Yao, Zongbao; Lu, Peili; Zhang, Daijun; Wan, Xinyu; Li, Yulian; Peng, Shuchan

    2015-12-01

    The purpose of this study is to investigate the stoichiometry and kinetics of anaerobic ammonium oxidation (Anammox) with trace hydrazine addition. The stoichiometry was established based on the electron balance of the Anammox process with trace N2H4 addition. The stoichiometric coefficients were determined by the proton consumption and the changes in substrates and products. It was found that trace N2H4 addition can increase the yield of Anammox bacteria (AnAOB) and reduce NO3(-) yield, which enhances the Anammox process. Subsequently, a kinetic model of Anammox with trace N2H4 addition was developed, and the parameters of the anaerobic degradation model of N2H4 were obtained for the first time. The maximum specific substrate utilization rate, half-saturation constant and inhibition constant of N2H4 were 25.09 mg N/g VSS/d, 10.42 mg N/L and 1393.88 mg N/L, respectively. These kinetic parameters might provide important information for the engineering applications of Anammox with trace N2H4 addition. Copyright © 2015 Elsevier Ltd. All rights reserved.
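
    The three reported parameters (maximum rate, half-saturation constant, inhibition constant) are those of a substrate-inhibition rate law; the abstract does not state the functional form, but a Haldane/Andrews-type expression is the conventional way to combine them:

        r(S) = r_max · S / (K_S + S + S²/K_I)

    with r_max = 25.09 mg N/g VSS/d, K_S = 10.42 mg N/L and K_I = 1393.88 mg N/L under this reading.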

  14. “AQMEII Status Update”

    EPA Science Inventory

    “AQMEII Status Update”: This presentation provided an overview and status update of the Air Quality Model Evaluation International Initiative (AQMEII) to participants of a workshop of the Task Force on Hemispheric Transport of Air Pollution (TF-HTAP). In addition, the p...

  15. Maximum likelihood estimation of finite mixture model for economic data

    NASA Astrophysics Data System (ADS)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

    A finite mixture model is a mixture model with finite dimension. These models provide a natural representation of heterogeneity in a finite number of latent classes; finite mixture models are also known as latent class models or unsupervised learning models. Recently, maximum likelihood estimation of finite mixture models has drawn considerable attention from statisticians, mainly because maximum likelihood estimation is a powerful statistical method that provides consistent estimates as the sample size increases to infinity. Thus, maximum likelihood estimation is used to fit a finite mixture model in the present paper in order to explore the relationship between nonlinear economic data. In this paper, a two-component normal mixture model is fitted by maximum likelihood estimation in order to investigate the relationship between stock market price and rubber price for the sampled countries. The results show that there is a negative relationship between rubber price and stock market price for Malaysia, Thailand, the Philippines and Indonesia.
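
    A minimal sketch of this kind of fit (our illustration, not the authors' code; the synthetic numbers merely stand in for paired rubber-price and stock-index observations) using the EM-based maximum likelihood estimator in scikit-learn:

        # Fit a two-component bivariate normal mixture by maximum likelihood (EM).
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # Hypothetical paired observations: column 0 = rubber price, column 1 = stock index.
        x = np.vstack([
            rng.multivariate_normal([1.5, 900], [[0.04, -2.0], [-2.0, 400]], 200),
            rng.multivariate_normal([2.5, 1400], [[0.09, -6.0], [-6.0, 900]], 200),
        ])

        gm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(x)
        for k in range(2):
            cov = gm.covariances_[k]
            corr = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])  # within-component correlation
            print(f"component {k}: weight={gm.weights_[k]:.2f}, "
                  f"mean={gm.means_[k].round(1)}, corr={corr:+.2f}")

    The sign of the within-component correlation plays the role of the negative relationship reported above.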

  16. Energy and nutrient recovery from anaerobic treatment of organic wastes

    NASA Astrophysics Data System (ADS)

    Henrich, Christian-Dominik

    The objective of the research was to develop a complete systems design and predictive model framework for a series of linked processes capable of providing treatment of landfill leachate while simultaneously recovering nutrients and bioenergy from the waste inputs. This proposed process includes an "Ammonia Recovery Process" (ARP) consisting of: (1) ammonia desorption requiring leachate pH adjustment with lime or sodium hydroxide addition, followed by (2) ammonia re-absorption into a 6-molar sulfuric acid spray tower, followed by (3) biological activated sludge treatment of soluble organic residuals (BOD), followed by (4) high-rate algal post-treatment and, finally, (5) an optional anaerobic digestion process for algal and bacterial biomass and/or supplemental waste fermentation providing the potential for additional nutrient and energy recovery. In addition to the value provided by the waste-treatment function of the overall process, each of the sub-processes would provide valuable co-products offering potential GHG credit through direct fossil-fuel replacement, or replacement of products requiring fossil fuels. These valuable co-products include (1) ammonium sulfate fertilizer, (2) bacterial biomass, (3) algal biomass providing high-protein feeds and oils for biodiesel production and (4) methane bio-fuels. Laboratory and pilot reactors were constructed and operated, providing data supporting the quantification and modeling of the ARP. Growth parameters and stoichiometric coefficients were determined, allowing for design of the leachate activated sludge treatment sub-component. Laboratory and pilot algal reactors were constructed and operated, and provided data that supported the determination of leachate organic/inorganic-nitrogen ratio and loading rates, allowing optimum performance of high-rate algal post-treatment. A modular and expandable computer program was developed, which provided a systems model framework capable of predicting individual component and overall performance. The overall systems model software, ENRAT, predicted that a full-scale operation to treat 18,750 L leachate/day would need an Ammonia Recovery Process consisting of 88,300 L of total gas transfer column volume, an activated sludge system of 74,417 L, and an algal post-treatment raceway of 683 m2 (30 cm depth). The ARP would consume 262.5 L/day of 6N sulfuric acid and produce 16.12 kg-N/day ammonium sulfate. The activated sludge system and algal post-treatment would produce 900 g-VS/day (or 44.6 L 2% sludge) and 6.83 kg-VS/day (or 341.6 L 2% sludge) of bacterial and algal biomass, respectively.

  17. Mouse Models for Drug Discovery. Can New Tools and Technology Improve Translational Power?

    PubMed

    Zuberi, Aamir; Lutz, Cathleen

    2016-12-01

    The use of mouse models in biomedical research and preclinical drug evaluation is on the rise. The advent of new molecular genome-altering technologies such as CRISPR/Cas9 allows for genetic mutations to be introduced into the germ line of a mouse faster and less expensively than previous methods. In addition, the rapid progress in the development and use of somatic transgenesis using viral vectors, as well as manipulations of gene expression with siRNAs and antisense oligonucleotides, allow for even greater exploration into genomics and systems biology. These technological advances come at a time when cost reductions in genome sequencing have led to the identification of pathogenic mutations in patient populations, providing unprecedented opportunities in the use of mice to model human disease. The ease of genetic engineering in mice also offers a potential paradigm shift in resource sharing and the speed by which models are made available in the public domain. Predictively, the knowledge alone that a model can be quickly remade will provide relief to resources encumbered by licensing and Material Transfer Agreements. For decades, mouse strains have provided an exquisite experimental tool to study the pathophysiology of disease and assess therapeutic options in a genetically defined system. However, a major limitation of the mouse has been the limited genetic diversity associated with common laboratory mice. This has been overcome with the recent development of the Collaborative Cross and Diversity Outbred mice. These strains provide new tools capable of replicating genetic diversity approaching that found in human populations. The Collaborative Cross and Diversity Outbred strains thus provide a means to observe and characterize toxicity or efficacy of new therapeutic drugs for a given population. The combination of traditional and contemporary mouse genome editing tools, along with the addition of genetic diversity in new modeling systems, is synergistic and serves to make the mouse a better model for biomedical research, enhancing the potential for preclinical drug discovery and personalized medicine. © The Author 2016. Published by Oxford University Press.

  18. Using Toulmin analysis to analyse an instructor's proof presentation in abstract algebra

    NASA Astrophysics Data System (ADS)

    Fukawa-connelly, Timothy

    2014-01-01

    This paper provides a method for analysing undergraduate teaching of proof-based courses using Toulmin's model (1969) of argumentation. It presents a case study of one instructor's presentation of proofs. The analysis shows that the instructor presents different levels of detail in different proofs; thus, the students have an inconsistent set of written models for their work. Similarly, the analysis shows that the details the instructor says aloud differ from what she writes down. Although her verbal commentary provides additional detail and appears to have pedagogical value, for instance, by modelling thinking that supports proof writing, this value might be better realized if she were to change her teaching practices.

  19. User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework

    USGS Publications Warehouse

    Markstrom, Steven L.; Koczot, Kathryn M.

    2008-01-01

    The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.

  20. The development of an acute care case manager orientation.

    PubMed

    Strzelecki, S; Brobst, R

    1997-01-01

    The authors describe the development of an inpatient acute care case manager orientation in a community hospital. Benner's application of the Dreyfus model of skill acquisition provides the basis for the orientation program. The candidates for the case manager position were expert clinicians. Because of the role change it was projected that they would function as advanced beginners. It was also predicted that, as the case managers progressed within the role, the educational process would need to be adapted to facilitate progression of skills to the proficient level. Feedback from participants reinforced that the model supported the case manager in the role transition. In addition, the model provided a predictive framework for ongoing educational activities.

  1. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand alone analysis codes that result in the streamlining of the exchange of data between programs, reducing errors and improving efficiency.

  2. HIFOGS: Its design, operations and calibration

    NASA Astrophysics Data System (ADS)

    Witteborn, Fred C.; Cohen, Martin; Bregman, Jesse D.; Heere, Karen R.; Greene, Thomas P.; Wooden, Diane H.

    The High-efficiency, Infrared Faint Object Grating Spectrometer (HIFOGS) provides spectral coverage of selectable portions of the 3 to 18 micron range at resolving powers from 100 to 1000 using 120 Si:Bi detectors. Additional coverage to 30 microns is provided by a bank of 32 Si:P detectors. Selectable apertures, gratings and band-pass filters provide flexibility to this system. Software for operation of HIFOGS and reduction of the data runs on a Macintosh computer. HIFOGS has been used to establish celestial flux standards using three independent approaches: comparison to star models, comparison to asteroid models and comparison to laboratory blackbodies. These standards are expected to have wide application in astronomical thermal-infrared spectroscopy.

  3. Improving Hydrological Simulations by Incorporating GRACE Data for Parameter Calibration

    NASA Astrophysics Data System (ADS)

    Bai, P.

    2017-12-01

    Hydrological model parameters are commonly calibrated by observed streamflow data. This calibration strategy is questioned when the modeled hydrological variables of interest are not limited to streamflow. Well-performed streamflow simulations do not guarantee the reliable reproduction of other hydrological variables. One of the reasons is that hydrological model parameters are not reasonably identified. The Gravity Recovery and Climate Experiment (GRACE) satellite-derived total water storage change (TWSC) data provide an opportunity to constrain hydrological model parameterizations in combination with streamflow observations. We constructed a multi-objective calibration scheme based on GRACE-derived TWSC and streamflow observations, with the aim of improving the parameterizations of hydrological models. The multi-objective calibration scheme was compared with the traditional single-objective calibration scheme, which is based only on streamflow observations. Two monthly hydrological models were employed on 22 Chinese catchments with different hydroclimatic conditions. The model evaluation was performed using observed streamflows, GRACE-derived TWSC, and evapotranspiration (ET) estimates from flux towers and from the water balance approach. Results showed that the multi-objective calibration provided more reliable TWSC and ET simulations without significant deterioration in the accuracy of streamflow simulations compared with the single-objective calibration. In addition, the improvements in TWSC and ET simulations were more significant in relatively dry catchments than in relatively wet catchments. This study highlights the importance of including additional constraints besides streamflow observations in the parameter estimation to improve the performances of hydrological models.
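
    A minimal sketch of a weighted criterion of this kind (our illustration, not the study's code; the equal 0.5/0.5 weighting is an assumption, and the paper may combine the objectives differently):

        # Combine streamflow and GRACE-derived TWSC skill into one calibration objective.
        import numpy as np

        def nse(sim, obs):
            """Nash-Sutcliffe efficiency; 1 indicates a perfect fit."""
            sim, obs = np.asarray(sim, float), np.asarray(obs, float)
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def multi_objective(sim_q, obs_q, sim_twsc, obs_twsc, w=0.5):
            """Scalar cost to minimize: weighted shortfall from perfect NSE
            for streamflow (q) and total water storage change (twsc)."""
            return w * (1.0 - nse(sim_q, obs_q)) + (1.0 - w) * (1.0 - nse(sim_twsc, obs_twsc))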

  4. Neuropsychiatric SLE: from animal model to human.

    PubMed

    Pikman, R; Kivity, S; Levy, Y; Arango, M-T; Chapman, J; Yonath, H; Shoenfeld, Y; Gofrit, S G

    2017-04-01

    Animal models are a key element in disease research and treatment. In the field of neuropsychiatric lupus research, inbred, transgenic and disease-induced mice provide an opportunity to study the pathogenic routes of this multifactorial illness. In addition to achieving a better understanding of the immune mechanisms underlying the disease onset, supplementary metabolic and endocrine influences have been discovered and investigated. The ever-expanding knowledge about the pathologic events that occur at disease inception enables us to explore new drugs and therapeutic approaches further and to test them using the same animal models. Discovery of the molecular targets that constitute the pathogenic basis of the disease along with scientific advancements allow us to target these molecules with monoclonal antibodies and other specific approaches directly. This novel therapy, termed "targeted biological medication" is a promising endeavor towards producing drugs that are more effective and less toxic. Further work to discover additional molecular targets in lupus' pathogenic mechanism and to produce drugs that neutralize their activity is needed to provide patients with safe and efficient methods of controlling and treating the disease.

  5. Modifications and Modelling of the Fission Surface Power Primary Test Circuit (FSP-PTC)

    NASA Technical Reports Server (NTRS)

    Garber, Ann E.

    2008-01-01

    An actively pumped alkali metal flow circuit, designed and fabricated at the NASA Marshall Space Flight Center, underwent a range of tests at MSFC in early 2007. During this period, system transient responses and the performance of the liquid metal pump were evaluated. In May of 2007, the circuit was drained and cleaned to prepare for multiple modifications: the addition of larger upper and lower reservoirs, the installation of an annular linear induction pump (ALIP), and the inclusion of the Single Flow Cell Test Apparatus (SFCTA) in the test section. Performance of the ALIP, provided by Idaho National Laboratory (INL), will be evaluated when testing resumes. The SFCTA, which will be tested simultaneously, will provide data on alkali metal flow behavior through the simulated core channels and assist in the development of a second generation thermal simulator. Additionally, data from the first round of testing has been used to refine the working system model, developed using the Generalized Fluid System Simulation Program (GFSSP). This paper covers the modifications of the FSP-PTC and the updated GFSSP system model.

  6. Pathway Tools version 19.0 update: software for pathway/genome informatics and systems biology

    PubMed Central

    Latendresse, Mario; Paley, Suzanne M.; Krummenacker, Markus; Ong, Quang D.; Billington, Richard; Kothari, Anamika; Weaver, Daniel; Lee, Thomas; Subhraveti, Pallavi; Spaulding, Aaron; Fulcher, Carol; Keseler, Ingrid M.; Caspi, Ron

    2016-01-01

    Pathway Tools is a bioinformatics software environment with a broad set of capabilities. The software provides genome-informatics tools such as a genome browser, sequence alignments, a genome-variant analyzer and comparative-genomics operations. It offers metabolic-informatics tools, such as metabolic reconstruction, quantitative metabolic modeling, prediction of reaction atom mappings and metabolic route search. Pathway Tools also provides regulatory-informatics tools, such as the ability to represent and visualize a wide range of regulatory interactions. This article outlines the advances in Pathway Tools in the past 5 years. Major additions include components for metabolic modeling, metabolic route search, computation of atom mappings and estimation of compound Gibbs free energies of formation; addition of editors for signaling pathways, for genome sequences and for cellular architecture; storage of gene essentiality data and phenotype data; display of multiple alignments, and of signaling and electron-transport pathways; and development of Python and web-services application programming interfaces. Scientists around the world have created more than 9800 Pathway/Genome Databases by using Pathway Tools, many of which are curated databases for important model organisms. PMID:26454094

  7. The potential application of European market research data in dietary exposure modelling of food additives.

    PubMed

    Tennant, David Robin; Bruyninckx, Chris

    2018-03-01

    Consumer exposure assessments for food additives are incomplete without information about the proportions of foods in each authorised category that contain the additive. Such information has been difficult to obtain, but the Mintel Global New Products Database (GNPD) provides information about product launches across Europe over the past 20 years. These data can be searched to identify products with specific additives listed on product labels, and the numbers compared with total product launches for food and drink categories in the same database to determine the frequency of occurrence. There are uncertainties associated with the data but these can be managed by adopting a cautious and conservative approach. GNPD data can be mapped with authorised food categories and with food descriptions used in the EFSA Comprehensive European Food Consumption Surveys Database for exposure modelling. The data, when presented as percent occurrence, could be incorporated into the EFSA ANS Panel's 'brand-loyal/non-brand-loyal' exposure model in a quantitative way. Case studies of preservative, antioxidant, colour and sweetener additives showed that the impact of including occurrence data is greatest in the non-brand-loyal scenario. Recommendations for future research include identifying occurrence data for alcoholic beverages, linking regulatory food codes, FoodEx and GNPD product descriptions, developing the use of occurrence data for carry-over foods and improving understanding of brand loyalty in consumer exposure models.

  8. Calibration data Analysis Package (CAP): An IDL based widget application for analysis of X-ray calibration data

    NASA Astrophysics Data System (ADS)

    Vaishali, S.; Narendranath, S.; Sreekumar, P.

    An IDL (Interactive Data Language) based widget application developed for the calibration of the C1XS instrument (Narendranath et al., 2010) on Chandrayaan-1 has been modified to provide a generic package for the analysis of data from X-ray detectors. The package supports files in ASCII as well as FITS format. Data can be fitted with a list of inbuilt functions to derive the spectral redistribution function (SRF). We have incorporated functions such as 'HYPERMET' (Philips & Marlow 1976), including non-Gaussian components in the SRF such as a low-energy tail, low-energy shelf and escape peak. In addition, users can incorporate additional models which may be required to model detector-specific features. Spectral fits use the routine 'mpfit', which implements the Levenberg-Marquardt least-squares fitting method. The SRF derived from this tool can be fed into an accompanying program to generate a redistribution matrix file (RMF) compatible with the X-ray spectral analysis package XSPEC. The tool provides a user-friendly interface that helps beginners, while providing transparency and advanced features for experts.
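
    As a hedged sketch of what such an SRF fit looks like (our simplified illustration in Python rather than IDL, not the package's code; HYPERMET's full form has more components, and the test values are invented):

        # Fit a simplified HYPERMET-style peak (Gaussian + low-energy shelf)
        # by Levenberg-Marquardt least squares.
        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import erfc

        def srf(e, amp, mu, sigma, shelf):
            gauss = amp * np.exp(-0.5 * ((e - mu) / sigma) ** 2)
            step = shelf * amp * 0.5 * erfc((e - mu) / (np.sqrt(2) * sigma))  # low-energy shelf
            return gauss + step

        e = np.linspace(4.0, 8.0, 400)                   # energy axis, keV
        truth = srf(e, 1000.0, 5.9, 0.08, 0.02)          # synthetic Mn K-alpha-like peak
        data = truth + np.random.default_rng(1).normal(0, 5, e.size)

        popt, _ = curve_fit(srf, e, data, p0=[900.0, 5.8, 0.1, 0.01])  # 'lm' method by default
        print("amp=%.1f  mu=%.3f keV  sigma=%.3f keV  shelf=%.4f" % tuple(popt))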

  9. Finite Element Analysis of Active and Sensory Thermopiezoelectric Composite Materials. Degree awarded by Northwestern Univ., Dec. 2000

    NASA Technical Reports Server (NTRS)

    Lee, Ho-Jun

    2001-01-01

    Analytical formulations are developed to account for the coupled mechanical, electrical, and thermal response of piezoelectric composite materials. The coupled response is captured at the material level through the thermopiezoelectric constitutive equations and leads to the inherent capability to model both the sensory and active responses of piezoelectric materials. A layerwise laminate theory is incorporated to provide more accurate analysis of the displacements, strains, stresses, electric fields, and thermal fields through-the-thickness. Thermal effects which arise from coefficient of thermal expansion mismatch, pyroelectric effects, and temperature dependent material properties are explicitly accounted for in the formulation. Corresponding finite element formulations are developed for piezoelectric beam, plate, and shell elements to provide a more generalized capability for the analysis of arbitrary piezoelectric composite structures. The accuracy of the current formulation is verified with comparisons from published experimental data and other analytical models. Additional numerical studies are also conducted to demonstrate additional capabilities of the formulation to represent the sensory and active behaviors. A future plan of experimental studies is provided to characterize the high temperature dynamic response of piezoelectric composite materials.

  10. Real-time slicing algorithm for Stereolithography (STL) CAD model applied in additive manufacturing industry

    NASA Astrophysics Data System (ADS)

    Adnan, F. A.; Romlay, F. R. M.; Shafiq, M.

    2018-04-01

    Owing to the advent of Industry 4.0, the need for further evaluating the computational processes applied in additive manufacturing, particularly slicing, is non-trivial. This paper evaluates a real-time slicing algorithm for slicing an STL-formatted computer-aided design (CAD) model. A line-plane intersection equation was applied to perform the slicing procedure at any given height. The application of this algorithm has been found to provide a better computational time regardless of the number of facets in the STL model. The performance of this algorithm is evaluated by comparing the results of the computational time for different geometries.
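
    The core of such a slicer is the edge-plane intersection; a minimal sketch of the general technique (our illustration, not the paper's implementation):

        # Intersect one STL facet with the slicing plane z = h.
        def slice_triangle(v0, v1, v2, h, eps=1e-9):
            """Return the (x, y) points where the plane z = h cuts the
            triangle (v0, v1, v2); each vertex is an (x, y, z) tuple."""
            pts = []
            for a, b in ((v0, v1), (v1, v2), (v2, v0)):
                za, zb = a[2], b[2]
                if (za - h) * (zb - h) < 0:           # edge straddles the plane
                    t = (h - za) / (zb - za)           # parametric intersection point
                    pts.append((a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1])))
                elif abs(za - h) < eps and a[:2] not in pts:
                    pts.append(a[:2])                  # vertex lies on the plane
            return pts

        # Example: one facet cut at h = 0.5 yields one contour segment.
        print(slice_triangle((0, 0, 0), (1, 0, 1), (0, 1, 1), 0.5))

    Looping this over all facets at a given height and chaining the resulting segments into closed contours yields the slice; the per-facet cost is constant, which is consistent with the runtime behaviour reported above.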

  11. SMP: A solid modeling program version 2.0

    NASA Technical Reports Server (NTRS)

    Randall, D. P.; Jones, K. H.; Vonofenheim, W. H.; Gates, R. L.; Matthews, C. G.

    1986-01-01

    The Solid Modeling Program (SMP) provides the capability to model complex solid objects through the composition of primitive geometric entities. In addition to the construction of solid models, SMP has extensive facilities for model editing, display, and analysis. The geometric model produced by the software system can be output in a format compatible with existing analysis programs such as PATRAN-G. The present version of the SMP software supports six primitives: boxes, cones, spheres, paraboloids, tori, and trusses. The details for creating each of the major primitive types are presented. The analysis capabilities of SMP, including interfaces to existing analysis programs, are discussed.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindsey, Nicholas C.

    The growth of additive manufacturing as a disruptive technology poses nuclear proliferation concerns worthy of serious consideration. Additive manufacturing began in the early 1980s with technological advances in polymer manipulation, computer capabilities, and computer-aided design (CAD) modeling. It was originally limited to rapid prototyping; however, it eventually developed into a complete means of production that has slowly penetrated the consumer market. Today, additive manufacturing machines can produce complex and unique items in a vast array of materials including plastics, metals, and ceramics. These capabilities have democratized the manufacturing industry, allowing almost anyone to produce items as simple as cup holders or as complex as jet fuel nozzles. Additive manufacturing, or three-dimensional (3D) printing as it is commonly called, relies on CAD files created or shared by individuals with additive manufacturing machines to produce a 3D object from a digital model. This sharing of files means that a 3D object can be scanned or rendered as a CAD model in one country, and then downloaded and printed in another country, allowing items to be shared globally without physically crossing borders. The sharing of CAD files online has been a challenging task for the export controls regime to manage over the years, and additive manufacturing could make these transfers more common. In this sense, additive manufacturing is a disruptive technology not only within the manufacturing industry but also within the nuclear nonproliferation world. This paper provides an overview of the proliferation concerns raised by additive manufacturing.

  13. Interfacing the Generalized Fluid System Simulation Program with the SINDA/G Thermal Program

    NASA Technical Reports Server (NTRS)

    Schallhorn, Paul; Palmiter, Christopher; Farmer, Jeffery; Lycans, Randall; Tiller, Bruce

    2000-01-01

    A general purpose, one-dimensional fluid flow code has been interfaced with the thermal analysis program SINDA/G. The flow code, GFSSP, is capable of analyzing steady state and transient flow in a complex network. The flow code is capable of modeling several physical phenomena, including compressibility effects, phase changes, body forces (such as gravity and centrifugal forces) and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development was conducted in two phases. This paper describes the first phase, which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling. The second phase, full transient conjugate heat transfer modeling, will be addressed in a later paper. Phase 1 development has been benchmarked against an analytical solution with excellent agreement. Additional test cases for each development phase demonstrate desired features of the interface. The results of the benchmark case, three additional test cases and a practical application are presented herein.

  14. A Comparison of Air Force Data Systems

    DTIC Science & Technology

    1993-08-01

    a software cost model, SPQR. This model was chosen because it provides a straightforward means of modeling the enhancements as they would ... estimated by SPQR (23,917) by $69 per hour for a total of $1,650,273. An additional 10 percent was added for generating or modifying the Middleware ... Acronyms: SLOC, source lines of code; SPO, System Program Office; SPQR, System Product Quality Reporting; SSC, Standard Systems Center; SSI, system-to-system ...

  15. User interface for ground-water modeling: Arcview extension

    USGS Publications Warehouse

    Tsou, Ming‐shu; Whittemore, Donald O.

    2001-01-01

    Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. ArcView is the most widely used Windows-based GIS software that provides a robust user-friendly interface to facilitate data handling and display. An extension is an add-on program to ArcView that provides additional specialized functions. An ArcView interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension for facilitating modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used for linking user dialogs and model input files. The ArcView interface utilizes the capabilities of the 3D Analyst extension. Models can be automatically calibrated through the ArcView interface by external linking to such programs as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.

  16. Theory, development, and applicability of the surface water hydrologic model CASC2D

    NASA Astrophysics Data System (ADS)

    Downer, Charles W.; Ogden, Fred L.; Martin, William D.; Harmon, Russell S.

    2002-02-01

    Numerical tests indicate that Hortonian runoff mechanisms benefit from scaling effects that non-Hortonian runoff mechanisms do not share. This potentially makes Hortonian watersheds more amenable to physically based modelling provided that the physically based model employed properly accounts for rainfall distribution and initial soil moisture conditions, to which these types of model are highly sensitive. The distributed Hortonian runoff model CASC2D has been developed and tested for the US Army over the past decade. The purpose of the model is to provide the Army with superior predictions of runoff and stream-flow compared with the standard lumped parameter model HEC-1. The model is also to be used to help minimize negative effects on the landscape caused by US armed forces training activities. Development of the CASC2D model is complete and the model has been tested and applied at several locations. These applications indicate that the model can realistically reproduce hydrographs when properly applied. These applications also indicate that there may be many situations where the model is inadequate. Because of this, the Army is pursuing development of a new model, GSSHA, that will provide improved numerical stability and incorporate additional stream-flow-producing mechanisms and improved hydraulics.

  17. Robust time and frequency domain estimation methods in adaptive control

    NASA Technical Reports Server (NTRS)

    Lamaire, Richard Orville

    1987-01-01

    A robust identification method was developed for use in an adaptive control system. This type of estimator is called a robust estimator, since it is robust to the effects of both unmodeled dynamics and an unmeasurable disturbance. The development of the robust estimator was motivated by a need to provide guarantees in the identification part of an adaptive controller. To enable the design of a robust control system, a nominal model as well as a frequency-domain bounding function on the modeling uncertainty associated with this nominal model must be provided. Two estimation methods are presented for finding parameter estimates and, hence, a nominal model. One of these methods is based on the well-developed field of time-domain parameter estimation. The second method uses a type of weighted least-squares fit to a frequency-domain estimated model. The frequency-domain estimator is shown to perform better, in general, than the time-domain parameter estimator. In addition, a methodology for finding a frequency-domain bounding function on the disturbance is used to compute a frequency-domain bounding function on the additive modeling error due to the effects of the disturbance and the use of finite-length data. The performance of the robust estimator in both open-loop and closed-loop situations is examined through the use of simulations.

  18. Multiverse Space-Antispace Dual Calabi-Yau `Exciplex-Zitterbewegung' Particle Creation

    NASA Astrophysics Data System (ADS)

    Amoroso, Richard L.

    Modeling the 'creation/emergence' of matter from spacetime is as old as modern cosmology itself and not without controversy within each model, such as the Static, Steady-State, Big Bang or Multiverse Continuous-State models. In this paper we present only a brief, primitive introduction to a new form of 'Exciplex-Zitterbewegung' dual space-antispace vacuum particle creation, applicable especially to Big Bang alternatives which are well known but ignored; Hubble discovered 'redshift', not a Doppler expansion of the universe, which remains the currently popular interpretation. Holographic Anthropic Multiverse cosmology provides viable alternatives to all seemingly sacrosanct pillars of the Big Bang. A model for Multiverse Space-Antispace Dual Calabi-Yau 'Exciplex-Zitterbewegung' particle creation has only become possible by incorporating the additional degrees of freedom that a complex-dimensional extended Yang-Mills Kaluza-Klein correspondence provides.

  19. Developments in Coastal Ocean Modeling

    NASA Astrophysics Data System (ADS)

    Allen, J. S.

    2001-12-01

    Capabilities in modeling continental shelf flow fields have improved markedly in the last several years. Progress is being made toward the long term scientific goal of utilizing numerical circulation models to interpolate, or extrapolate, necessarily limited field measurements to provide additional full-field information describing the behavior of, and providing dynamical rationalizations for, complex observed coastal flow. The improvement in modeling capabilities has been due to several factors including an increase in computer power and, importantly, an increase in experience of modelers in formulating relevant numerical experiments and in analyzing model results. We demonstrate present modeling capabilities and limitations by discussion of results from recent studies of shelf circulation off Oregon and northern California (joint work with Newberger, Gan, Oke, Pullen, and Wijesekera). Strong interactions between wind-forced coastal currents and continental shelf topography characterize the flow regimes in these cases. Favorable comparisons of model and measured alongshore currents and other variables provide confidence in the model-produced fields. The dependence of the mesoscale circulation, including upwelling and downwelling fronts and flow instabilities, on the submodel used to parameterize the effects of small scale turbulence, is discussed. Analyses of model results to provide explanations for the observed, but previously unexplained, alongshore variability in the intensity of coastal upwelling, which typically results in colder surface water south of capes, and the observed development in some locations of northward currents near the coast in response to the relaxation of southward winds, are presented.

  20. Models for nearly every occasion: Part I - One box models.

    PubMed

    Hewett, Paul; Ganser, Gary H

    2017-01-01

    The standard "well mixed room," "one box" model cannot be used to predict occupational exposures whenever the scenario involves the use of local controls. New "constant emission" one box models are proposed that permit either local exhaust or local exhaust with filtered return, coupled with general room ventilation or the recirculation of a portion of the general room exhaust. New "two box" models are presented in Part II of this series. Both steady state and transient models were developed. The steady state equation for each model, including the standard one box steady state model, is augmented with an additional factor reflecting the fraction of time the substance was generated during each task. This addition allows the easy calculation of the average exposure for cyclic and irregular emission patterns, provided the starting and ending concentrations are zero or near zero, or the cumulative time across all tasks is long (e.g., several tasks to a full shift). The new models introduce additional variables, such as the efficiency of the local exhaust to immediately capture freshly generated contaminant and the filtration efficiency whenever filtered exhaust is returned to the workspace. Many of the model variables are knowable (e.g., room volume and ventilation rate). A structured procedure for calibrating a model to a work scenario is introduced that can be applied to both continuous and cyclic processes. The "calibration" procedure generates estimates of the generation rate and all of remaining unknown model variables.

  1. Causal Mediation Analysis of Survival Outcome with Multiple Mediators.

    PubMed

    Huang, Yen-Tsung; Yang, Hwai-I

    2017-05-01

    Mediation analyses have been a popular approach to investigate the effect of an exposure on an outcome through a mediator. Mediation models with multiple mediators have been proposed for continuous and dichotomous outcomes. However, development of multimediator models for survival outcomes is still limited. We present methods for multimediator analyses using three survival models: Aalen additive hazard models, Cox proportional hazard models, and semiparametric probit models. Effects through mediators can be characterized by path-specific effects, for which definitions and identifiability assumptions are provided. We derive closed-form expressions for path-specific effects for the three models, which are intuitively interpreted using a causal diagram. Mediation analyses using Cox models under the rare-outcome assumption and Aalen additive hazard models consider effects on the log hazard ratio and the hazard difference, respectively; analyses using semiparametric probit models consider effects on the difference in transformed survival time and on survival probability. The three models were applied to a hepatitis study where we investigated effects of hepatitis C on liver cancer incidence mediated through baseline and/or follow-up hepatitis B viral load. The three methods show consistent results on their respective effect scales, which suggest an adverse estimated effect of hepatitis C on liver cancer not mediated through hepatitis B, and a protective estimated effect mediated through the baseline (and possibly follow-up) hepatitis B viral load. Causal mediation analyses of survival outcomes with multiple mediators are thus developed for additive hazard, proportional hazard, and probit models, with their utility demonstrated in a hepatitis study.

  2. Uncertain programming models for portfolio selection with uncertain returns

    NASA Astrophysics Data System (ADS)

    Zhang, Bo; Peng, Jin; Li, Shengguo

    2015-10-01

    In an indeterminate economic environment, experts' knowledge about the returns of securities involves much uncertainty rather than randomness. This paper discusses the portfolio selection problem in an uncertain environment in which security returns cannot be well reflected by historical data but can be evaluated by experts. In the paper, returns of securities are assumed to be given by uncertain variables. According to various decision criteria, the portfolio selection problem in an uncertain environment is formulated as an expected-variance-chance model and a chance-expected-variance model using uncertain programming. Within the framework of uncertainty theory, and for the convenience of solving the models, some crisp equivalents are discussed under different conditions. In addition, a hybrid intelligent algorithm is designed to provide a general method for solving the new models in general cases. Finally, two numerical examples are provided to show the performance and applications of the models and algorithm.
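
    The uncertain-programming formulations are not reproduced here, but the sketch below solves the crisp expected-variance trade-off that underlies them with scipy; the returns, covariance, and risk-aversion weight are illustrative.

        import numpy as np
        from scipy.optimize import minimize

        mu = np.array([0.08, 0.12, 0.05])              # expected returns (illustrative)
        cov = np.array([[0.10, 0.02, 0.01],
                        [0.02, 0.15, 0.03],
                        [0.01, 0.03, 0.05]])           # return covariance (illustrative)
        lam = 2.0                                      # risk-aversion weight

        def objective(w):                              # maximize E[r] - lam * Var
            return -(mu @ w - lam * w @ cov @ w)

        res = minimize(objective, x0=np.ones(3) / 3,
                       bounds=[(0, 1)] * 3,            # no short selling
                       constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}])
        print(np.round(res.x, 3))                      # optimal portfolio weights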

  3. Driver's mental workload prediction model based on physiological indices.

    PubMed

    Yan, Shengyuan; Tran, Cong Chi; Wei, Yingying; Habiyaremye, Jean Luc

    2017-09-15

    Developing an early warning model to predict the driver's mental workload (MWL) is critical and helpful, especially for new or less experienced drivers. The present study aims to investigate the correlation between new drivers' MWL and their work performance, as measured by the number of errors. Additionally, the group method of data handling (GMDH) is used to establish the driver's MWL predictive model based on a subjective rating (the NASA task load index [NASA-TLX]) and six physiological indices. The results indicate that the NASA-TLX and the number of errors are positively correlated, and the predictive model shows validity with an R² value of 0.745. The proposed model is expected to provide new drivers with a reference value for their MWL based on the physiological indices, and driving lesson plans can be proposed to sustain an appropriate MWL as well as improve the driver's work performance.
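
    The group method of data handling builds successive layers of small polynomial models and keeps those that generalize best on held-out data. The sketch below implements one such selection layer with numpy; it is a minimal stand-in for the study's model, not a reconstruction of it.

        import numpy as np
        from itertools import combinations

        def gmdh_layer(X_tr, y_tr, X_val, y_val, keep=3):
            """One selection layer of a GMDH-style network: fit a quadratic
            model on every pair of inputs and keep the pairs that do best on
            the external (validation) criterion."""
            def design(X, i, j):
                a, b = X[:, i], X[:, j]
                return np.column_stack([np.ones_like(a), a, b, a * b, a**2, b**2])
            scored = []
            for i, j in combinations(range(X_tr.shape[1]), 2):
                w, *_ = np.linalg.lstsq(design(X_tr, i, j), y_tr, rcond=None)
                mse = np.mean((design(X_val, i, j) @ w - y_val) ** 2)
                scored.append((mse, (i, j), w))
            return sorted(scored, key=lambda s: s[0])[:keep]  # best candidate neurons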

  4. Modeling and Synthesis Support for the North American Carbon Program

    NASA Astrophysics Data System (ADS)

    Baskaran, L.; Cook, R. B.; Thornton, P. E.; Post, W. M.; Wilson, B. E.; Dadi, U.

    2007-12-01

    The Modeling and Synthesis Thematic Data Center (MAST-DC) supports the North American Carbon Program by providing data products and data management services needed for modeling and synthesis activities. The overall objective of MAST-DC is to provide advanced data management support to NACP investigators doing modeling and synthesis, thereby freeing those investigators from having to perform data management functions. MAST-DC has compiled a number of data products for North America, including sub-pixel land-water content, daily meteorological data, and soil, land cover, and elevation data. In addition, we have developed an internet-based WebGIS system that enables users to browse, query, display, subset, and download spatial data using a standard web browser. For the mid-continent intensive, MAST-DC is working with a group of data assimilation modelers to generate a consistent set of meteorological data to drive bottom-up models.

  5. BioModels.net Web Services, a free and integrated toolkit for computational modelling software.

    PubMed

    Li, Chen; Courtot, Mélanie; Le Novère, Nicolas; Laibe, Camille

    2010-05-01

    Exchanging and sharing scientific results are essential for researchers in the field of computational modelling. BioModels.net defines agreed-upon standards for model curation. A fundamental one, MIRIAM (Minimum Information Requested in the Annotation of Models), standardises the annotation and curation process of quantitative models in biology. To support this standard, MIRIAM Resources maintains a set of standard data types for annotating models, and provides services for manipulating these annotations. Furthermore, BioModels.net creates controlled vocabularies, such as SBO (Systems Biology Ontology), which strictly indexes, defines and links terms used in Systems Biology. Finally, BioModels Database provides a free, centralised, publicly accessible database for storing, searching and retrieving curated and annotated computational models. Each resource provides a web interface to submit, search, retrieve and display its data. In addition, the BioModels.net team provides a set of Web Services that allows the community to programmatically access the resources. A user is then able to perform remote queries, such as retrieving a model and resolving all its MIRIAM annotations, as well as getting the details of the associated SBO terms. These web services use established standards. Communications rely on SOAP (Simple Object Access Protocol) messages, and the available queries are described in a WSDL (Web Services Description Language) file. Several libraries are provided in order to simplify the development of client software. BioModels.net Web Services take researchers one step further towards simulating and understanding the entirety of a biological system, by allowing them to retrieve biological models in their own tool, combine queries in workflows and efficiently analyse models.
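
    Because the services are described by a WSDL file, a generic SOAP client can discover and call them. The sketch below uses the Python zeep library; the WSDL location and the operation name are hypothetical placeholders to be checked against the current BioModels documentation.

        # Minimal sketch, assuming the zeep SOAP client (pip install zeep).
        from zeep import Client

        WSDL = "https://example.org/biomodels/services?wsdl"   # placeholder URL
        client = Client(WSDL)            # parses the WSDL service description

        # Operation name is hypothetical; zeep exposes whatever the WSDL declares.
        sbml = client.service.getModelById("BIOMD0000000012")
        print(sbml[:200])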

  6. Applications of Metal Additive Manufacturing in Veterinary Orthopedic Surgery

    NASA Astrophysics Data System (ADS)

    Harrysson, Ola L. A.; Marcellin-Little, Denis J.; Horn, Timothy J.

    2015-03-01

    Veterinary medicine has undergone a rapid increase in specialization over the last three decades. Veterinarians now routinely perform joint replacement, neurosurgery, limb-sparing surgery, interventional radiology, radiation therapy, and other complex medical procedures. Many procedures involve advanced imaging and surgical planning. Evidence-based medicine has also become part of the modus operandi of veterinary clinicians. Modeling and additive manufacturing can provide individualized or customized therapeutic solutions to support the management of companion animals with complex medical problems. The use of metal additive manufacturing is increasing in veterinary orthopedic surgery. This review describes and discusses current and potential applications of metal additive manufacturing in veterinary orthopedic surgery.

  7. Impact and Cost-effectiveness of 3 Doses of 9-Valent Human Papillomavirus (HPV) Vaccine Among US Females Previously Vaccinated With 4-Valent HPV Vaccine.

    PubMed

    Chesson, Harrell W; Laprise, Jean-François; Brisson, Marc; Markowitz, Lauri E

    2016-06-01

    We estimated the potential impact and cost-effectiveness of providing 3 doses of nonavalent human papillomavirus (HPV) vaccine (9vHPV) to females aged 13-18 years who had previously completed a series of quadrivalent HPV vaccine (4vHPV), a strategy we refer to as "additional 9vHPV vaccination." We used 2 distinct models: (1) the simplified model, which is among the most basic of the published dynamic HPV models, and (2) the US HPV-ADVISE model, a complex, stochastic, individual-based transmission-dynamic model. When assuming no 4vHPV cross-protection, the incremental cost per quality-adjusted life-year (QALY) gained by additional 9vHPV vaccination was $146 200 in the simplified model and $108 200 in the US HPV-ADVISE model ($191 800 when assuming 4vHPV cross-protection). In 1-way sensitivity analyses in the scenario of no 4vHPV cross-protection, the simplified model results ranged from $70 300 to $182 000, and the US HPV-ADVISE model results ranged from $97 600 to $118 900. The average cost per QALY gained by additional 9vHPV vaccination exceeded $100 000 in both models. However, the results varied considerably in sensitivity and uncertainty analyses. Additional 9vHPV vaccination is likely not as efficient as many other potential HPV vaccination strategies, such as increasing primary 9vHPV vaccine coverage. Published by Oxford University Press for the Infectious Diseases Society of America 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
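
    The headline figures above are incremental cost-effectiveness ratios; the sketch below shows the arithmetic with made-up totals standing in for the models' outputs.

        def icer(cost_new, qaly_new, cost_base, qaly_base):
            """Incremental cost per quality-adjusted life-year gained."""
            return (cost_new - cost_base) / (qaly_new - qaly_base)

        # Illustrative totals only; the models' actual intermediate outputs are not shown here.
        print(icer(cost_new=2_500_000.0, qaly_new=120.0,
                   cost_base=1_500_000.0, qaly_base=110.0))   # -> 100000.0 $/QALY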

  8. Animal Models for Periodontal Disease

    PubMed Central

    Oz, Helieh S.; Puleo, David A.

    2011-01-01

    Animal models and cell cultures have contributed new knowledge in biological sciences, including periodontology. Although cultured cells can be used to study physiological processes that occur during the pathogenesis of periodontitis, the complex host response fundamentally responsible for this disease cannot be reproduced in vitro. Among the animal kingdom, rodents, rabbits, pigs, dogs, and nonhuman primates have been used to model human periodontitis, each with advantages and disadvantages. Periodontitis commonly has been induced by placing a bacterial plaque retentive ligature in the gingival sulcus around the molar teeth. In addition, alveolar bone loss has been induced by inoculation or injection of human oral bacteria (e.g., Porphyromonas gingivalis) in different animal models. While animal models have provided a wide range of important data, it is sometimes difficult to determine whether the findings are applicable to humans. In addition, variability in host responses to bacterial infection among individuals contributes significantly to the expression of periodontal diseases. A practical and highly reproducible model that truly mimics the natural pathogenesis of human periodontal disease has yet to be developed. PMID:21331345

  9. A penalized framework for distributed lag non-linear models.

    PubMed

    Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G

    2017-09-01

    Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
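
    The penalized-spline machinery is not reproduced here, but the sketch below illustrates the underlying distributed-lag idea on simulated data: regress the outcome on lagged copies of the exposure and shrink the lag coefficients with a ridge penalty, a rough stand-in for the GAM penalties discussed above.

        import numpy as np
        from sklearn.linear_model import Ridge

        def lag_matrix(x, max_lag):
            """Matrix whose columns are x lagged 0..max_lag steps (rows with
            incomplete lag history dropped)."""
            n = len(x)
            L = np.empty((n, max_lag + 1))
            for k in range(max_lag + 1):
                L[k:, k] = x[:n - k]
            return L[max_lag:]

        rng = np.random.default_rng(0)
        x = rng.normal(size=500)
        true_lag_effects = np.exp(-np.arange(8) / 2.0)   # smooth decaying lag curve
        y = lag_matrix(x, 7) @ true_lag_effects + rng.normal(scale=0.5, size=len(x) - 7)

        model = Ridge(alpha=1.0).fit(lag_matrix(x, 7), y)
        print(np.round(model.coef_, 2))                  # estimated lag curve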

  10. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal analysis technique, model checking, that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
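
    Explicit-state model checking of a safety property amounts to an exhaustive search of the reachable state space for a violating state. The sketch below shows the idea on a toy transition system; the FGS requirements model itself is not reproduced here.

        from collections import deque

        def check_safety(init, step, is_safe):
            """Breadth-first search of the reachable state space, returning the
            first reachable state that violates the safety property, else None."""
            seen, frontier = {init}, deque([init])
            while frontier:
                s = frontier.popleft()
                if not is_safe(s):
                    return s                    # counterexample state
                for t in step(s):
                    if t not in seen:
                        seen.add(t)
                        frontier.append(t)
            return None                         # property holds on all reachable states

        # Toy example: a counter that must never exceed 3.
        violation = check_safety(0, lambda s: [(s + 1) % 5], lambda s: s <= 3)
        print(violation)   # -> 4, a reachable unsafe state (counterexample)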

  11. Paying for quality not quantity: a wisconsin health maintenance organization proposes an incentive model for reimbursement of chiropractic services.

    PubMed

    Pursel, Kevin J; Jacobson, Martin; Stephenson, Kathy

    2012-07-01

    The purpose of this study is to describe a reimbursement model that was developed by one Health Maintenance Organization (HMO) to transition from fee-for-service to a combination of pay-for-performance and pay-for-reporting reimbursement for chiropractic care. The previous incentive program used by the HMO provided best-practice education and additional reimbursement incentives for achieving National Committee for Quality Assurance Back Pain Recognition Program (NCQA-BPRP) recognition status. However, this model had not leveled costs between doctors of chiropractic (DCs). Therefore, the HMO management aimed to develop a reimbursement model to incentivize providers to embrace existing best-practice models and report existing quality metrics. The development goals included the following: the model should (1) be as financially predictable as the previous system, (2) cost no more on a per-member basis, (3) meet the coverage needs of its members, and (4) be able to be operationalized. The model should also reward DCs who embraced best practices with compensation not simply tied to providing more procedures. In addition, the new program needed to (1) cause little or no disruption in current billing, (2) be grounded in achievable and defined expectations for improvement in quality, and (3) be voluntary, without being unduly punitive should the DC choose not to participate in the program. The resulting model was named the Comprehensive Chiropractic Quality Reimbursement Methodology (CCQRM; pronounced "Quorum"). In this hybrid model, additional reimbursement beyond pay-for-procedures is based on unique payment interpretations: reporting selected existing Physician Quality Reporting System (PQRS) codes, meaningful use of electronic health records, and achieving NCQA-BPRP recognition. This model aims to compensate providers using pay-for-performance, pay-for-quality-reporting, and pay-for-procedure methods. The CCQRM reimbursement model was developed to address the current needs of one HMO that aims to transition from fee-for-service to pay-for-performance and quality reporting for reimbursement of chiropractic care. The model is theoretically based on the combination of fee-for-service payment, pay for participation (NCQA Back Pain Recognition Program payment), meaningful-use-of-electronic-health-record payment, and pay for reporting (PQRS-BPMG payment). Evaluation of this model is needed to determine whether it will achieve its intended goals. Copyright © 2012 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.

  12. BioModels Database: a repository of mathematical models of biological processes.

    PubMed

    Chelliah, Vijayalakshmi; Laibe, Camille; Le Novère, Nicolas

    2013-01-01

    BioModels Database is a public online resource that allows storing and sharing of published, peer-reviewed quantitative, dynamic models of biological processes. The model components and behaviour are thoroughly checked to correspond to the original publication and manually curated to ensure reliability. Furthermore, the model elements are annotated with terms from controlled vocabularies as well as linked to relevant external data resources. This greatly helps in model interpretation and reuse. Models are accepted in SBML and CellML formats, stored in SBML, and available for download in various other common formats such as BioPAX, Octave, SciLab, VCML, XPP and PDF, in addition to SBML. The reaction network diagram of the models is also available in several formats. BioModels Database features a search engine, which provides simple and more advanced searches. Features such as online simulation and creation of smaller models (submodels) from selected model elements of a larger one are provided. BioModels Database can be accessed both via a web interface and programmatically via web services. New models are added to BioModels Database in regular releases, about every 4 months.
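
    Since entries are distributed in SBML, they can be inspected programmatically. The sketch below uses the python-libsbml bindings, with a hypothetical local file name standing in for a downloaded BioModels entry.

        # Requires python-libsbml (pip install python-libsbml).
        import libsbml

        doc = libsbml.readSBML("BIOMD0000000012.xml")   # hypothetical local download
        if doc.getNumErrors() == 0:
            model = doc.getModel()
            # Basic structural summary of the curated model.
            print(model.getId(), model.getNumSpecies(), model.getNumReactions())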

  13. AdViSHE: A Validation-Assessment Tool of Health-Economic Models for Decision Makers and Model Users.

    PubMed

    Vemer, P; Corro Ramos, I; van Voorn, G A K; Al, M J; Feenstra, T L

    2016-04-01

    A trade-off exists between building confidence in health-economic (HE) decision models and the use of scarce resources. We aimed to address this trade-off by creating a practical tool that provides model users with a structured view into the validation status of HE decision models. A Delphi panel was organized and concluded with a workshop during an international conference. The proposed tool was constructed iteratively based on comments from, and the discussion amongst, panellists. During the Delphi process, comments were solicited on the importance and feasibility of possible validation techniques for modellers, their relevance for decision makers, and the overall structure and formulation of the tool. The panel consisted of 47 experts in HE modelling and HE decision making from various professional and international backgrounds. In addition, 50 discussants actively engaged in the discussion at the conference workshop and returned 19 questionnaires with additional comments. The final version consists of 13 items covering all relevant aspects of HE decision models: the conceptual model, the input data, the implemented software program, and the model outcomes. Assessment of the Validation Status of Health-Economic decision models (AdViSHE) is a validation-assessment tool in which model developers report in a systematic way both on the validation efforts performed and on their outcomes. Subsequently, model users can establish whether confidence in the model is justified or whether additional validation efforts should be undertaken. In this way, AdViSHE enhances transparency of the validation status of HE models and supports efficient model validation.

  14. Medaka as a model for studying environmentally induced epigenetic transgenerational inheritance of phenotypes

    PubMed Central

    2016-01-01

    The ability of environmental stressors to induce transgenerational diseases has been experimentally demonstrated in plants, worms, fish, and mammals, indicating that exposures affect not only human health but also fish and ecosystem health. Small aquarium fish have been reliable models for studying the genetic and epigenetic basis of development and disease. Additionally, fish provide a better and more economical opportunity to study transgenerational inheritance of adverse health and the underlying epigenetic mechanisms. Molecular mechanisms underlying germ cell development in fish are comparable to those in mammals and humans. This review provides a short overview of the long-term effects of environmental chemical contaminant exposure in various models, the associated epigenetic mechanisms, and a perspective on fish as models for studying environmentally induced transgenerational inheritance of altered phenotypes. PMID:29492282

  15. Continuous quality improvement: a shared governance model that maximizes agent-specific knowledge.

    PubMed

    Burkoski, Vanessa; Yoon, Jennifer

    2013-01-01

    Motivate, Innovate, Celebrate, an innovative shared governance model, was implemented across the London Health Sciences Centre (LHSC) through the establishment of continuous quality improvement (CQI) councils. The model leverages agent-specific knowledge at the point of care and provides a structure aimed at building human resources capacity and sustaining enhancements to quality and safe care delivery. Interprofessional and cross-functional teams work through the CQI councils to identify, formulate, execute and evaluate CQI initiatives. In addition to a structure that facilitates collaboration, accountability and ownership, a corporate CQI Steering Committee provides the forum for scaling up and spreading this model. Point-of-care staff, clinical management and educators were trained in LEAN methodology and patient experience-based design to ensure sufficient knowledge and resources to support the implementation.

  16. ATLAS particle detector CSC ROD software design and implementation, and, Addition of K physics to chi-squared analysis of FDQM

    NASA Astrophysics Data System (ADS)

    Hawkins, Donovan Lee

    In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS320C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.
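
    The chi-squared comparison of model predictions against experiment takes the familiar form sketched below; the observables and uncertainties shown are placeholders, since the thesis' data are not listed here.

        import numpy as np

        def chi_squared(observed, predicted, sigma):
            """Global chi-squared of a model against measurements with errors sigma."""
            return float(np.sum(((observed - predicted) / sigma) ** 2))

        # Placeholder values; adding a new experiment appends one term to the sum.
        print(chi_squared(np.array([0.2312, 80.38]),
                          np.array([0.2315, 80.36]),
                          np.array([0.0003, 0.02])))   # -> 2.0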

  17. World energy projection system: Model documentation

    NASA Astrophysics Data System (ADS)

    1992-06-01

    The World Energy Projection System (WEPS) is an accounting framework that incorporates projections from independently documented models, together with assumptions about the future energy intensity of economic activity (the ratio of total energy consumption to gross domestic product) and about the rate at which incremental energy requirements are met by hydropower, geothermal, coal, and natural gas, to produce the projections of world energy consumption published annually by the Energy Information Administration (EIA) in the International Energy Outlook (IEO). Two independently documented models presented in Figure 1, the Oil Market Simulation (OMS) model and the World Integrated Nuclear Evaluation System (WINES), provide projections of oil and nuclear power consumption published in the IEO. Output from a third independently documented model, the International Coal Trade Model (ICTM), is not published in the IEO but is used in WEPS as a supply check on projections of world coal consumption produced by WEPS and published in the IEO. A WEPS model of natural gas production documented in this report provides the same type of implicit supply check on the WEPS projections of world natural gas consumption published in the IEO. Two additional models included in Figure 1, the OPEC Capacity model and the Non-OPEC Oil Production model, provide inputs to the OMS model and are documented in this report.
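
    At its core, the WEPS accounting identity is consumption = GDP × energy intensity. The sketch below projects consumption from assumed GDP growth and intensity decline rates; all numbers are illustrative.

        def project_energy(gdp0, intensity0, gdp_growth, intensity_decline, years):
            """Energy consumption path implied by the identity
            consumption = GDP * (total energy consumption / GDP)."""
            path, gdp, intensity = [], gdp0, intensity0
            for _ in range(years):
                gdp *= 1 + gdp_growth
                intensity *= 1 - intensity_decline
                path.append(gdp * intensity)
            return path

        # Illustrative: 3%/yr GDP growth, 1%/yr energy-intensity improvement.
        print(project_energy(100.0, 1.0, 0.03, 0.01, years=5))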

  18. SBML and CellML translation in antimony and JSim.

    PubMed

    Smith, Lucian P; Butterworth, Erik; Bassingthwaighte, James B; Sauro, Herbert M

    2014-04-01

    The creation and exchange of biologically relevant models is of great interest to many researchers. When multiple standards are in use, models are more readily used and re-used if robust translators exist between the various accepted formats. Antimony 2.4 and JSim 2.10 provide translation capabilities from their own formats to SBML and CellML. Each format presented unique challenges, stemming from differences in the formats' inherent designs as well as differences in functionality. Both programs are available under BSD licenses; Antimony from http://antimony.sourceforge.net/ and JSim from http://physiome.org/jsim/. Contact: lpsmith@u.washington.edu.
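
    The Antimony distribution also ships Python bindings, which the hedged sketch below uses to translate a small Antimony model to SBML; the call names are from the antimony package and worth verifying against its current documentation.

        # Minimal sketch, assuming the antimony Python bindings (pip install antimony).
        import antimony

        code = antimony.loadAntimonyString("""
        model example
          S1 -> S2; k1*S1
          k1 = 0.5; S1 = 10
        end
        """)
        if code >= 0:                               # negative return signals a parse error
            sbml = antimony.getSBMLString("example")  # translate Antimony -> SBML
            print(sbml[:200])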

  19. Designing automation for human use: empirical studies and quantitative models.

    PubMed

    Parasuraman, R

    2000-07-01

    An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.

  20. Extensions to the Dynamic Aerospace Vehicle Exchange Markup Language

    NASA Technical Reports Server (NTRS)

    Brian, Geoffrey J.; Jackson, E. Bruce

    2011-01-01

    The Dynamic Aerospace Vehicle Exchange Markup Language (DAVE-ML) is a syntactical language for exchanging flight vehicle dynamic model data. It provides a framework for encoding entire flight vehicle dynamic model data packages for exchange and/or long-term archiving. Version 2.0.1 of DAVE-ML provides much of the functionality envisioned for exchanging aerospace vehicle data; however, it supports only scalar, time-independent data. Additional functionality is required to support vector and matrix data, abstract sub-system models, detail dynamic system models (both discrete and continuous), and define a dynamic data format (such as time-sequenced data) for validation of dynamic system models and vehicle simulation packages. Extensions to DAVE-ML have been proposed to manage data as vectors and n-dimensional matrices, and to record dynamic data in a compatible form. These capabilities will improve the clarity of data being exchanged, simplify the naming of parameters, and permit static and dynamic data to be stored using a common syntax within a single file, thereby enhancing the framework provided by DAVE-ML for exchanging entire flight vehicle dynamic simulation models.

  1. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
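
    To make the target concrete: a Markov reliability model is a set of system states with failure and repair transition rates, solved for the probability of being operational over time. A minimal two-state sketch with scipy follows; the rates are illustrative.

        import numpy as np
        from scipy.linalg import expm

        # Two-state Markov reliability model: Operational -> Failed at rate lam,
        # repair at rate mu. Generator matrix rows sum to zero.
        lam, mu = 1e-3, 1e-1          # failure and repair rates per hour (illustrative)
        Q = np.array([[-lam,  lam],
                      [  mu,  -mu]])

        p0 = np.array([1.0, 0.0])     # start in the operational state
        for t in (10.0, 100.0, 1000.0):
            pt = p0 @ expm(Q * t)     # state distribution at time t
            print(t, pt[0])           # probability the system is operational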

  2. Effectiveness of Drafting Models for Engineering Technology Students and Impacts on Spatial Visualization Ability: An Analysis and Consideration of Critical Variables

    ERIC Educational Resources Information Center

    Katsioloudis, Petros J.; Stefaniak, Jill E.

    2018-01-01

    Results from a number of studies indicate that the use of drafting models can positively influence the spatial visualization ability for engineering technology students. However, additional variables such as light, temperature, motion and color can play an important role but research provides inconsistent results. Considering this, a set of 5…

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhinefrank, Kenneth E.; Lenee-Bluhm, Pukha; Prudell, Joseph H.

    The most prudent path to a full-scale design, build, and deployment of a wave energy conversion (WEC) system involves establishing validated numerical models through physical experiments in a methodical scaling program. This project provides additional rounds of wave tank testing at 1:33 scale and ocean/bay testing at 1:7 scale, necessary to validate the numerical modeling that is essential to a utility-scale WEC design and associated certification.
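
    Tank-to-full-scale extrapolation in WEC testing conventionally follows Froude similarity; the exponents below are the standard Froude-scaling ones, though the record itself does not state which similarity law was used.

        # Standard Froude-similarity exponents (an assumption here, not from the record).
        FROUDE_EXPONENTS = {"length": 1, "time": 0.5, "velocity": 0.5,
                            "force": 3, "power": 3.5}

        def to_full_scale(model_value, scale_ratio, quantity):
            """Scale a model-test measurement up by a geometric ratio (e.g. 33 for 1:33)."""
            return model_value * scale_ratio ** FROUDE_EXPONENTS[quantity]

        # e.g., 1.2 kW measured at 1:33 model scale implies roughly 2.5e5 kW at full scale.
        print(to_full_scale(1.2, 33, "power"))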

  4. Multi-scale Mexican spotted owl (Strix occidentalis lucida) nest/roost habitat selection in Arizona and a comparison with single-scale modeling results

    Treesearch

    Brad C. Timm; Kevin McGarigal; Samuel A. Cushman; Joseph L. Ganey

    2016-01-01

    Efficacy of future habitat selection studies will benefit by taking a multi-scale approach. In addition to potentially providing increased explanatory power and predictive capacity, multi-scale habitat models enhance our understanding of the scales at which species respond to their environment, which is critical knowledge required to implement effective...

  5. Studies on Vapor Adsorption Systems

    NASA Technical Reports Server (NTRS)

    Shamsundar, N.; Ramotowski, M.

    1998-01-01

    The project consisted of performing experiments on single and dual bed vapor adsorption systems, thermodynamic cycle optimization, and thermal modeling. The work was described in a technical paper that appeared in conference proceedings and a Master's thesis, which were previously submitted to NASA. The present report describes some additional thermal modeling work done subsequently, and includes listings of computer codes developed during the project. Recommendations for future work are provided.

  6. LORAN-C LATITUDE-LONGITUDE CONVERSION AT SEA: PROGRAMMING CONSIDERATIONS.

    USGS Publications Warehouse

    McCullough, James R.; Irwin, Barry J.; Bowles, Robert M.

    1985-01-01

    Comparisons are made of the precision of arc-length routines as computer precision is reduced. Overland propagation delays are discussed and illustrated with observations from offshore New England. Present practice of LORAN-C error budget modeling is then reviewed with the suggestion that additional terms be considered in future modeling. Finally, some detailed numeric examples are provided to help with new computer program checkout.
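
    As an illustration of the arc-length precision question, the sketch below evaluates a haversine great-circle routine in 64-bit and in 32-bit floating point; the formula is generic and the coordinates are made up, since the report's specific routines are not reproduced here.

        import numpy as np

        def haversine(lat1, lon1, lat2, lon2, dtype=np.float64, R=6371.0):
            """Great-circle arc length (km) on a spherical Earth; dtype mimics
            the reduced computer precision discussed above."""
            lat1, lon1, lat2, lon2 = (dtype(np.radians(v))
                                      for v in (lat1, lon1, lat2, lon2))
            a = np.sin((lat2 - lat1) / dtype(2)) ** 2 + \
                np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / dtype(2)) ** 2
            return dtype(2 * R) * np.arcsin(np.sqrt(a))

        d64 = haversine(41.5, -70.7, 40.0, -69.0)               # offshore New England-ish
        d32 = haversine(41.5, -70.7, 40.0, -69.0, np.float32)
        print(d64, d64 - np.float64(d32))                       # precision loss in km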

  7. Cohorts, Communities of Inquiry, and Course Delivery Methods: UTC Best Practices in Learning--The Hybrid Learning Community Model

    ERIC Educational Resources Information Center

    Rausch, David W.; Crawford, Elizabeth K.

    2012-01-01

    From the early 1990s to present, the practice of cohort-based learning has been on the rise in colleges, universities, organizations, and even some K-12 programs across the nation. This type of learning model uses the power of the interpersonal relationships to enhance the learning process and provide additional support to the cohort members as…

  8. A structure-based kinetic model of transcription.

    PubMed

    Zuo, Yuhong; Steitz, Thomas A

    2017-01-01

    During transcription, RNA polymerase moves downstream along the DNA template and maintains a transcription bubble. Several recent structural studies of transcription complexes with a complete transcription bubble provide new insights into how RNAP couples the nucleotide addition reaction to its directional movement.

  9. "Total Deposition (TDEP) Maps"

    EPA Science Inventory

    The presentation provides an update on the use of a hybrid methodology that relies on measured values from national monitoring networks and modeled values from CMAQ to produce maps of total deposition for use in critical loads and other ecological assessments. Additionally, c...

  10. Integration of Evidence Base into a Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    INTRODUCTION: A probabilistic decision support model such as the Integrated Medical Model (IMM) utilizes an immense amount of input data, which necessitates a systematic, integrated approach to data collection and management. As a result of this approach, the IMM is able to forecast medical events, resource utilization, and crew health during space flight. METHODS: Inflight data is the most desirable input for the Integrated Medical Model. Non-attributable inflight data is collected from the Lifetime Surveillance of Astronaut Health study as well as from the engineers, flight surgeons, and astronauts themselves. When inflight data is unavailable, cohort studies, other models, and Bayesian analyses are used, in addition to occasional input from subject matter experts. To determine the quality of evidence for a medical condition, the data source is categorized and assigned a level of evidence from 1-5, with the highest level being one. The collected data reside and are managed in a relational SQL database with a web-based interface for data entry and review. The database is also capable of interfacing with outside applications, which expands capabilities within the database itself. Via the public interface, customers can access a formatted Clinical Findings Form (CLiFF) that outlines the model input and evidence base for each medical condition. Changes to the database are tracked using a documented Configuration Management process. DISCUSSION: This strategic approach provides a comprehensive data management plan for the IMM. The IMM Database's structure and architecture have proven to support additional uses, as seen in the analysis of resource utilization across medical conditions. In addition, the IMM Database's web-based interface provides a user-friendly format for customers to browse and download the clinical information for medical conditions. It is this type of functionality that will provide Exploration Medical Capabilities the evidence base for their medical condition list. CONCLUSION: The IMM Database, in conjunction with the IMM, is helping the NASA aerospace program improve health care and reduce risk for astronaut crews. Both the database and model will continue to expand to meet customer needs through their multi-disciplinary, evidence-based approach to managing data. Future expansion could serve as a platform for a Space Medicine Wiki of medical conditions.
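
    A toy version of the evidence-grading table described above can be expressed in a few lines of SQL via Python's sqlite3 module; the schema and rows are illustrative, not the IMM database's actual design.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE condition_evidence (
            condition TEXT, source TEXT,
            evidence_level INTEGER CHECK (evidence_level BETWEEN 1 AND 5))""")
        con.execute("INSERT INTO condition_evidence VALUES ('back pain', 'inflight data', 1)")
        level, = con.execute(
            "SELECT MIN(evidence_level) FROM condition_evidence WHERE condition='back pain'"
        ).fetchone()
        print(level)   # best (lowest-numbered) evidence level available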

  11. Laser Assisted CVD Growth of A1N and GaN

    DTIC Science & Technology

    1990-08-31

    additional cost sharing. RESEARCH FACILITIES The york is being performed in the Howard University Laser Laboratory. This is a free-standing buildinq...would be used to optimize computer models of the laser induced CVD reactor. FACILITIES AND EQUIPMENT - ADDITIONAL COST SHARING This year Howard ... University has provided $45,000 for the purchase of an excimer laser to be shared by Dr. Crye for the diode laser probe experiments and another Assistant

  12. Modelling of high-frequency structure-borne sound transmission on FEM grids using the Discrete Flow Mapping technique

    NASA Astrophysics Data System (ADS)

    Hartmann, Timo; Tanner, Gregor; Xie, Gang; Chappell, David; Bajars, Janis

    2016-09-01

    Dynamical Energy Analysis (DEA) combined with the Discrete Flow Mapping (DFM) technique has recently been introduced as a mesh-based high-frequency method for modelling structure-borne sound in complex built-up structures. This has been shown to enhance vibro-acoustic simulations considerably by making it possible to work directly on existing finite element meshes, circumventing time-consuming and costly re-modelling strategies. In addition, DFM provides detailed spatial information about the vibrational energy distribution within a complex structure in the mid-to-high frequency range. We present here progress in the development of the DEA method towards handling complex FEM meshes including Rigid Body Elements. In addition, structure-borne transmission paths due to spot welds are considered. We will present applications for a car floor structure.

  13. LiDAR based prediction of forest biomass using hierarchical models with spatially varying coefficients

    USGS Publications Warehouse

    Babcock, Chad; Finley, Andrew O.; Bradford, John B.; Kolka, Randall K.; Birdsey, Richard A.; Ryan, Michael G.

    2015-01-01

    Many studies and production inventory systems have shown the utility of coupling covariates derived from Light Detection and Ranging (LiDAR) data with forest variables measured on georeferenced inventory plots through regression models. The objective of this study was to propose and assess the use of a Bayesian hierarchical modeling framework that accommodates both residual spatial dependence and non-stationarity of model covariates through the introduction of spatial random effects. We explored this objective using four forest inventory datasets that are part of the North American Carbon Program, each comprising point-referenced measures of above-ground forest biomass and discrete LiDAR. For each dataset, we considered at least five regression model specifications of varying complexity. Models were assessed based on goodness of fit criteria and predictive performance using a 10-fold cross-validation procedure. Results showed that the addition of spatial random effects to the regression model intercept improved fit and predictive performance in the presence of substantial residual spatial dependence. Additionally, in some cases, allowing either some or all regression slope parameters to vary spatially, via the addition of spatial random effects, further improved model fit and predictive performance. In other instances, models showed improved fit but decreased predictive performance—indicating over-fitting and underscoring the need for cross-validation to assess predictive ability. The proposed Bayesian modeling framework provided access to pixel-level posterior predictive distributions that were useful for uncertainty mapping, diagnosing spatial extrapolation issues, revealing missing model covariates, and discovering locally significant parameters.
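
    In outline, the 10-fold cross-validation used to assess predictive performance looks like the sketch below (sklearn, simulated non-spatial data); the paper's spatial random effects are beyond this snippet.

        import numpy as np
        from sklearn.model_selection import KFold
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 3))         # stand-in for LiDAR-derived covariates
        y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.3, size=200)  # biomass

        errs = []
        for train, test in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
            fit = LinearRegression().fit(X[train], y[train])
            errs.append(np.sqrt(np.mean((fit.predict(X[test]) - y[test]) ** 2)))
        print(np.mean(errs))                  # cross-validated RMSE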

  14. Endeavour update: a web resource for gene prioritization in multiple species

    PubMed Central

    Tranchevent, Léon-Charles; Barriot, Roland; Yu, Shi; Van Vooren, Steven; Van Loo, Peter; Coessens, Bert; De Moor, Bart; Aerts, Stein; Moreau, Yves

    2008-01-01

    Endeavour (http://www.esat.kuleuven.be/endeavourweb; this web site is free and open to all users and there is no login requirement) is a web resource for the prioritization of candidate genes. Using a training set of genes known to be involved in a biological process of interest, our approach consists of (i) inferring several models (based on various genomic data sources), (ii) applying each model to the candidate genes to rank those candidates against the profile of the known genes and (iii) merging the several rankings into a global ranking of the candidate genes. In the present article, we describe the latest developments of Endeavour. First, we provide a web-based user interface, besides our Java client, to make Endeavour more universally accessible. Second, we support multiple species: in addition to Homo sapiens, we now provide gene prioritization for three major model organisms: Mus musculus, Rattus norvegicus and Caenorhabditis elegans. Third, Endeavour makes use of additional data sources and is now including numerous databases: ontologies and annotations, protein–protein interactions, cis-regulatory information, gene expression data sets, sequence information and text-mining data. We tested the novel version of Endeavour on 32 recent disease gene associations from the literature. Additionally, we describe a number of recent independent studies that made use of Endeavour to prioritize candidate genes for obesity and Type II diabetes, cleft lip and cleft palate, and pulmonary fibrosis. PMID:18508807
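
    Step (iii), merging the per-source rankings, can be approximated with simple average-rank fusion, sketched below with hypothetical gene symbols; Endeavour itself uses an order-statistics method.

        import numpy as np

        def merge_rankings(rankings):
            """Fuse several rankings of the same candidates into one global
            ranking by average position (a simple stand-in for Endeavour's
            order-statistic fusion)."""
            genes = rankings[0]
            pos = {g: [] for g in genes}
            for ranking in rankings:
                for r, g in enumerate(ranking):
                    pos[g].append(r)
            return sorted(genes, key=lambda g: np.mean(pos[g]))

        # Hypothetical candidates ranked by two data sources.
        print(merge_rankings([["TP53", "BRCA1", "EGFR"],
                              ["TP53", "EGFR", "BRCA1"]]))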

  15. Evaluation of the groundwater-flow model for the Ohio River alluvial aquifer near Carrollton, Kentucky, updated to conditions in September 2010

    USGS Publications Warehouse

    Unthank, Michael D.

    2013-01-01

    The Ohio River alluvial aquifer near Carrollton, Ky., is an important water resource for the cities of Carrollton and Ghent, as well as for several industries in the area. The groundwater of the aquifer is the primary source of drinking water in the region and a highly valued natural resource that attracts various water-dependent industries because of its quantity and quality. This report evaluates the performance of a numerical model of the groundwater-flow system in the Ohio River alluvial aquifer near Carrollton, Ky., published by the U.S. Geological Survey in 1999. The original model simulated conditions in November 1995 and was updated to simulate groundwater conditions estimated for September 2010. The files from the calibrated steady-state model of November 1995 conditions were imported into MODFLOW-2005 to update the model to conditions in September 2010. The model input files modified as part of this update were the well and recharge files. The design of the updated model and other input files are the same as the original model. The ability of the updated model to match hydrologic conditions for September 2010 was evaluated by comparing water levels measured in wells to those computed by the model. Water-level measurements were available for 48 wells in September 2010. Overall, the updated model underestimated the water levels at 36 of the 48 measured wells. The average difference between measured water levels and model-computed water levels was 3.4 feet and the maximum difference was 10.9 feet. The root-mean-square error of the simulation was 4.45 for all 48 measured water levels. The updated steady-state model could be improved by introducing more accurate and site-specific estimates of selected field parameters, refined model geometry, and additional numerical methods. Collection of field data to better estimate hydraulic parameters, together with continued review of available data and information from area well operators, could provide the model with revised estimates of conductance values for the riverbed and valley wall, hydraulic conductivities for the model layer, and target water levels for future simulations. Additional model layers, a redesigned model grid, and revised boundary conditions could provide a better framework for more accurate simulations. Additional numerical methods would identify possible parameter estimates and determine parameter sensitivities.
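
    The fit statistics quoted above are computed as in the sketch below; the water levels shown are illustrative numbers, not the report's data.

        import numpy as np

        measured  = np.array([421.3, 418.9, 420.1])   # illustrative heads, in feet
        simulated = np.array([418.2, 415.8, 417.0])

        diff = measured - simulated
        print(diff.mean())                            # average difference (bias)
        print(np.sqrt(np.mean(diff ** 2)))            # root-mean-square error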

  16. Using a Time-Driven Activity-Based Costing Model To Determine the Actual Cost of Services Provided by a Transgenic Core.

    PubMed

    Gerwin, Philip M; Norinsky, Rada M; Tolwani, Ravi J

    2018-03-01

    Laboratory animal programs and core laboratories often set service rates based on cost estimates. However, actual costs may be unknown, and service rates may not reflect the actual cost of services. Accurately evaluating the actual costs of services can be challenging and time-consuming. We used a time-driven activity-based costing (ABC) model to determine the cost of services provided by a resource laboratory at our institution. The time-driven approach is a more efficient approach to calculating costs than using a traditional ABC model. We calculated only 2 parameters: the time required to perform an activity and the unit cost of the activity based on employee cost. This method allowed us to rapidly and accurately calculate the actual cost of services provided, including microinjection of a DNA construct, microinjection of embryonic stem cells, embryo transfer, and in vitro fertilization. We successfully implemented a time-driven ABC model to evaluate the cost of these services and the capacity of labor used to deliver them. We determined how actual costs compared with current service rates. In addition, we determined that the labor supplied to conduct all services (10,645 min/wk) exceeded the practical labor capacity (8400 min/wk), indicating that the laboratory team was highly efficient and that additional labor capacity was needed to prevent overloading of the current team. Importantly, this time-driven ABC approach allowed us to establish a baseline model that can easily be updated to reflect operational changes or changes in labor costs. We demonstrated that a time-driven ABC model is a powerful management tool that can be applied to other core facilities as well as to entire animal programs, providing valuable information that can be used to set rates based on the actual cost of services and to improve operating efficiency.
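
    In outline, the time-driven ABC calculation needs only two estimates per activity: its duration and the per-minute cost of the labor performing it. The sketch below uses illustrative service times and costs; the weekly-minutes figures are the ones quoted above.

        cost_per_minute = 1.10                 # employee cost / practical capacity (illustrative)
        services = {"embryo transfer": 120,    # minutes per unit of service (illustrative)
                    "DNA microinjection": 240}

        for name, minutes in services.items():
            print(name, round(minutes * cost_per_minute, 2))   # actual cost per service

        weekly_minutes = 10645                 # labor supplied (from the study)
        capacity = 8400                        # practical labor capacity (min/wk)
        print(weekly_minutes / capacity)       # > 1 indicates the team is overloaded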

  17. Beef Species Symposium: an assessment of the 1996 Beef NRC: metabolizable protein supply and demand and effectiveness of model performance prediction of beef females within extensive grazing systems.

    PubMed

    Waterman, R C; Caton, J S; Löest, C A; Petersen, M K; Roberts, A J

    2014-07-01

    Interannual variation in forage quantity and quality, driven by precipitation events, influences beef livestock production systems within the Southern and Northern Plains and the Pacific West, which combined represent 60% (approximately 17.5 million) of the total beef cows in the United States. The beef cattle requirements published by the NRC are an important tool and excellent resource for both professionals and producers to use when implementing feeding practices and nutritional programs within the various production systems. The objectives of this paper include evaluating the 1996 Beef NRC model's effectiveness in predicting extensive range beef cow performance within arid and semiarid environments using available data sets, identifying model inefficiencies that could be refined to improve the precision of predicting protein supply and demand for range beef cows, and, last, providing recommendations for future areas of research. An important addition to the current Beef NRC model would be to allow users to provide region-specific forage characteristics and to describe supplement composition, amount, and delivery frequency. Beef NRC models would then need to be modified to account for the N recycling that occurs throughout a supplementation interval and the impact that this would have on microbial efficiency and microbial protein supply. The Beef NRC should also consider the role of ruminal and postruminal supply and demand of specific limiting AA. Additional considerations should include the partitioning effects of nitrogenous compounds under different physiological production stages (e.g., lactation, pregnancy, and periods of BW loss). The intent of the information provided is to aid revision of the Beef NRC by providing supporting material for changes and identifying gaps in the existing scientific literature where future research is needed to enhance the predictive precision and application of the Beef NRC models.

  18. Construction of a multimodal CT-video chest model

    NASA Astrophysics Data System (ADS)

    Byrnes, Patrick D.; Higgins, William E.

    2014-03-01

    Bronchoscopy enables a number of minimally invasive chest procedures for diseases such as lung cancer and asthma. For example, using the bronchoscope's continuous video stream as a guide, a physician can navigate through the lung airways to examine general airway health, collect tissue samples, or administer a disease treatment. In addition, physicians can now use new image-guided intervention (IGI) systems, which draw upon both three-dimensional (3D) multi-detector computed tomography (MDCT) chest scans and bronchoscopic video, to assist with bronchoscope navigation. Unfortunately, little use is made of the acquired video stream, a potentially invaluable source of information. In addition, little effort has been made to link the bronchoscopic video stream to the detailed anatomical information given by a patient's 3D MDCT chest scan. We propose a method for constructing a multimodal CT-video model of the chest. After automatically computing a patient's 3D MDCT-based airway-tree model, the method next parses the available video data to generate a positional linkage between a sparse set of key video frames and airway path locations. Next, a fusion/mapping of the video's color mucosal information and MDCT-based endoluminal surfaces is performed. This results in the final multimodal CT-video chest model. The data structure constituting the model provides a history of those airway locations visited during bronchoscopy. It also provides for quick visual access to relevant sections of the airway wall by condensing large portions of endoscopic video into representative frames containing important structural and textural information. When examined with a set of interactive visualization tools, the resulting fused data structure provides a rich multimodal data source. We demonstrate the potential of the multimodal model with both phantom and human data.

  19. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input/output data.

  20. Tree Biomass Allocation and Its Model Additivity for Casuarina equisetifolia in a Tropical Forest of Hainan Island, China.

    PubMed

    Xue, Yang; Yang, Zhongyang; Wang, Xiaoyan; Lin, Zhipan; Li, Dunxi; Su, Shaofeng

    2016-01-01

    Casuarina equisetifolia is commonly planted and used in the construction of coastal shelterbelt protection in Hainan Island. Thus, it is critical to accurately estimate the tree biomass of Casuarina equisetifolia L. for forest managers to evaluate the biomass stock in Hainan. The data for this work consisted of 72 trees, which were divided into three age groups: young forest, middle-aged forest, and mature forest. The proportion of biomass from the trunk significantly increased with age (P<0.05). However, the biomass of the branch and leaf decreased, and the biomass of the root did not change. To test whether the crown radius (CR) can improve biomass estimates of C. equisetifolia, we introduced CR into the biomass models. Here, six models were used to estimate the biomass of each component, including the trunk, the branch, the leaf, and the root. In each group, we selected one model among these six models for each component. The results showed that including the CR greatly improved the model performance and reduced the error, especially for the young and mature forests. In addition, to ensure biomass additivity, the selected equation for each component was fitted as a system of equations using seemingly unrelated regression (SUR). The SUR method not only gave efficient and accurate estimates but also achieved the logical additivity. The results in this study provide a robust estimation of tree biomass components and total biomass over three groups of C. equisetifolia.
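
    A compact two-step feasible-GLS version of SUR is sketched below with numpy on simulated data; the biomass equations' additivity constraint is omitted for brevity.

        import numpy as np

        def sur_fit(Xs, ys):
            """Two-step SUR: per-equation OLS, estimate the cross-equation
            residual covariance, then feasible GLS on the stacked system."""
            n, m = len(ys[0]), len(ys)
            betas = [np.linalg.lstsq(X, y, rcond=None)[0] for X, y in zip(Xs, ys)]
            R = np.column_stack([y - X @ b for X, y, b in zip(Xs, ys, betas)])
            S = (R.T @ R) / n                          # residual covariance
            X_big = np.zeros((m * n, sum(X.shape[1] for X in Xs)))
            col = 0
            for k, X in enumerate(Xs):
                X_big[k * n:(k + 1) * n, col:col + X.shape[1]] = X
                col += X.shape[1]
            y_big = np.concatenate(ys)
            W = np.kron(np.linalg.inv(S), np.eye(n))   # Omega^{-1} = S^{-1} kron I
            return np.linalg.solve(X_big.T @ W @ X_big, X_big.T @ W @ y_big)

        rng = np.random.default_rng(0)
        X = np.column_stack([np.ones(100), rng.normal(size=100)])   # shared regressor
        e = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=100)
        y_trunk = X @ np.array([1.0, 2.0]) + e[:, 0]   # correlated errors across
        y_branch = X @ np.array([0.5, 1.0]) + e[:, 1]  # the component equations
        print(sur_fit([X, X], [y_trunk, y_branch]).round(2))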

  1. Future directions for LDEF ionizing radiation modeling and assessments

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1993-01-01

    A calculational program utilizing data from radiation dosimetry measurements aboard the Long Duration Exposure Facility (LDEF) satellite to reduce the uncertainties in current models defining the ionizing radiation environment is in progress. Most of the effort to date has been on using LDEF radiation dose measurements to evaluate models defining the geomagnetically trapped radiation, which has provided results applicable to radiation design assessments being performed for Space Station Freedom. Plans for future data comparisons, model evaluations, and assessments using additional LDEF data sets (LET spectra, induced radioactivity, and particle spectra) are discussed.

  2. Modeling the Internet.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelic, Andjelka; Mitchell, Michael David; Shirah, Donald N.

    The National Infrastructure Simulations and Analysis Center (NISAC) has developed a nationwide model of the Internet to study the potential impact of the loss of physical facilities on the network and on other infrastructures that depend on the Internet for services. The model looks at the Internet from the perspective of Internet Service Providers (ISPs) and their connectivity and can be used to determine how the network connectivity could be modified to assist in mitigating an event. In addition, the model could be used to explore how portions of the network could be made more resilient to disruptive events.
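
    A toy version of that kind of connectivity analysis, using the networkx Python library with made-up provider names, might look like the following.

        import networkx as nx

        # Hypothetical ISP-level topology: two providers peer directly and
        # also meet at an exchange point.
        G = nx.Graph([("ISP-A", "ISP-B"), ("ISP-B", "ISP-C"),
                      ("ISP-A", "IXP-1"), ("ISP-C", "IXP-1")])

        # Simulate loss of a physical facility and re-check connectivity.
        H = G.copy()
        H.remove_node("IXP-1")
        print(nx.is_connected(H))          # True: traffic can still re-route
        print(nx.node_connectivity(G))     # how many facility losses the net tolerates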

  4. Empowering Effective STEM Role Models to Promote STEM Equity in Local Communities

    NASA Astrophysics Data System (ADS)

    Harte, T.; Taylor, J.

    2017-12-01

    Empowering Effective STEM Role Models, a three-hour training developed and successfully implemented by NASA Langley Research Center's Science Directorate, is an effort to encourage STEM professionals to serve as role models within their community. The training is designed to help participants reflect on their identity as a role model and provide research-based strategies to effectively engage youth, particularly girls, in STEM (science, technology, engineering, and mathematics). Research shows that even though girls and boys do not demonstrate a significant difference in their ability to be successful in mathematics and science, there is a significant difference in their confidence level when participating in STEM subject matter and pursuing STEM careers. The Langley training model prepares professionals to disrupt this pattern and take on the habits and skills of effective role models. The training model is based on other successful models and resources for role modeling in STEM including SciGirls; the National Girls Collaborative; and publications by the American Association of University Women and the National Academies. It includes a significant reflection component, and participants walk through situation-based scenarios to practice a focused suite of research-based strategies. These strategies can be implemented in a variety of situations and adapted to the needs of groups that are underrepresented in STEM fields. Underpinning the training and the discussions is the fostering of a growth mindset and promoting perseverance. "The Power of Yet" becomes a means whereby role models encourage students to believe in themselves, working toward reaching their goals and dreams in the area of STEM. To provide additional support, NASA Langley role model trainers are available to work with a champion at other organizations to facilitate the training. This champion helps recruit participants, seeks leadership buy-in, and helps provide valuable insights for needs and interests specific to the organization. After the in-person training experience, participants receive additional follow-up support by working with their local champions and the NASA Langley trainers. The goal is to share the role model training model in an effort to empower STEM role models and assist in promoting STEM Equity in all communities.

  5. Transportation Sector Model of the National Energy Modeling System. Volume 2 -- Appendices: Part 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The attachments contained within this appendix provide additional details about the model development and estimation process which do not easily lend themselves to incorporation in the main body of the model documentation report. The information provided in these attachments is not integral to the understanding of the model's operation, but provides the reader with opportunity to gain a deeper understanding of some of the model's underlying assumptions. There will be a slight degree of replication of materials found elsewhere in the documentation, made unavoidable by the dictates of internal consistency. Each attachment is associated with a specific component of the transportation model; the presentation follows the same sequence of modules employed in Volume 1. The following attachments are contained in Appendix F: Fuel Economy Model (FEM)--provides a discussion of the FEM vehicle demand and performance by size class models; Alternative Fuel Vehicle (AFV) Model--describes data input sources and extrapolation methodologies; Light-Duty Vehicle (LDV) Stock Model--discusses the fuel economy gap estimation methodology; Light Duty Vehicle Fleet Model--presents the data development for business, utility, and government fleet vehicles; Light Commercial Truck Model--describes the stratification methodology and data sources employed in estimating the stock and performance of LCTs; Air Travel Demand Model--presents the derivation of the demographic index, used to modify estimates of personal travel demand; and Airborne Emissions Model--describes the derivation of emissions factors used to associate transportation measures to levels of airborne emissions of several pollutants.

  6. The timbre model

    NASA Astrophysics Data System (ADS)

    Jensen, Kristoffer

    2002-11-01

    A timbre model is proposed for use in multiple applications. This model, which encompasses all voiced isolated musical instruments, has an intuitive parameter set, fixed size, and separates the sounds in dimensions akin to the timbre dimensions proposed in timbre research. The analysis of the model parameters is fully documented, and a method is proposed, in particular, for the estimation of the difficult decay/release split-point. The main parameters of the model are the spectral envelope, the attack/release durations and relative amplitudes, and the inharmonicity and the shimmer and jitter (which provide both for the slow random variations of the frequencies and amplitudes, and also for additive noises). Some of the applications include synthesis, where a real-time application is being developed with an intuitive GUI; classification and search of sounds based on the content of the sounds; and a further understanding of acoustic musical instrument behavior. In order to present the background of the model, this presentation will start with sinusoidal analysis/synthesis and some timbre perception research, then present the timbre model, show its validity for individual music instrument sounds, and finally introduce some expression additions to the model.
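
    One ingredient of such a model can be sketched directly: a single partial with an attack/release amplitude envelope plus slow "shimmer" (amplitude) and "jitter" (frequency) modulations. Parameter values below are invented; this is a toy under those assumptions, not the proposed model.

```python
# One partial with attack/release envelope, shimmer, and jitter.
import numpy as np

sr, dur = 44100, 1.0
t = np.linspace(0, dur, int(sr * dur), endpoint=False)
attack, release, f0 = 0.05, 0.3, 440.0

# Piecewise-linear amplitude envelope (attack ramp, then release ramp)
env = np.minimum(1.0, t / attack)
env *= np.clip((dur - t) / release, 0, 1)

rng = np.random.default_rng(1)
def slow_noise(depth, n, smooth=2000):
    """Slowly varying random modulation via a moving average of noise."""
    w = np.convolve(rng.normal(0, 1, n), np.ones(smooth) / smooth, "same")
    return 1 + depth * w / np.abs(w).max()

shimmer = slow_noise(0.1, t.size)      # ~ +-10% amplitude wobble
jitter = slow_noise(0.002, t.size)     # ~ +-0.2% frequency wobble
phase = 2 * np.pi * np.cumsum(f0 * jitter) / sr   # integrate frequency
partial = env * shimmer * np.sin(phase)
print(partial.shape, float(partial.max()))
```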

  7. Conceptual Development of a National Volcanic Hazard Model for New Zealand

    NASA Astrophysics Data System (ADS)

    Stirling, Mark; Bebbington, Mark; Brenna, Marco; Cronin, Shane; Christophersen, Annemarie; Deligne, Natalia; Hurst, Tony; Jolly, Art; Jolly, Gill; Kennedy, Ben; Kereszturi, Gabor; Lindsay, Jan; Neall, Vince; Procter, Jonathan; Rhoades, David; Scott, Brad; Shane, Phil; Smith, Ian; Smith, Richard; Wang, Ting; White, James D. L.; Wilson, Colin J. N.; Wilson, Tom

    2017-06-01

    We provide a synthesis of a workshop held in February 2016 to define the goals, challenges and next steps for developing a national probabilistic volcanic hazard model for New Zealand. The workshop involved volcanologists, statisticians, and hazards scientists from GNS Science, Massey University, University of Otago, Victoria University of Wellington, University of Auckland, and University of Canterbury. We also outline key activities that will develop the model components, define procedures for periodic update of the model, and effectively articulate the model to end-users and stakeholders. The development of a National Volcanic Hazard Model is a formidable task that will require long-term stability in terms of team effort, collaboration and resources. Development of the model in stages or editions that are modular will make the process a manageable one that progressively incorporates additional volcanic hazards over time, and additional functionalities (e.g. short-term forecasting). The first edition is likely to be limited to updating and incorporating existing ashfall hazard models, with the other hazards associated with lahar, pyroclastic density currents, lava flow, ballistics, debris avalanche, and gases/aerosols being considered in subsequent updates.

  8. Acquisition, Processing, and Analysis of Continuous Multi-Offset GPR Data for Problems in Hydrogeophysics: Is it Worth the Cost? (Invited)

    NASA Astrophysics Data System (ADS)

    Bradford, J. H.

    2009-12-01

    Commercial development of multi-channel ground-penetrating radar (GPR) systems has made acquisition of continuous multi-offset (CMO) data more cost effective than ever. However, additional operator training, equipment costs, field and analysis time, and computation requirements necessarily remain substantially higher than conventional fixed offset GPR surveys. The choice to conduct a CMO survey is a target driven optimization problem where in many cases the added value outweighs the additional cost. Drawing examples from surface water, groundwater, snow, and glacier hydrology, I demonstrate a range of information that can be derived from CMO data with particular emphasis on estimating material properties of relevance to hydrological problems. Careful data acquisition is key to accurate property measurements. CMO geometries can be constructed with a single-channel system although with a significant loss of time and personnel efficiency relative to modern multi-channel systems. Using procedures such as common-midpoint stacking and pre-stack velocity filtering, it is possible to substantially improve the signal-to-noise ratio in GPR reflection images. However, the primary advantage of CMO data is dense sampling of a wide aperture of travelpaths through the subsurface. These data provide the basis for applying tomographic imaging techniques. Reflection velocity tomography in the pre-stack migration domain provides a robust approach to constructing accurate and detailed electromagnetic velocity models. These models in turn are used in conjunction with petrophysical models to estimate hydrologic properties such as porosity. Additionally, we can utilize the velocity models in conjunction with analysis of the frequency dependent attenuation to evaluate real and complex dielectric permittivity. The real and complex components of dielectric permittivity may have differing sensitivity to different components of the hydrologic system. Understanding this behavior may lead to improved understanding of relevant lithologic properties such as bulk clay content or fluid chemical composition during biodegradation of hydrocarbon contaminants. In addition to velocity tomography, CMO data enable reflection attenuation difference tomography. While time-lapse attenuation difference tomography using crosswell GPR transmission data is a well established technique for imaging conductive tracers in groundwater systems, it is not common for reflection data. Numerical examples based on a realistic aquifer model show that surface data can provide resolution of conductive tracer zones that is comparable to cross well data, thereby minimizing the need for invasive and expensive boreholes.

  9. Average inactivity time model, associated orderings and reliability properties

    NASA Astrophysics Data System (ADS)

    Kayid, M.; Izadkhah, S.; Abouammoh, A. M.

    2018-02-01

    In this paper, we introduce and study a new model called the 'average inactivity time model'. This new model is specifically applicable to handle the heterogeneity of the time of the failure of a system in which some inactive items exist. We provide some bounds for the mean average inactivity time of a lifespan unit. In addition, we discuss some dependence structures between the average variable and the mixing variable in the model when the original random variable possesses certain aging behaviors. Based on the conception of the new model, we introduce and study a new stochastic order. Finally, to illustrate the concept of the model, some interesting reliability problems are examined.

  10. Amplification of postwildfire peak flow by debris

    NASA Astrophysics Data System (ADS)

    Kean, J. W.; McGuire, L. A.; Rengers, F. K.; Smith, J. B.; Staley, D. M.

    2016-08-01

    In burned steeplands, the peak depth and discharge of postwildfire runoff can substantially increase from the addition of debris. Yet methods to estimate the increase over water flow are lacking. We quantified the potential amplification of peak stage and discharge using video observations of postwildfire runoff, compiled data on postwildfire peak flow (Qp), and a physically based model. Comparison of flood and debris flow data with similar distributions in drainage area (A) and rainfall intensity (I) showed that the median runoff coefficient (C = Qp/AI) of debris flows is 50 times greater than that of floods. The striking increase in Qp can be explained using a fully predictive model that describes the additional flow resistance caused by the emergence of coarse-grained surge fronts. The model provides estimates of the amplification of peak depth, discharge, and shear stress needed for assessing postwildfire hazards and constraining models of bedrock incision.
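
    The reported runoff coefficient is simple to evaluate; the snippet below works through C = Qp/(A·I) with invented numbers and applies the reported median ratio of 50 between debris flows and floods.

```python
# Back-of-envelope check of the runoff coefficient C = Qp / (A * I).
# All numbers are invented for illustration only.
A = 0.5e6        # drainage area, m^2 (0.5 km^2)
I = 30 / 3.6e6   # rainfall intensity: 30 mm/h -> m/s
Qp_flood = 1.0   # peak discharge of a water flood, m^3/s (hypothetical)

C_flood = Qp_flood / (A * I)
C_debris = 50 * C_flood      # median ratio reported in the abstract
print(f"C_flood={C_flood:.2f}, C_debris_flow={C_debris:.1f}")
```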

  11. Combustion properties of Kraft Black Liquors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick, W.J. Jr.; Hupa, M.

    1993-04-01

    In a previous study of the phenomena involved in the combustion of black liquor droplets, a numerical model was developed. The model required certain black liquor specific combustion information which was then not available, and additional data were needed for evaluating the model. The overall objective of the project reported here was to provide experimental data on key aspects of black liquor combustion, to interpret the data, and to put it into a form useful for computational models of recovery boilers. The specific topics investigated were the volatiles and char carbon yields from pyrolysis of single black liquor droplets; a criterion for the onset of devolatilization and the accompanying rapid swelling; and the surface temperature of black liquor droplets during pyrolysis, combustion, and gasification. Additional information on the swelling characteristics of black liquor droplets was also obtained as part of the experiments conducted.

  12. Pharmacokinetics of topically applied pilocarpine in the albino rabbit eye.

    PubMed

    Makoid, M C; Robinson, J R

    1979-04-01

    The temporal and spatial pattern of [3H]-pilocarpine nitrate distribution in the albino rabbit eye following topical administration was determined. A four-compartment catenary chain model describing this disposition corresponds to the precorneal area, the cornea, the aqueous humor, and the lens and vitreous. Simultaneous computer fitting of data from tissue corresponding to some compartments in the model supported the proposed model. Additional support was provided by the excellent correlation between predicted and observed values in multiple-dosing studies. Several important aspects of ocular drug disposition are evident from the model. The extensive parallel elimination at the absorption site gives rise to an apparent absorption rate constant that is one to two orders of magnitude larger than the true absorption rate constant. In addition, aqueous flow accounts for most of the drug removal. Thus, major effects on absorption and elimination, independent of the drug structure, suggest the possibility of similar pharmacokinetics for vastly different drugs.
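
    A generic four-compartment catenary chain of the kind described can be written as linear first-order ODEs. The rate constants below are hypothetical, chosen only so that precorneal loss dominates true absorption (echoing the abstract); they are not the paper's fitted values.

```python
# Generic catenary chain: precorneal -> cornea -> aqueous -> lens/vitreous.
import numpy as np
from scipy.integrate import solve_ivp

k01 = 0.5    # precorneal loss (drainage, etc.), 1/min -- dominant sink
k12 = 0.01   # precorneal -> cornea (true absorption), 1/min
k23 = 0.05   # cornea -> aqueous humor
k34 = 0.005  # aqueous -> lens/vitreous
k30 = 0.03   # elimination from aqueous (aqueous flow)

def rhs(t, y):
    c1, c2, c3, c4 = y
    return [-(k01 + k12) * c1,
            k12 * c1 - k23 * c2,
            k23 * c2 - (k34 + k30) * c3,
            k34 * c3]

sol = solve_ivp(rhs, (0, 240), [1.0, 0, 0, 0], dense_output=True)
t = np.linspace(0, 240, 5)
print(np.round(sol.sol(t), 4))   # rows: compartments, cols: time points
```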

  13. The influence of a time-varying least squares parametric model when estimating SFOAEs evoked with swept-frequency tones

    NASA Astrophysics Data System (ADS)

    Hajicek, Joshua J.; Selesnick, Ivan W.; Henin, Simon; Talmadge, Carrick L.; Long, Glenis R.

    2018-05-01

    Stimulus frequency otoacoustic emissions (SFOAEs) were evoked and estimated using swept-frequency tones with and without the use of swept suppressor tones. SFOAEs were estimated using a least-squares fitting procedure. The estimated SFOAEs for the two paradigms (with- and without-suppression) were similar in amplitude and phase. The fitting procedure minimizes the square error between a parametric model of total ear-canal pressure (with unknown amplitudes and phases) and ear-canal pressure acquired during each paradigm. Modifying the parametric model to allow SFOAE amplitude and phase to vary over time revealed additional amplitude and phase fine structure in the without-suppressor, but not the with-suppressor paradigm. The use of a time-varying parametric model to estimate SFOAEs without-suppression may provide additional information about cochlear mechanics not available when using a with-suppressor paradigm.

  14. Regression analysis of informative current status data with the additive hazards model.

    PubMed

    Zhao, Shishun; Hu, Tao; Ma, Ling; Wang, Peijie; Sun, Jianguo

    2015-04-01

    This paper discusses regression analysis of current status failure time data arising from the additive hazards model in the presence of informative censoring. Many methods have been developed for regression analysis of current status data under various regression models if the censoring is noninformative, and also there exists a large literature on parametric analysis of informative current status data in the context of tumorgenicity experiments. In this paper, a semiparametric maximum likelihood estimation procedure is presented and in the method, the copula model is employed to describe the relationship between the failure time of interest and the censoring time. Furthermore, I-splines are used to approximate the nonparametric functions involved and the asymptotic consistency and normality of the proposed estimators are established. A simulation study is conducted and indicates that the proposed approach works well for practical situations. An illustrative example is also provided.
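
    The data structure at issue is easy to simulate: latent failure times following an additive hazards model, observed only through an examination time and a current-status indicator, with the examination time itself depending on the covariate (informative censoring). The sketch below uses a constant baseline hazard and invented parameters.

```python
# Simulate current status data under lambda(t|Z) = lambda0 + beta*Z.
# With a constant baseline, T | Z ~ Exponential(lambda0 + beta*Z).
import numpy as np

rng = np.random.default_rng(42)
n, lam0, beta = 500, 0.05, 0.08
Z = rng.binomial(1, 0.5, n)                  # a binary covariate
T = rng.exponential(1 / (lam0 + beta * Z))   # latent failure times

# Informative censoring: examination time correlated with Z
C = rng.exponential(1 / (0.04 + 0.03 * Z))
delta = (T <= C).astype(int)                 # all that is observed
print("observed failure fraction by Z:",
      delta[Z == 0].mean(), delta[Z == 1].mean())
```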

  15. Hydrodynamics with strength: scaling-invariant solutions for elastic-plastic cavity expansion models

    NASA Astrophysics Data System (ADS)

    Albright, Jason; Ramsey, Scott; Baty, Roy

    2017-11-01

    Spherical cavity expansion (SCE) models are used to describe idealized detonation and high-velocity impact in a variety of materials. The common theme in SCE models is the presence of a pressure-driven cavity or void within a domain comprised of plastic and elastic response sub-regions. In past work, the yield criterion characterizing material strength in the plastic sub-region is usually taken for granted and assumed to take a known functional form restrictive to certain classes of materials, e.g. ductile metals or brittle geologic materials. Our objective is to systematically determine a general functional form for the yield criterion under the additional requirement that the SCE admits a similarity solution. Solutions determined under this additional requirement have immediate implications toward development of new compressible flow algorithm verification test problems. However, more importantly, these results also provide novel insight into modeling the yield criteria from the perspective of hydrodynamic scaling.

  16. Amplification of postwildfire peak flow by debris

    USGS Publications Warehouse

    Kean, Jason W.; McGuire, Luke; Rengers, Francis K.; Smith, Joel B.; Staley, Dennis M.

    2016-01-01

    In burned steeplands, the peak depth and discharge of postwildfire runoff can substantially increase from the addition of debris. Yet methods to estimate the increase over water flow are lacking. We quantified the potential amplification of peak stage and discharge using video observations of postwildfire runoff, compiled data on postwildfire peak flow (Qp), and a physically based model. Comparison of flood and debris flow data with similar distributions in drainage area (A) and rainfall intensity (I) showed that the median runoff coefficient (C = Qp/AI) of debris flows is 50 times greater than that of floods. The striking increase in Qp can be explained using a fully predictive model that describes the additional flow resistance caused by the emergence of coarse-grained surge fronts. The model provides estimates of the amplification of peak depth, discharge, and shear stress needed for assessing postwildfire hazards and constraining models of bedrock incision.

  17. Quantitative assessment of Vulnerability of Forest ecosystem to Climate Change in Korea

    NASA Astrophysics Data System (ADS)

    Byun, J.; Lee, W.; Choi, S.; Oh, S.; Climate Change Model Team

    2011-12-01

    The purpose of this study was to assess the vulnerability of forest ecosystems to climate change in Korea using outputs of vegetation models (HyTAG and MC1) and socio-ecological indicators. It also suggested adaptation strategies in forest management through analysis of three vulnerability components: exposure, sensitivity, and adaptive capacity. For the model simulation of past years (1971-2000), the climatic data were prepared by the Korea Meteorological Administration (KMA). For the future simulation, the Fifth-Generation NCAR/Penn State Mesoscale Model (MM5), coupled with an atmosphere-ocean circulation model (ECHO-G), provided the future climatic data under the A1B scenario. HyTAG (Hydrological and Thermal Analogy Groups), a Korean model of forest distribution on a regional scale, could show the extent of sensitivity and adaptive capacity in connection with the changing frequency and direction of vegetation. The MC1 model could provide the variation and direction of NPP (Net Primary Production) and SCS (Soil Carbon Storage). The sensitivity and adaptive capacity were evaluated for each indicator. Besides indicators from the models, many other indicators, such as financial resources and number of officers, were included in the vulnerability components. As a result of the vulnerability assessment, the southwestern part of Korea and Jeju Island had relatively high vulnerability, a finding that likely reflects differences in adaptive capacity. Using these results, we could propose actions against climate change and develop decision-making systems for forest management.

  18. Improving measurement of injection drug risk behavior using item response theory.

    PubMed

    Janulis, Patrick

    2014-03-01

    Recent research highlights the multiple steps involved in preparing and injecting drugs and the resultant viral threats faced by drug users, suggesting that more sensitive measurement of injection drug HIV risk behavior is required. In addition, growing evidence suggests there are gender differences in injection risk behavior; however, the potential for differential item functioning between genders has not been explored. The aims of this study were to explore item response theory as an improved measurement modeling technique that provides empirically justified scaling of injection risk behavior and to examine potential gender-based differential item functioning. Data are used from three studies in the National Institute on Drug Abuse's Criminal Justice Drug Abuse Treatment Studies. A two-parameter item response theory model was used to scale injection risk behavior, and logistic regression was used to examine for differential item functioning. Item fit statistics suggest that item response theory can be used to scale injection risk behavior and that these models can provide more sensitive estimates of risk behavior. Additionally, gender-based differential item functioning is present in the current data. Improved measurement of injection risk behavior using item response theory should be encouraged, as these models provide increased congruence between construct measurement and the complexity of injection-related HIV risk. Suggestions are made to further improve injection risk behavior measurement. Furthermore, results suggest direct comparisons of composite scores between males and females may be misleading, and future work should account for differential item functioning before comparing levels of injection risk behavior.
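
    A minimal version of the approach: a two-parameter logistic (2PL) response function, simulated responses with built-in DIF, and the usual logistic-regression check for uniform DIF. All data are simulated, not drawn from the cited studies.

```python
# 2PL item response function plus a logistic-regression DIF check.
import numpy as np
import statsmodels.api as sm

def p_2pl(theta, a, b):
    """P(endorse item | trait theta), discrimination a, difficulty b."""
    return 1 / (1 + np.exp(-a * (theta - b)))

rng = np.random.default_rng(7)
n = 1000
gender = rng.binomial(1, 0.5, n)
theta = rng.normal(0, 1, n)
# Build in DIF: the item is "easier" for one group at equal theta
b_item = np.where(gender == 1, -0.3, 0.2)
y = rng.binomial(1, p_2pl(theta, a=1.5, b=b_item))

# DIF check: does group predict the response after adjusting for theta?
X = sm.add_constant(np.column_stack([theta, gender]))
fit = sm.Logit(y, X).fit(disp=0)
print(fit.params)   # a clearly nonzero gender coefficient suggests DIF
```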

  19. College Campus Community Readiness to Address Intimate Partner Violence Among LGBTQ+ Young Adults: A Conceptual and Empirical Examination.

    PubMed

    Edwards, Katie M; Littleton, Heather L; Sylaska, Kateryna M; Crossman, Annie L; Craig, Meghan

    2016-09-01

    This paper provides an overview of a conceptual model that integrates theories of social ecology, minority stress, and community readiness to better understand risk for and outcomes of intimate partner violence (IPV) among LGBTQ+ college students. Additionally, online survey data were collected from a sample of 202 LGBTQ+ students enrolled in 119 colleges across the United States to provide preliminary data on some aspects of the proposed model. Results suggested that students generally thought their campuses were low in readiness to address IPV; that is, students felt that their campuses could do more to address IPV and provide IPV services specific to LGBTQ+ college students. Perceptions of greater campus readiness to address IPV among LGBTQ+ college students were significantly and positively related to a more favorable LGBTQ+ campus climate and a greater sense of campus community. Additionally, IPV victims were more likely to perceive higher levels of campus community readiness than non-IPV victims. There was no association between IPV perpetration and perceptions of campus community readiness. Greater sense of community was marginally and inversely related to IPV victimization and perpetration. Sense of community and LGBTQ+ campus climate also varied to some extent as a function of region of the country and type of institution. Implications for further development and refinement of the conceptual model, as well as future research applying this model to better understand IPV among sexual minority students, are discussed. © Society for Community Research and Action 2016.

  20. Effect of Additional Incentives for Aviation Biofuels: Results from the Biomass Scenario Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vimmerstedt, Laura J; Newes, Emily K

    2017-12-05

    The National Renewable Energy Laboratory supported the Department of Energy, Bioenergy Technologies Office, with analysis of alternative jet fuels in collaboration with the U.S. Department of Transportation, Federal Aviation Administration. Airlines for America requested additional exploratory scenarios and analysis within the same FAA analytic framework, the Biomass Scenario Model. The results were presented at a public working meeting of the California Air Resources Board on including alternative jet fuel in the Low Carbon Fuel Standard on March 17, 2017 (https://www.arb.ca.gov/fuels/lcfs/lcfs_meetings/lcfs_meetings.htm). This presentation clarifies and annotates the slides from the public working meeting, and provides a link to the full data set. NREL does not advocate for or against the policies analyzed in this study.

  1. Evolution of solidification texture during additive manufacturing

    PubMed Central

    Wei, H. L.; Mazumder, J.; DebRoy, T.

    2015-01-01

    Striking differences in the solidification textures of a nickel based alloy owing to changes in laser scanning pattern during additive manufacturing are examined based on theory and experimental data. Understanding and controlling texture are important because it affects mechanical and chemical properties. Solidification texture depends on the local heat flow directions and competitive grain growth in one of the six <100> preferred growth directions in face centered cubic alloys. Therefore, the heat flow directions are examined for various laser beam scanning patterns based on numerical modeling of heat transfer and fluid flow in three dimensions. Here we show that numerical modeling can not only provide a deeper understanding of the solidification growth patterns during the additive manufacturing, it also serves as a basis for customizing solidification textures which are important for properties and performance of components. PMID:26553246

  2. Evolution of solidification texture during additive manufacturing

    DOE PAGES

    Wei, H. L.; Mazumder, J.; DebRoy, T.

    2015-11-10

    Striking differences in the solidification textures of a nickel based alloy owing to changes in laser scanning pattern during additive manufacturing are examined based on theory and experimental data. Understanding and controlling texture are important because it affects mechanical and chemical properties. Solidification texture depends on the local heat flow directions and competitive grain growth in one of the six <100> preferred growth directions in face centered cubic alloys. Furthermore, the heat flow directions are examined for various laser beam scanning patterns based on numerical modeling of heat transfer and fluid flow in three dimensions. Here we show that numerical modeling can not only provide a deeper understanding of the solidification growth patterns during the additive manufacturing, it also serves as a basis for customizing solidification textures which are important for properties and performance of components.

  3. Effect of Additional Incentives for Aviation Biofuels: Results from the Biomass Scenario Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vimmerstedt, Laura J; Newes, Emily K

    The National Renewable Energy Laboratory supported the Department of Energy, Bioenergy Technologies Office, with analysis of alternative jet fuels in collaboration with the U.S. Department of Transportation, Federal Aviation Administration. Airlines for America requested additional exploratory scenarios and analysis within the same FAA analytic framework, the Biomass Scenario Model. The results were presented at a public working meeting of the California Air Resources Board on including alternative jet fuel in the Low Carbon Fuel Standard on March 17, 2017 (https://www.arb.ca.gov/fuels/lcfs/lcfs_meetings/lcfs_meetings.htm). This presentation clarifies and annotates the slides from the public working meeting, and provides a link to the full data set. NREL does not advocate for or against the policies analyzed in this study.

  4. Optimal policies of non-cross-resistant chemotherapy on Goldie and Coldman's cancer model.

    PubMed

    Chen, Jeng-Huei; Kuo, Ya-Hui; Luh, Hsing Paul

    2013-10-01

    Mathematical models can be used to study the effects of chemotherapy on tumor cells. Notably, in 1979, Goldie and Coldman proposed the first mathematical model to relate the drug sensitivity of tumors to their mutation rates. Many scientists have since referred to this pioneering work because of its simplicity and elegance, and its original idea has been extended and further investigated in massive follow-up studies of cancer modeling and optimal treatment. Goldie and Coldman, together with Guaduskas, later used their model to explain why an alternating non-cross-resistant chemotherapy is optimal, using a simulation approach. Subsequently, in 1983, Goldie and Coldman proposed an extended stochastic based model and provided a rigorous mathematical proof to their earlier simulation work when the extended model is approximated by its quasi-approximation. However, Goldie and Coldman's analytic study of optimal treatments focused mainly on a process with symmetrical parameter settings and presented few theoretical results for asymmetrical settings. In this paper, we recast and restate Goldie, Coldman, and Guaduskas' model as a multi-stage optimization problem. Under an asymmetrical assumption, the conditions under which a treatment policy can be optimal are derived. The proposed framework enables us to consider some optimal policies on the model analytically. In addition, Goldie, Coldman and Guaduskas' work with symmetrical settings can be treated as a special case of our framework. Based on the derived conditions, this study provides an alternative proof to Goldie and Coldman's work. In addition to the theoretical derivation, numerical results are included to justify the correctness of our work. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Using Data-Driven Model-Brain Mappings to Constrain Formal Models of Cognition

    PubMed Central

    Borst, Jelmer P.; Nijboer, Menno; Taatgen, Niels A.; van Rijn, Hedderik; Anderson, John R.

    2015-01-01

    In this paper we propose a method to create data-driven mappings from components of cognitive models to brain regions. Cognitive models are notoriously hard to evaluate, especially based on behavioral measures alone. Neuroimaging data can provide additional constraints, but this requires a mapping from model components to brain regions. Although such mappings can be based on the experience of the modeler or on a reading of the literature, a formal method is preferred to prevent researcher-based biases. In this paper we used model-based fMRI analysis to create a data-driven model-brain mapping for five modules of the ACT-R cognitive architecture. We then validated this mapping by applying it to two new datasets with associated models. The new mapping was at least as powerful as an existing mapping that was based on the literature, and indicated where the models were supported by the data and where they have to be improved. We conclude that data-driven model-brain mappings can provide strong constraints on cognitive models, and that model-based fMRI is a suitable way to create such mappings. PMID:25747601

  6. Use of MERRA-2 in the National Solar Radiation Database and Beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Manajit; Lopez, Anthony; Habte, Aron

    The National Solar Radiation Database (NSRDB) is a flagship product of NREL that provides solar radiation and ancillary meteorological information through a GIS based portal. These data are provided at a 4km x 4km spatial and 30-minute temporal resolution covering the period 1998-2015. The gridded data distributed by the NSRDB are derived from satellite measurements using the Physical Solar Model (PSM), which takes a 2-stage approach: cloud properties are first retrieved from measurements by the GOES series of satellites, and that information is then used in a radiative transfer model to estimate solar radiation at the surface. In addition to the satellite data, the model requires ancillary meteorological information that is provided mainly by output from NASA's Modern Era Retrospective Analysis for Research and Applications (MERRA-2) model. This presentation provides an insight into how the NSRDB is developed using the PSM and how the various sources of data, including the MERRA-2 data, are used during the process.

  7. Efficient and Extensible Quasi-Explicit Modular Nonlinear Multiscale Battery Model: GH-MSMD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Gi-Heon; Smith, Kandler; Lawrence-Simon, Jake

    Complex physics and long computation time hinder the adoption of computer aided engineering models in the design of large-format battery cells and systems. A modular, efficient battery simulation model -- the multiscale multidomain (MSMD) model -- was previously introduced to aid the scale-up of Li-ion material and electrode designs to complete cell and pack designs, capturing electrochemical interplay with 3-D electronic current pathways and thermal response. Here, this paper enhances the computational efficiency of the MSMD model using a separation of time-scales principle to decompose model field variables. The decomposition provides a quasi-explicit linkage between the multiple length-scale domains and thus reduces time-consuming nested iteration when solving model equations across multiple domains. In addition to particle-, electrode- and cell-length scales treated in the previous work, the present formulation extends to bus bar- and multi-cell module-length scales. We provide example simulations for several variants of GH electrode-domain models.

  8. Efficient and Extensible Quasi-Explicit Modular Nonlinear Multiscale Battery Model: GH-MSMD

    DOE PAGES

    Kim, Gi-Heon; Smith, Kandler; Lawrence-Simon, Jake; ...

    2017-03-24

    Complex physics and long computation time hinder the adoption of computer aided engineering models in the design of large-format battery cells and systems. A modular, efficient battery simulation model -- the multiscale multidomain (MSMD) model -- was previously introduced to aid the scale-up of Li-ion material and electrode designs to complete cell and pack designs, capturing electrochemical interplay with 3-D electronic current pathways and thermal response. Here, this paper enhances the computational efficiency of the MSMD model using a separation of time-scales principle to decompose model field variables. The decomposition provides a quasi-explicit linkage between the multiple length-scale domains and thus reduces time-consuming nested iteration when solving model equations across multiple domains. In addition to particle-, electrode- and cell-length scales treated in the previous work, the present formulation extends to bus bar- and multi-cell module-length scales. We provide example simulations for several variants of GH electrode-domain models.

  9. A flow-simulation model of the tidal Potomac River

    USGS Publications Warehouse

    Schaffranek, Raymond W.

    1987-01-01

    A one-dimensional model capable of simulating flow in a network of interconnected channels has been applied to the tidal Potomac River including its major tributaries and embayments between Washington, D.C., and Indian Head, Md. The model can be used to compute water-surface elevations and flow discharges at any of 66 predetermined locations or at any alternative river cross sections definable within the network of channels. In addition, the model can be used to provide tidal-interchange flow volumes and to evaluate tidal excursions and the flushing properties of the riverine system. Comparisons of model-computed results with measured water-surface elevations and discharges demonstrate the validity and accuracy of the model. Tidal-cycle flow volumes computed by the calibrated model have been verified to be within an accuracy of ±10 percent. Quantitative characteristics of the hydrodynamics of the tidal river are identified and discussed. The comprehensive flow data provided by the model can be used to better understand the geochemical, biological, and other processes affecting the river's water quality.

  10. Evidence for the involvement of a nonlexical route in the repetition of familiar words: A comparison of single and dual route models of auditory repetition.

    PubMed

    Hanley, J Richard; Dell, Gary S; Kay, Janice; Baron, Rachel

    2004-03-01

    In this paper, we attempt to simulate the picture naming and auditory repetition performance of two patients reported by Hanley, Kay, and Edwards (2002), who were matched for picture naming score but who differed significantly in their ability to repeat familiar words. In Experiment 1, we demonstrate that the model of naming and repetition put forward by Foygel and Dell (2000) is better able to accommodate this pattern of performance than the model put forward by Dell, Schwartz, Martin, Saffran, and Gagnon (1997). Nevertheless, Foygel and Dell's model underpredicted the repetition performance of both patients. In Experiment 2, we attempt to simulate their performance using a new dual route model of repetition in which Foygel and Dell's model is augmented by an additional nonlexical repetition pathway. The new model provided a more accurate fit to the real-word repetition performance of both patients. It is argued that the results provide support for dual route models of auditory repetition.

  11. Change-in-ratio methods for estimating population size

    USGS Publications Warehouse

    Udevitz, Mark S.; Pollock, Kenneth H.; McCullough, Dale R.; Barrett, Reginald H.

    2002-01-01

    Change-in-ratio (CIR) methods can provide an effective, low-cost approach for estimating the size of wildlife populations. They rely on being able to observe changes in proportions of population subclasses that result from the removal of a known number of individuals from the population. These methods were first introduced in the 1940s to estimate the size of populations with 2 subclasses under the assumption of equal subclass encounter probabilities. Over the next 40 years, closed population CIR models were developed to consider additional subclasses and use additional sampling periods. Models with assumptions about how encounter probabilities vary over time, rather than between subclasses, also received some attention. Recently, all of these CIR models have been shown to be special cases of a more general model. Under the general model, information from additional samples can be used to test assumptions about the encounter probabilities and to provide estimates of subclass sizes under relaxations of these assumptions. These developments have greatly extended the applicability of the methods. CIR methods are attractive because they do not require the marking of individuals, and subclass proportions often can be estimated with relatively simple sampling procedures. However, CIR methods require a carefully monitored removal of individuals from the population, and the estimates will be of poor quality unless the removals induce substantial changes in subclass proportions. In this paper, we review the state of the art for closed population estimation with CIR methods. Our emphasis is on the assumptions of CIR methods and on identifying situations where these methods are likely to be effective. We also identify some important areas for future CIR research.
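
    The classical two-subclass estimator under equal encounter probabilities is a one-line formula; a worked example with invented numbers follows.

```python
# Classic two-subclass change-in-ratio estimator (equal encounter
# probabilities assumed). All numbers are invented.
p1 = 0.40        # pre-removal proportion of subclass x (e.g., males)
p2 = 0.25        # post-removal proportion of subclass x
Rx = 300         # known removals of subclass x
R = 400          # total known removals

# N1_hat = (Rx - R * p2) / (p1 - p2): initial population size
N1 = (Rx - R * p2) / (p1 - p2)
X1 = p1 * N1     # initial size of subclass x
print(f"N1={N1:.0f}, X1={X1:.0f}")
```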

  12. Competition among Li+, Na+, K+ and Rb+ Monovalent Ions for DNA in Molecular Dynamics Simulations using the Additive CHARMM36 and Drude Polarizable Force Fields

    PubMed Central

    Savelyev, Alexey; MacKerell, Alexander D.

    2015-01-01

    In the present study we report on interactions of and competition between monovalent ions for two DNA sequences in MD simulations. Efforts included the development and validation of parameters for interactions among the first-group monovalent cations, Li+, Na+, K+ and Rb+, and DNA in the Drude polarizable and additive CHARMM36 force fields (FF). The optimization process targeted gas-phase QM interaction energies of various model compounds with ions and osmotic pressures of bulk electrolyte solutions of chemically relevant ions. The optimized ionic parameters are validated against counterion condensation theory and buffer exchange-atomic emission spectroscopy measurements providing quantitative data on the competitive association of different monovalent ions with DNA. Comparison between experimental and MD simulation results demonstrates that, compared to the additive CHARMM36 model, the Drude FF provides an improved description of the general features of the ionic atmosphere around DNA and leads to closer agreement with experiment on the ionic competition within the ion atmosphere. Results indicate the importance of extended simulation systems on the order of 25 Å beyond the DNA surface to obtain proper convergence of ion distributions. PMID:25751286

  13. Dynamic analysis of gas-core reactor system

    NASA Technical Reports Server (NTRS)

    Turner, K. H., Jr.

    1973-01-01

    A heat transfer analysis was incorporated into a previously developed model CODYN to obtain a model of open-cycle gaseous core reactor dynamics which can predict the heat flux at the cavity wall. The resulting model was used to study the sensitivity of the model to the value of the reactivity coefficients and to determine the system response for twenty specified perturbations. In addition, the model was used to study the effectiveness of several control systems in controlling the reactor. It was concluded that control drums located in the moderator region capable of inserting reactivity quickly provided the best control.

  14. A Survey of Long-Range Forecasting Models and Data Resources: A Method for Their Application at the Department of Defense.

    DTIC Science & Technology

    1979-08-08

    confident analysis or prediction. Still, the behavioralist models do provide a basis for comparison and analysis of real-world environments. In addition ... p.236. Environmental - the lowest level, encompassing man's physical environment (climate, land, water, air, and physical resources); also ... analysis. The food model report is based on two postulates: a. It is reasonable to review agriculture in an ecosystems framework *Mesarovic, M., and Pestel

  15. The Primary Care Computer Simulation: Optimal Primary Care Manager Empanelment.

    DTIC Science & Technology

    1997-05-01

    explored in which a team consisted of two providers, two nurses, and a nurse aide. Each team had a specific exam room assigned to them. Additionally, a ... team consisting of one provider, one nurse, and one nurse aide was simulated. The model also examined the effects of adding two exam rooms. The study ... minutes. The optimal solution, which reduced patient time to below 90 minutes, was the mix of one provider, a nurse, and a nurse aide in which each

  16. Geothermal Case Studies

    DOE Data Explorer

    Young, Katherine

    2014-09-30

    database.) In fiscal year 2015, NREL is working with universities to populate additional case studies on OpenEI. The goal is to provide a large enough dataset to start conducting analyses of exploration programs to identify correlations between successful exploration plans for areas with similar geologic occurrence models.

  17. Extracellular Matrix Biomarkers for Diagnosis, Prognosis, Imaging, and Targeting

    DTIC Science & Technology

    2015-09-01

    collaboration with the Lindquist lab. Funding Support: Please see previously provided other support and changes noted below. Name: Doris Tabassum ... Project: Doris Tabassum has generated cell line models of heterogeneity with different metastatic capability. Additional funds from Dr. Polyak's grants

  18. Incorporating Psychological Predictors of Treatment Response into Health Economic Simulation Models: A Case Study in Type 1 Diabetes.

    PubMed

    Kruger, Jen; Pollard, Daniel; Basarir, Hasan; Thokala, Praveen; Cooke, Debbie; Clark, Marie; Bond, Rod; Heller, Simon; Brennan, Alan

    2015-10-01

    Health economic modeling has paid limited attention to the effects that patients' psychological characteristics have on the effectiveness of treatments. This case study tests 1) the feasibility of incorporating psychological prediction models of treatment response within an economic model of type 1 diabetes, 2) the potential value of providing treatment to a subgroup of patients, and 3) the cost-effectiveness of providing treatment to a subgroup of responders defined using 5 different algorithms. Multiple linear regressions were used to investigate relationships between patients' psychological characteristics and treatment effectiveness. Two psychological prediction models were integrated with a patient-level simulation model of type 1 diabetes. Expected value of individualized care analysis was undertaken. Five different algorithms were used to provide treatment to a subgroup of predicted responders. A cost-effectiveness analysis compared using the algorithms to providing treatment to all patients. The psychological prediction models had low predictive power for treatment effectiveness. Expected value of individualized care results suggested that targeting education at responders could be of value. The cost-effectiveness analysis suggested, for all 5 algorithms, that providing structured education to a subgroup of predicted responders would not be cost-effective. The psychological prediction models tested did not have sufficient predictive power to make targeting treatment cost-effective. The psychological prediction models are simple linear models of psychological behavior. Collection of data on additional covariates could potentially increase statistical power. By collecting data on psychological variables before an intervention, we can construct predictive models of treatment response to interventions. These predictive models can be incorporated into health economic models to investigate more complex service delivery and reimbursement strategies. © The Author(s) 2015.
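
    The targeting logic can be caricatured in a few lines: predict response from psychological covariates, treat only patients whose predicted benefit clears a threshold, and tally benefit against cost. Everything below is simulated and unrelated to the study's data.

```python
# Toy "treat only predicted responders" algorithm on simulated data.
import numpy as np

rng = np.random.default_rng(3)
n = 2000
psych = rng.normal(0, 1, (n, 2))              # e.g., beliefs, self-efficacy
true_effect = 0.2 + 0.1 * psych[:, 0] + rng.normal(0, 0.5, n)

# Weak predictive model (low R^2), as reported in the abstract
pred = 0.2 + 0.1 * psych[:, 0] + rng.normal(0, 0.3, n)

treat = pred > 0.25                           # one possible algorithm
gain = true_effect[treat].sum()               # total benefit delivered
cost = treat.sum() * 1.0                      # unit cost per treatment
print(f"treated={treat.sum()}, mean benefit={gain / treat.sum():.3f}, "
      f"cost={cost:.0f}")
```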

  19. Improving homology modeling of G-protein coupled receptors through multiple-template derived conserved inter-residue interactions

    NASA Astrophysics Data System (ADS)

    Chaudhari, Rajan; Heim, Andrew J.; Li, Zhijun

    2015-05-01

    As evidenced by the three rounds of G-protein coupled receptor (GPCR) Dock competitions, improving homology modeling methods for helical transmembrane proteins, including the GPCRs, based on templates of low sequence identity remains a prominent challenge. Current approaches addressing this challenge adopt the philosophy of "modeling first, refinement next". In the present work, we developed an alternative modeling approach through the novel application of available multiple templates. First, conserved inter-residue interactions are derived from each additional template through conservation analysis of each template-target pairwise alignment. Then, these interactions are converted into distance restraints and incorporated in the homology modeling process. This approach was applied to modeling of the human β2 adrenergic receptor using bovine rhodopsin and the human protease-activated receptor 1 as templates, and improved model quality was demonstrated compared to the homology models generated by standard single-template and multiple-template methods. This method of "refined restraints first, modeling next" provides a fast and complementary alternative to current modeling approaches. It allows rational identification and implementation of additional conserved distance restraints extracted from multiple templates and/or experimental data, and has the potential to be applicable to modeling of all helical transmembrane proteins.
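
    A schematic of the restraint-derivation step, assuming (hypothetically) that each template contributes a boolean contact map in target numbering: keep contacts present in all templates and emit them as Gaussian distance restraints. The maps and the restraint tuple format are invented placeholders for whatever the modeling engine expects.

```python
# Derive "conserved" contacts from two template contact maps and emit
# them as (i, j, mean_distance, sd) restraint tuples.
import numpy as np

L = 50                                   # aligned target length (toy)
rng = np.random.default_rng(5)
tmplA = rng.random((L, L)) < 0.05        # boolean contact maps for two
tmplB = rng.random((L, L)) < 0.05        # templates, in target numbering

conserved = np.triu(tmplA & tmplB, k=3)  # contacts present in both,
                                         # skipping near-diagonal pairs

# Each conserved contact becomes a Gaussian distance restraint that a
# modeling engine could consume (distance and sd are placeholders).
restraints = [(i, j, 8.0, 0.5) for i, j in zip(*np.where(conserved))]
print(len(restraints), restraints[:3])
```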

  20. Computational comparison of quantum-mechanical models for multistep direct reactions

    NASA Astrophysics Data System (ADS)

    Koning, A. J.; Akkermans, J. M.

    1993-02-01

    We have carried out a computational comparison of all existing quantum-mechanical models for multistep direct (MSD) reactions. The various MSD models, including the so-called Feshbach-Kerman-Koonin, Tamura-Udagawa-Lenske and Nishioka-Yoshida-Weidenmüller models, have been implemented in a single computer system. All model calculations thus use the same set of parameters and the same numerical techniques; only one adjustable parameter is employed. The computational results have been compared with experimental energy spectra and angular distributions for several nuclear reactions, namely, 90Zr(p,p') at 80 MeV, 209Bi(p,p') at 62 MeV, and 93Nb(n,n') at 25.7 MeV. In addition, the results have been compared with the Kalbach systematics and with semiclassical exciton model calculations. All quantum MSD models provide a good fit to the experimental data. In addition, they reproduce the systematics very well and are clearly better than semiclassical model calculations. We furthermore show that the calculated predictions do not differ very strongly between the various quantum MSD models, leading to the conclusion that the simplest MSD model (the Feshbach-Kerman-Koonin model) is adequate for the analysis of experimental data.

  1. α-Fluoro-α-nitro(phenylsulfonyl)methane as a fluoromethyl pronucleophile: Efficient stereoselective Michael addition to chalcones

    PubMed Central

    Prakash, G. K. Surya; Wang, Fang; Stewart, Timothy; Mathew, Thomas; Olah, George A.

    2009-01-01

    Highly efficient stereoselective 1,4-addition of racemic α-fluoro-α-nitro(phenylsulfonyl)methane (FNSM) as a fluoromethyl pronucleophile to α,β-unsaturated ketones using a wide range of chiral organobifunctional catalysts under moderate conditions in the absence of an additional base has been achieved. A series of catalysts was screened for the enantioselective addition of FNSM to chalcones and the catalysts CN I, CD I, QN I-IV, and QD I were found to enable this reaction, successfully providing exclusive 1,4-addition products stereoselectively in high yields (conversion, diastereomeric ratio, and enantiomeric excess). Studies involving a model reaction and systematic analysis of the absolute configuration support the suggested mechanism. PMID:19237559

  2. Evaluation of Hydrologic Simulations Developed Using Multi-Model Synthesis and Remotely-Sensed Data within a Portfolio of Calibration Strategies

    NASA Astrophysics Data System (ADS)

    Lafontaine, J.; Hay, L.; Markstrom, S. L.

    2016-12-01

    The United States Geological Survey (USGS) has developed a National Hydrologic Model (NHM) to support coordinated, comprehensive and consistent hydrologic model development, and facilitate the application of hydrologic simulations within the conterminous United States (CONUS). As many stream reaches in the CONUS are either not gaged, or are substantially impacted by water use or flow regulation, ancillary information must be used to determine reasonable parameter estimations for streamflow simulations. Hydrologic models for 1,576 gaged watersheds across the CONUS were developed to test the feasibility of improving streamflow simulations by linking physically-based hydrologic models with remotely-sensed data products (e.g. snow water equivalent). Initially, the physically-based models were calibrated to measured streamflow data to provide a baseline for comparison across multiple calibration strategy tests. In addition, not all ancillary datasets are appropriate for application to all parts of the CONUS (e.g. snow water equivalent in the southeastern U.S., where snow is a rarity). As it is not expected that any one data product or model simulation will be sufficient for representing hydrologic behavior across the entire CONUS, a systematic evaluation of which data products improve hydrologic simulations for various regions across the CONUS was performed. The resulting portfolio of calibration strategies can be used to guide selection of an appropriate combination of modeled and measured information for hydrologic model development and calibration. In addition, these calibration strategies have been developed to be flexible so that new data products can be assimilated. This analysis provides a foundation to understand how well models work when sufficient streamflow data are not available and could be used to further inform hydrologic model parameter development for ungaged areas.

  3. Development and validation of age-dependent FE human models of a mid-sized male thorax.

    PubMed

    El-Jawahri, Raed E; Laituri, Tony R; Ruan, Jesse S; Rouhana, Stephen W; Barbat, Saeed D

    2010-11-01

    The increasing number of people over 65 years old (YO) is an important research topic in the area of impact biomechanics, and finite element (FE) modeling can provide valuable support for related research. There were three objectives of this study: (1) Estimation of the representative age of the previously-documented Ford Human Body Model (FHBM) -- an FE model which approximates the geometry and mass of a mid-sized male, (2) Development of FE models representing two additional ages, and (3) Validation of the resulting three models to the extent possible with respect to available physical tests. Specifically, the geometry of the model was compared to published data relating rib angles to age, and the mechanical properties of different simulated tissues were compared to a number of published aging functions. The FHBM was determined to represent a 53-59 YO mid-sized male. The aforementioned aging functions were used to develop FE models representing two additional ages: 35 and 75 YO. The rib model was validated against human rib specimens and whole rib tests, under different loading conditions, with and without modeled fracture. In addition, the resulting three age-dependent models were validated by simulating cadaveric tests of blunt and sled impacts. The responses of the models, in general, were within the cadaveric response corridors. When compared to peak responses from individual cadavers similar in size and age to the age-dependent models, some responses were within one standard deviation of the test data. All the other responses, but one, were within two standard deviations.
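
    The mechanics of applying an aging function are simple to sketch: scale a reference material property by age before writing the FE input. The linear form and the numbers below are placeholders, not the published aging functions used for the model variants.

```python
# Hedged sketch: scale a reference rib cortical modulus by age before
# it is written into an FE input deck. Form and values are invented.
def cortical_modulus_gpa(age, e_ref=11.0, age_ref=55, decline_per_decade=0.05):
    """Elastic modulus scaled linearly away from a reference age."""
    return e_ref * (1 - decline_per_decade * (age - age_ref) / 10)

for age in (35, 55, 75):   # the three model ages in the abstract
    print(age, round(cortical_modulus_gpa(age), 2))
```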

  4. Numerical Simulation of Shock-Dispersed Fuel Charges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bell, John B.; Day, Marcus; Beckner, Vincent

    Successfully attacking underground storage facilities for chemical and biological (C/B) weapons is an important mission area for the Department of Defense. The fate of a C/B agent during an attack depends critically on the pressure and thermal environment that the agent experiences. The initial environment is determined by the blast wave from an explosive device. The byproducts of the detonation provide a fuel source that burn when mixed with oxidizer (after burning). Additional energy can be released by the ignition of the C/B agent as it mixes with the explosion products and the air in the chamber. Hot plumes venting material from any openings in the chamber can provide fuel for additional energy release when mixed with additional oxidizer. Assessment of the effectiveness of current explosives as well as the development of new explosive systems requires a detailed understanding of all of these modes of energy release. Using methodologies based on the use of higher-order Godunov schemes combined with Adaptive Mesh Refinement (AMR), implemented in a parallel adaptive framework suited to the massively parallel computer systems provided by the DOD High-Performance Computing Modernization program, we use a suite of programs to develop predictive models for the simulation of the energetics of blast waves, deflagration waves and ejecta plumes. The programs use realistic reaction kinetic and thermodynamic models provided by standard components (such as CHEMKIN) as well as other novel methods to model enhanced explosive devices. The work described here focuses on the validation of these models against a series of bomb calorimetry experiments performed at the Ernst-Mach Institute. In this paper, we present three-dimensional simulations of the experiments, examining the explosion dynamics and the role of subsequent burning on the explosion products on the thermal and pressure environment within the calorimeter. The effects of burning are quantified by comparing two sets of computations, one in which the calorimeter is filled with nitrogen so there is no after burning and a second in which the calorimeter contains air.

  5. The impact of clustering of extreme European windstorm events on (re)insurance market portfolios

    NASA Astrophysics Data System (ADS)

    Mitchell-Wallace, Kirsten; Alvarez-Diaz, Teresa

    2010-05-01

    Traditionally, the occurrence of windstorm loss events in Europe has been treated as independent. However, a number of significant losses close in space and time indicate that this assumption may need to be revised. Under particular atmospheric conditions, multiple loss-causing cyclones can occur in succession, affecting similar geographic regions and, therefore, insurance markets. A notable example is the pair Lothar and Martin, which struck France in December 1999. Although the existence of cyclone families is well known to meteorologists, there has been limited research into the occurrence of serial windstorms. However, climate modelling research is now providing the ability to explore the physical drivers of clustering and to improve understanding of the hazard component of catastrophe modelling. While analytics tools, including catastrophe models, may incorporate assumptions about dependency through statistical means, the most recent research outputs provide a new strand of information with the potential to reassess probabilistic loss potential in light of clustering and to provide an additional view on probable maximum losses to windstorm-exposed portfolios across regions such as Northwest Europe. There is, however, a need to test these new techniques within operational (re)insurance applications. This paper provides an overview of the most current clustering research, including the 2009 paper by Vitolo et al., in relation to reinsurance risk modelling, and assesses the potential impact of such additional information on the overall risk assessment process. We examine the consequences of the serial clustering of extra-tropical cyclones demonstrated by Vitolo et al. (2009) from the perspective of a large European reinsurer, considering the potential implications for pricing, accumulation, and capital adequacy.
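
    A minimal Monte Carlo sketch of why clustering matters for the loss tail: annual storm counts are drawn from an independent (Poisson) process and from an overdispersed (negative binomial) process with the same mean, in the spirit of the dispersion statistics used in the clustering literature. All parameters and the per-storm loss distribution are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(1)
      years, mean_storms, dispersion = 20_000, 4.0, 0.5    # illustrative values

      poisson_counts = rng.poisson(mean_storms, years)
      # Negative binomial with variance = mean * (1 + mean/r): overdispersed
      r = 1.0 / dispersion
      nb_counts = rng.negative_binomial(r, r / (r + mean_storms), years)

      def annual_loss(counts):
          # Lognormal per-storm loss, illustrative parameters only
          return np.array([rng.lognormal(0.0, 1.0, c).sum() for c in counts])

      for name, counts in (("independent", poisson_counts), ("clustered", nb_counts)):
          losses = annual_loss(counts)
          print(f"{name:12s} mean={losses.mean():.2f}  "
                f"1-in-200-year loss={np.quantile(losses, 0.995):.2f}")

    With identical mean storm frequency, the clustered process produces a noticeably heavier annual-loss tail, which is precisely the quantity driving reinsurance pricing and capital adequacy.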

  6. Simulation Modeling and Performance Evaluation of Space Networks

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Segui, John

    2006-01-01

    In space exploration missions, the coordinated use of spacecraft as communication relays increases the efficiency of the endeavors. To conduct trade-off studies of the performance and resource usage of different communication protocols and network designs, JPL designed a comprehensive, extendable tool: the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE). The design and development of MACHETE began in 2000, and it is constantly evolving. Currently, MACHETE contains Consultative Committee for Space Data Systems (CCSDS) protocol standards such as Proximity-1, Advanced Orbiting Systems (AOS), Packet Telemetry/Telecommand, Space Communications Protocol Specification (SCPS), and the CCSDS File Delivery Protocol (CFDP). MACHETE uses the Aerospace Corporation's Satellite Orbital Analysis Program (SOAP) to generate orbital geometry information and contact opportunities, and Matlab scripts provide the link characteristics. At the core of MACHETE is a discrete event simulator, QualNet. Delay Tolerant Networking (DTN) is an end-to-end architecture providing communication in and/or through highly stressed networking environments, including those with intermittent connectivity, large and/or variable delays, and high bit error rates. To provide its services, the DTN protocols reside at the application layer of the constituent internets, forming a store-and-forward overlay network. The key capabilities of the bundling protocols include custody-based reliability, the ability to cope with intermittent connectivity, the ability to take advantage of scheduled and opportunistic connectivity, and late binding of names to addresses. In this presentation, we report on the addition of the MACHETE models needed to support DTN, namely the Bundle Protocol (BP) model. To illustrate the use of MACHETE with the additional DTN model, we provide an example simulation to benchmark its performance. We demonstrate the use of the DTN protocol and discuss statistics gathered concerning the total time needed to simulate numerous bundle transmissions.
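
    The store-and-forward behaviour at the heart of DTN can be illustrated with a few lines of Python: bundles queue until a scheduled contact window opens, then custody is transferred at a fixed rate. This is a conceptual sketch only, not the MACHETE/QualNet Bundle Protocol model; the contact windows, rate, and arrival times are invented.

      from collections import deque

      CONTACTS = [(10, 15), (40, 48), (90, 100)]    # hypothetical contact windows (s)
      RATE = 2                                       # bundles forwarded per second

      def simulate(bundle_times):
          """Return per-bundle delivery times given creation times (seconds)."""
          queue, delivered = deque(sorted(bundle_times)), {}
          for start, end in CONTACTS:
              t = start
              while queue and t < end:
                  t = max(t, queue[0])              # wait until the bundle exists
                  if t >= end:
                      break
                  delivered[queue.popleft()] = t    # custody transferred at time t
                  t += 1.0 / RATE
          return delivered

      arrivals = [0, 5, 12, 30, 60, 61, 62]
      for created, done in simulate(arrivals).items():
          print(f"bundle created t={created:3d}s delivered t={done:6.1f}s "
                f"(stored {done - created:5.1f}s)")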

  7. Mitigating Provider Uncertainty in Service Provision Contracts

    NASA Astrophysics Data System (ADS)

    Smith, Chris; van Moorsel, Aad

    Uncertainty is an inherent property of open, distributed, and multiparty systems. The viability of the mutually beneficial relationships which motivate these systems relies on rational decision-making by each constituent party under uncertainty. Service provision in distributed systems is one such relationship. The service provider experiences uncertainty in its ability to deliver a service with selected quality-level guarantees due to inherent non-determinism, such as load fluctuations and hardware failures. Statistical estimators utilized to model this non-determinism introduce additional uncertainty through sampling error. If the provider cannot accurately model and analyze uncertainty in the quality-level guarantees, sub-optimal service provision contracts may be formed, with consequences including loss of revenue, inefficient resource utilization, and erosion of reputation and consumer trust. We propose a utility model for contract-based service provision that provides a systematic approach to forming optimal service provision contracts under uncertainty. Performance prediction methods that enable the derivation of statistical estimators for quality levels are introduced, with analysis of their resulting accuracy and cost.
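
    The flavour of such a utility model can be sketched as follows: the provider estimates the violation probability of a latency guarantee from a finite sample of observations and picks the guarantee that maximizes expected per-request utility. The price function, penalty, and latency distribution are all invented for illustration; this is not the paper's formal model.

      import numpy as np

      rng = np.random.default_rng(7)
      sample = rng.gamma(shape=2.0, scale=50.0, size=200)   # observed latencies (ms)
      PENALTY = 5.0                                         # cost per violated request

      def expected_utility(g):
          price = 2.0 * np.exp(-g / 200.0)       # tighter guarantees command more
          p_violate = np.mean(sample > g)        # empirical violation probability
          return price - PENALTY * p_violate

      candidates = np.linspace(50, 400, 36)
      best = max(candidates, key=expected_utility)
      print(f"best guarantee {best:.0f} ms, "
            f"expected utility {expected_utility(best):.3f} per request")

    Note that the empirical estimator of the violation probability itself carries sampling error, which is exactly the additional layer of uncertainty the paper addresses.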

  8. Forays in flavor

    NASA Astrophysics Data System (ADS)

    Perez, M. Jay

    This dissertation summarizes four works investigating questions of flavor in the Standard Model and beyond. Drawing on the ideas of Grand Unification to unify quarks and leptons, the Seesaw Mechanism to explain the generation of neutrino masses, and the current data available from flavor observables, a framework called the "Flavor Ring" is introduced. Its aim is to bring as many theoretical tools as possible to bear on the flavor puzzle of the Standard Model, providing additional constraints on Supersymmetric models of flavor which employ family symmetries. It is first applied to the ΔI_W = 1/2 mass matrices of the quarks and charged leptons, where we use the additional constraints provided by the flavor ring and the family group Z7 x Z3 to perform a numerical search for phenomenologically allowed down-quark and charged-lepton Yukawa matrices. Using the Seesaw Mechanism and relations from SO(10), we then consider the implications of the flavor-ring framework for the ΔI_W = 0 Majorana mass matrix M of right-handed neutrinos and the Supersymmetric mu-mass matrix of a family of Higgs fields. We find a special form for M which predicts a normal hierarchy and the values of the light neutrino masses, and a mu-matrix with an incredible hierarchy of thirteen orders of magnitude; both are produced naturally by a simple underlying theory invariant under the family symmetry PSL2(7). We close with an examination of the role the additional heavy Higgs flavors may play in phenomenology, exploring a toy model in which they serve as the messengers of Supersymmetry breaking.

  9. Graft function assessment in mouse models of single- and dual-kidney transplantation.

    PubMed

    Wang, Lei; Wang, Ximing; Jiang, Shan; Wei, Jin; Buggs, Jacentha; Fu, Liying; Zhang, Jie; Liu, Ruisheng

    2018-05-23

    Animal models of kidney transplantation (KTX) are widely used for studying the immune response of hosts to implanted grafts. Additionally, KTX can be used to generate kidney-specific knockout animal models by transplanting kidneys from donors with global knockout of a gene into wild-type recipients, or vice versa. Dual-kidney transplantation (DKT) provides a more physiological environment for recipients than single-kidney transplantation (SKT). However, DKT in mice is rare due to technical challenges. In this study, we successfully performed DKT in mice and compared the hemodynamic response and graft function with SKT. The surgical time, complication rate, and survival rate of DKT were not significantly different from those of SKT, with survival rates above 85%. Mice with DKT showed less injury and quicker recovery, with lower plasma creatinine (Pcr) and higher GFR than SKT mice (Pcr = 0.34 and 0.17 mg/dl in DKT vs. 0.50 and 0.36 mg/dl in SKT at 1 and 3 days, respectively; GFR = 215 and 131 µl/min for DKT and SKT, respectively). In addition, based on the response to acute volume expansion, DKT exhibited better renal functional reserve and better long-term renal graft function than SKT. In conclusion, we have successfully generated a mouse DKT model. The hemodynamic responses of DKT better mimic physiological conditions, with less kidney injury and better recovery than SKT, because confounding factors such as single-nephron hyperfiltration are reduced. We anticipate that DKT in mice will provide an additional tool for evaluating renal function in physiology and disease.

  10. Modeling climate change impacts on groundwater resources using transient stochastic climatic scenarios

    NASA Astrophysics Data System (ADS)

    Goderniaux, Pascal; BrouyèRe, Serge; Blenkinsop, Stephen; Burton, Aidan; Fowler, Hayley J.; Orban, Philippe; Dassargues, Alain

    2011-12-01

    Several studies have highlighted the potential negative impact of climate change on groundwater reserves, but additional work is required to help water managers plan for future changes. In particular, existing studies provide projections for a stationary climate representative of the end of the century, although information is also needed for the near future. Such time-slice experiments fail to account for the transient nature of climatic change over the century. Moreover, the uncertainty linked to natural climate variability is not explicitly considered in previous studies. In this study, we substantially improve upon the state of the art by using a sophisticated transient weather generator in combination with an integrated surface-subsurface hydrological model of the Geer basin, Belgium, developed with the finite element modeling software HydroGeoSphere. This version of the weather generator enables the stochastic generation of large numbers of equiprobable climatic time series representing transient climate change, which are used to assess impacts in a probabilistic way. For the Geer basin, 30 equiprobable climate change scenarios spanning 2010 to 2085 were generated for each of six regional climate models (RCMs). Results show that although the 95% confidence intervals calculated around projected groundwater levels remain large, the climate change signal becomes stronger than that of natural climate variability by 2085. Additionally, the weather generator's ability to simulate transient climate change enabled assessment of the likely timescale of a specific impact and its associated uncertainty, providing managers with additional information for planning future investment. This methodology constitutes a real improvement in the field of groundwater projections under climate change conditions.
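
    The probabilistic logic of the approach can be illustrated with a toy model: many equiprobable stochastic recharge series carrying a transient drying trend are run through a single-store (linear reservoir) groundwater model, and a 95% interval is computed through time. The trend, noise level, and reservoir coefficient are invented; this is not the HydroGeoSphere model.

      import numpy as np

      rng = np.random.default_rng(0)
      years = np.arange(2010, 2086)
      n_scenarios, k = 30, 0.2                      # scenarios; outflow coefficient

      def recharge(rng):
          trend = 1.0 - 0.004 * (years - 2010)      # hypothetical drying trend
          return np.maximum(trend + 0.3 * rng.standard_normal(years.size), 0.0)

      levels = np.empty((n_scenarios, years.size))
      for i in range(n_scenarios):
          storage, r = 5.0, recharge(rng)
          for j in range(years.size):
              storage += r[j] - k * storage         # simple linear-reservoir balance
              levels[i, j] = storage

      lo, hi = np.percentile(levels, [2.5, 97.5], axis=0)
      print(f"2085: 95% interval [{lo[-1]:.2f}, {hi[-1]:.2f}] (arbitrary units)")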

  11. Induced seismicity constraints on subsurface geological structure, Paradox Valley, Colorado

    NASA Astrophysics Data System (ADS)

    Block, Lisa V.; Wood, Christopher K.; Yeck, William L.; King, Vanessa M.

    2015-02-01

    Precise relative hypocentres of seismic events induced by long-term fluid injection at the Paradox Valley Unit (PVU) brine disposal well provide constraints on the subsurface geological structure and complement the information available from deep seismic reflection and well data. We use the 3-D spatial distribution of the hypocentres to refine the locations, strikes, and throws of subsurface faults interpreted previously from geophysical surveys, and to infer the existence of previously unidentified subsurface faults. From distinct epicentre lineations and focal mechanism trends, we identify a set of conjugate fracture orientations consistent with shear-slip reactivation of late-Palaeozoic fractures over a widespread area, as well as an additional fracture orientation present only near the injection well. We propose simple Mohr-Coulomb fracture models to explain these observations. The observation that induced seismicity preferentially occurs along one of the identified conjugate fracture orientations can be explained by a rotation in the direction of the regional maximum compressive stress between the time when the fractures were formed and the present. Shear slip along the third fracture orientation observed near the injection well is inconsistent with the current regional stress field and suggests a local rotation of the horizontal stresses. The detailed subsurface model produced by this analysis provides important insights for anticipating spatial patterns of future induced seismicity and for evaluating possible additional injection well sites that are likely to be seismically and hydrologically isolated from the current well. In addition, the interpreted fault patterns provide constraints for estimating the maximum-magnitude earthquake that may be induced and for building geomechanical models to simulate pore pressure diffusion, stress changes, and earthquake triggering.
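
    The Mohr-Coulomb reasoning invoked here reduces to a simple slip check: resolve the principal stresses onto a fracture plane and test whether shear stress exceeds frictional resistance once pore pressure is subtracted. The sketch below uses invented stress magnitudes (with pore pressure elevated by injection) to show how only favourably oriented fracture sets are reactivated; it is illustrative, not the paper's calibrated model.

      import numpy as np

      S1, S3 = 60.0, 30.0          # hypothetical max/min horizontal stresses (MPa)
      PP, MU, C0 = 20.0, 0.6, 0.0  # injection-elevated pore pressure, friction, cohesion

      def coulomb_stress(theta_deg):
          """Coulomb failure stress on a plane whose normal is theta from S1."""
          t = np.radians(theta_deg)
          sn = 0.5 * (S1 + S3) + 0.5 * (S1 - S3) * np.cos(2 * t)   # normal stress
          tau = 0.5 * (S1 - S3) * np.sin(2 * t)                    # shear stress
          return np.abs(tau) - MU * (sn - PP) - C0                 # > 0 implies slip

      for a in range(0, 91, 5):
          flag = " <-- slip" if coulomb_stress(a) > 0 else ""
          print(f"normal at {a:2d} deg from S1: CFS = {coulomb_stress(a):6.2f} MPa{flag}")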

  12. From climate model ensembles to climate change impacts and adaptation: A case study of water resource management in the southwest of England

    NASA Astrophysics Data System (ADS)

    Lopez, Ana; Fung, Fai; New, Mark; Watts, Glenn; Weston, Alan; Wilby, Robert L.

    2009-08-01

    The majority of climate change impacts and adaptation studies to date have been based on at most a few deterministic realizations of future climate, usually representing different emissions scenarios. Large ensembles of climate models are increasingly available, either as ensembles of opportunity or as perturbed physics ensembles, providing a wealth of additional data that is potentially useful for improving adaptation strategies to climate change. Because this ensemble information is novel, there is little practical experience of its application or of its added value for impacts and adaptation decision making. This paper evaluates the value of perturbed physics ensembles of climate models for understanding and planning public water supply under climate change. We deliberately select water resource models that are already used by water supply companies and regulators, on the assumption that uptake of information from large ensembles of climate models will be more likely if it does not require significant investment in new modeling tools and methods. We illustrate the methods with a case study of the Wimbleball water resource zone in the southwest of England. This zone is sufficiently simple to demonstrate the utility of the approach, but complex enough to allow a variety of different decisions to be made. Our research shows that the additional information contained in the climate model ensemble provides a better understanding of the possible range of future conditions, compared to the use of single-model scenarios. Furthermore, with careful presentation, decision makers will find the results from large ensembles of models more accessible and will be able to compare more easily the merits of different management options and the timing of different adaptation measures. The overhead in additional time and expertise for carrying out the impacts analysis is justified by the increased quality of the decision-making process. Although we have focused on a water resource system in the United Kingdom, our conclusions about the added value of climate model ensembles in guiding adaptation decisions generalize to other sectors and geographical regions.
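
    The contrast between a single deterministic run and an ensemble can be made concrete with a toy calculation: one projected rainfall change yields one reliability estimate, whereas an ensemble yields a range that better reflects what is at stake in the decision. All numbers below are invented for illustration and bear no relation to the Wimbleball study.

      import numpy as np

      rng = np.random.default_rng(3)
      demand = 95.0                                 # demand as % of baseline yield

      def reliability(rainfall_change_pct, n_years=1000):
          yield_pct = 100.0 + rainfall_change_pct + 5.0 * rng.standard_normal(n_years)
          return np.mean(yield_pct >= demand)       # fraction of years demand is met

      single_run = -8.0                             # one deterministic projection (%)
      ensemble = single_run + 6.0 * rng.standard_normal(100)   # 100-member spread

      print(f"single scenario: reliability {reliability(single_run):.2f}")
      rels = [reliability(m) for m in ensemble]
      print(f"ensemble: reliability {min(rels):.2f} to {max(rels):.2f} "
            f"(median {np.median(rels):.2f})")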

  13. Aortic dissection simulation models for clinical support: fluid-structure interaction vs. rigid wall models.

    PubMed

    Alimohammadi, Mona; Sherwood, Joseph M; Karimpour, Morad; Agu, Obiekezie; Balabani, Stavroula; Díaz-Zuccarini, Vanessa

    2015-04-15

    The management and prognosis of aortic dissection (AD) are often challenging, and the use of personalised computational models is being explored as a tool to improve clinical outcomes. Including vessel wall motion in such simulations can provide more realistic and potentially more accurate results, but requires significant additional computational resources as well as expertise. With clinical translation as the final aim, trade-offs between complexity, speed, and accuracy are inevitable. The present study explores whether modelling wall motion is worth the additional expense in the case of AD by carrying out fluid-structure interaction (FSI) simulations based on a sample patient case. Patient-specific anatomical details were extracted from computed tomography images to provide the fluid domain, from which the vessel wall was extrapolated. Two-way fluid-structure interaction simulations were performed, with coupled Windkessel boundary conditions and hyperelastic wall properties. The blood was modelled using the Carreau-Yasuda viscosity model, and turbulence was accounted for via a shear stress transport model. A simulation without wall motion (rigid wall) was carried out for comparison. The displacement of the vessel wall was comparable to reports from imaging studies in terms of intimal flap motion and contraction of the true lumen. Analysis of the haemodynamics around the proximal and distal false lumen in the FSI model showed complex flow structures caused by the expansion and contraction of the vessel wall. These flow patterns led to significantly different predictions of wall shear stress, particularly its oscillatory component, which were not captured by the rigid wall model. Through comparison with imaging data, the results of the present study indicate that the fluid-structure interaction methodology employed herein is appropriate for simulations of aortic dissection. Regions of high wall shear stress were not significantly altered by the wall motion; however, certain collocated regions of low and oscillatory wall shear stress, which may be critical for disease progression, were identified only in the FSI simulation. We conclude that, if patient-tailored simulations of aortic dissection are to be used as an interventional planning tool, the additional complexity, expertise, and computational expense required to model wall motion is indeed justified.
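
    For reference, the Carreau-Yasuda model gives the effective viscosity as mu(gamma) = mu_inf + (mu_0 - mu_inf) * [1 + (lambda*gamma)^a]^((n-1)/a). The sketch below evaluates it with commonly quoted literature constants for blood; the paper's exact parameter values are not given in the abstract, so these are illustrative.

      MU0, MUINF = 0.056, 0.0035      # zero/infinite-shear viscosities (Pa s)
      LAM, A, N = 3.313, 2.0, 0.3568  # relaxation time (s) and exponents

      def carreau_yasuda(shear_rate):
          """Effective viscosity (Pa s) at the given shear rate (1/s)."""
          return MUINF + (MU0 - MUINF) * (
              1.0 + (LAM * shear_rate) ** A) ** ((N - 1.0) / A)

      for g in (0.1, 1.0, 10.0, 100.0, 1000.0):
          print(f"shear rate {g:7.1f} 1/s -> viscosity {carreau_yasuda(g):.5f} Pa s")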

  14. Measuring health care process quality with software quality measures.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on specific diseases, clinics, or clinical areas. Although they contain structure, process, or output measures, there is no model that measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model was developed from software quality measures. We adapted the ISO/IEC 9126 software quality standard for health care processes, and then added JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements to the model scope to unify functional requirements. Measurement results for the assessment (diagnosis) process are provided in this paper. After the application, it was concluded that the model identifies weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.

  15. Optimization of aeromedical base locations in New Mexico using a model that considers crash nodes and paths.

    PubMed

    Erdemir, Elif Tokar; Batta, Rajan; Spielman, Seth; Rogerson, Peter A; Blatt, Alan; Flanigan, Marie

    2008-05-01

    In a recent paper, Tokar Erdemir et al. (2008) introduced models for service systems with service requests originating from both nodes and paths. We demonstrate how to apply and extend their approach to an aeromedical base location application, with specific focus on the state of New Mexico (NM). The current aeromedical base locations in NM were selected without considering motor vehicle crash paths. Crash paths are the roads on which crashes occur, with each road segment assigned a weight signifying relative crash occurrence. We analyze the loss in accident coverage and the location error for the current aeromedical base locations, and provide insights on the relevance of considering crash paths when selecting aeromedical base locations. Additionally, we look briefly at some of the trade-offs involved in locating additional trauma centers versus additional aeromedical bases in the current aeromedical system of NM. Not surprisingly, the trade-off analysis shows that locating additional aeromedical bases always attains the required coverage level at a lower cost than locating additional trauma centers.
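
    The underlying covering problem can be sketched with a greedy heuristic that sites bases to cover weighted crash nodes and crash paths. This is a simplified stand-in for the Tokar Erdemir et al. formulation: the coordinates, weights, and coverage radius are invented, and paths are crudely represented by their midpoints.

      import itertools, math

      nodes = {(2, 3): 5.0, (8, 1): 3.0, (5, 9): 4.0}            # crash node -> weight
      paths = {((0, 0), (10, 0)): 6.0, ((5, 0), (5, 10)): 2.0}    # road segment -> weight
      RADIUS = 4.0
      candidates = list(itertools.product(range(0, 11, 2), repeat=2))

      def midpoint(seg):
          (x1, y1), (x2, y2) = seg
          return ((x1 + x2) / 2, (y1 + y2) / 2)

      def greedy(n_bases):
          live_nodes, live_paths, chosen = dict(nodes), dict(paths), []
          for _ in range(n_bases):
              def gain(base):
                  g = sum(w for p, w in live_nodes.items()
                          if math.dist(base, p) <= RADIUS)
                  return g + sum(w for s, w in live_paths.items()
                                 if math.dist(base, midpoint(s)) <= RADIUS)
              best = max(candidates, key=gain)
              chosen.append(best)
              # remove demand already covered by the chosen site
              live_nodes = {p: w for p, w in live_nodes.items()
                            if math.dist(best, p) > RADIUS}
              live_paths = {s: w for s, w in live_paths.items()
                            if math.dist(best, midpoint(s)) > RADIUS}
          return chosen

      print("greedy base sites:", greedy(2))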

  16. XSIM Final Report: Modelling the Past and Future of Identity Management for Scientific Collaborations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cowles, Robert; Jackson, Craig; Welch, Von

    The eXtreme Science Identity Management (XSIM) research project: collected and analyzed real-world data on virtual organization (VO) identity management (IdM) representing the last 15+ years of collaborative DOE science; constructed a descriptive VO IdM model based on that data; used the model and existing trends to project the direction of IdM in the 2020 timeframe; and provided guidance to scientific collaborations and resource providers that are implementing or seeking to improve IdM functionality. XSIM conducted over 20 semi-structured interviews with representatives of scientific collaborations and resource providers, both in the US and in Europe; the interviewees supported a diverse set of scientific collaborations and disciplines. We developed a definition of "trust," a key concept in IdM, to understand how varying trust models affect where IdM functions are performed. The model identifies how key IdM data elements are utilized in collaborative scientific workflows, and it has the flexibility to describe past, present, and future trust relationships and IdM implementations. During the funding period, we gave more than two dozen presentations to socialize our work, encourage feedback, and improve the model; we also published four refereed papers. Additionally, we developed, presented, and received favorable feedback on three white papers providing practical advice to collaborations and/or resource providers.

  17. Evaluation of Parallel-Element, Variable-Impedance, Broadband Acoustic Liner Concepts

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Howerton, Brian M.; Ayle, Earl

    2012-01-01

    Recent trends in aircraft engine design have highlighted the need for acoustic liners that provide broadband sound absorption with reduced liner thickness. Three such liner concepts are evaluated using the NASA normal incidence tube. Two concepts employ additive manufacturing techniques to fabricate liners with variable chamber depths. The first relies on scrubbing losses within narrow chambers to provide acoustic resistance necessary for sound absorption. The second employs wide chambers that provide minimal resistance, and relies on a perforated sheet to provide acoustic resistance. The variable-depth chambers used in both concepts result in reactance spectra near zero. The third liner concept employs mesh-caps (resistive sheets) embedded at variable depths within adjacent honeycomb chambers to achieve a desired impedance spectrum. Each of these liner concepts is suitable for use as a broadband sound absorber design, and a transmission line model is presented that provides good comparison with their respective acoustic impedance spectra. This model can therefore be used to design acoustic liners to accurately achieve selected impedance spectra. Finally, the effects of increasing the perforated facesheet thickness are demonstrated, and the validity of prediction models based on lumped element and wave propagation approaches is investigated. The lumped element model compares favorably with measured results for liners with thin facesheets, but the wave propagation model provides good comparisons for a wide range of facesheet thicknesses.
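
    The transmission line model referred to above can be sketched for the parallel-element case: each hard-backed chamber of depth L contributes a normalized impedance of the form zeta_i = R_i - i*cot(k*L_i), and the face-averaged impedance follows from the area-weighted sum of admittances. The depths, open-area shares, and resistance below are illustrative, not the liners tested in the paper.

      import numpy as np

      C0 = 343.0                                    # speed of sound (m/s)
      depths = np.array([0.02, 0.035, 0.05, 0.08])  # chamber depths (m)
      shares = np.full(4, 0.25)                     # equal open-area fractions
      R = 1.0                                       # normalized chamber resistance

      for f in (500.0, 1000.0, 2000.0, 3000.0):
          k = 2 * np.pi * f / C0
          zeta = R - 1j / np.tan(k * depths)        # per-chamber normalized impedance
          z_total = 1.0 / np.sum(shares / zeta)     # parallel (admittance) combination
          print(f"{f:6.0f} Hz: zeta = {z_total.real:5.2f} {z_total.imag:+5.2f}i")

    Because the chambers resonate at different frequencies, their combined reactance stays near zero over a broad band, which is the design intent described in the abstract.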

  18. An Ontology of Quality Initiatives and a Model for Decentralized, Collaborative Quality Management on the (Semantic) World Wide Web

    PubMed Central

    2001-01-01

    This editorial provides a model of how quality initiatives concerned with health information on the World Wide Web may interact with each other in the future. This vision fits into the evolving "Semantic Web" architecture, i.e., the prospect that the World Wide Web may evolve from a mess of unstructured, human-readable information sources into a global knowledge base with an additional layer providing richer and more meaningful relationships between resources. A first prerequisite for forming such a "Semantic Web" or "web of trust" among the players active in quality management of health information is that these initiatives make statements about themselves and about each other in a machine-processable language. I present a concrete model of how this collaboration could look and provide some recommendations on the role that the World Health Organization (WHO) and other policy makers could play in this framework. PMID:11772549
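
    As a flavour of what "statements about each other in a machine-processable language" might look like, the sketch below emits two RDF triples with the rdflib library. The vocabulary URI and property names are invented placeholders, not an actual labelling standard from the editorial.

      from rdflib import Graph, Literal, Namespace, URIRef

      EX = Namespace("http://example.org/quality-vocab#")   # hypothetical vocabulary
      g = Graph()
      g.bind("ex", EX)

      site = URIRef("http://example.org/some-health-site")
      initiative = URIRef("http://example.org/quality-initiative")

      g.add((initiative, EX.accredits, site))               # statement about another party
      g.add((site, EX.lastReviewed, Literal("2001-06-01")))
      print(g.serialize(format="turtle"))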

  19. An ontology of quality initiatives and a model for decentralized, collaborative quality management on the (semantic) World-Wide-Web.

    PubMed

    Eysenbach, G

    2001-01-01

    This editorial provides a model of how quality initiatives concerned with health information on the World Wide Web may interact with each other in the future. This vision fits into the evolving "Semantic Web" architecture, i.e., the prospect that the World Wide Web may evolve from a mess of unstructured, human-readable information sources into a global knowledge base with an additional layer providing richer and more meaningful relationships between resources. A first prerequisite for forming such a "Semantic Web" or "web of trust" among the players active in quality management of health information is that these initiatives make statements about themselves and about each other in a machine-processable language. I present a concrete model of how this collaboration could look and provide some recommendations on the role that the World Health Organization (WHO) and other policy makers could play in this framework.

  20. Empirical flow parameters: a tool for hydraulic model validity

    USGS Publications Warehouse

    Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.

    2013-01-01

    The objectives of this project were: (1) to determine and present, from existing data in Texas, relations between observed streamflow, topographic slope, mean section velocity, and other hydraulic factors, producing charts such as Figure 1 and empirical distributions of the various flow parameters, to provide a methodology to "check if model results are way off!"; (2) to produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas, providing a secondary way to compare such values to a conventional hydraulic modeling approach; and (3) to present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.
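
    A quick plausibility check in the spirit of objective (3) computes the Froude number, Fr = V / sqrt(g*D), and unit stream power for a modeled reach and flags implausible values. The threshold and example numbers below are illustrative, not the report's empirical charts.

      import math

      G = 9.81  # gravitational acceleration (m/s^2)

      def froude(v, depth):
          return v / math.sqrt(G * depth)

      def unit_stream_power(q, slope, width, rho=1000.0):
          """Stream power per unit bed area (W/m^2) for discharge q (m^3/s)."""
          return rho * G * q * slope / width

      v, depth, q, slope, width = 2.5, 1.2, 85.0, 0.0008, 30.0   # model outputs
      fr = froude(v, depth)
      print(f"Froude number: {fr:.2f} ({'sub' if fr < 1 else 'super'}critical)")
      print(f"unit stream power: {unit_stream_power(q, slope, width):.1f} W/m^2")
      if fr > 1.5:   # illustrative sanity threshold
          print("warning: model velocity looks implausibly high for a natural channel")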
